
Annex B. Methodology


Criteria for inclusion in this report

To be included in the empirical comparative analysis that forms the basis of this report, representative deliberative processes needed to meet the three defining characteristics identified through the OECD’s analysis:

  1. Deliberation (deliberative processes had to have at least one full day of face-to-face meetings).

Deliberation involves carefully weighing different options, which requires accurate and relevant information and a diversity of perspectives; a shared evaluative framework for reaching decisions; and a requirement for participants to apply these shared criteria to weigh trade-offs and find common ground to reach a group decision (see, for example, Matthew, 1999; Carson, 2017; Bone et al., 2006).

The criterion of at least one full day of meetings was established to operationalise the fact that deliberation requires time.

  2. Representativeness (participants of the deliberative process were randomly selected and demographically stratified).

Representativeness is achieved through random selection (sortition) and demographic stratification (a process that ensures that the group broadly matches the demographic profile of the community against census or other similar data).

Random selection with demographic stratification is also a common thread across the cases, since the overarching aim of the research is to explore innovative forms of participation. While sortition is not new in itself (the practice dates back to Ancient Athens and has been used in many places around the world at various times throughout history), its modern incarnation is novel. It helps to overcome some of the key challenges involved in designing stakeholder participation, notably those related to the representativeness, diversity, and inclusiveness of participants.
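As a purely illustrative sketch (not drawn from the OECD database or any specific case), the following Python example shows the basic principle of demographic stratification: a panel is drawn from willing respondents so that its composition broadly matches census shares for a single criterion. The census shares, field names, and single-criterion simplification are assumptions of this example.

```python
import random

# Hypothetical census shares for a single stratification criterion (age group).
# Real processes stratify on several criteria at once; this is a simplification.
CENSUS_SHARES = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

def stratified_sample(respondents, panel_size, seed=None):
    """Draw a panel whose age-group mix broadly matches the census shares.

    `respondents` is a list of dicts such as {"id": ..., "age_group": ...},
    representing citizens who responded positively to the invitation.
    """
    rng = random.Random(seed)
    panel = []
    for group, share in CENSUS_SHARES.items():
        pool = [r for r in respondents if r["age_group"] == group]
        quota = round(panel_size * share)
        # If a group has fewer willing respondents than its quota, take them all.
        panel.extend(rng.sample(pool, min(quota, len(pool))))
    return panel

# Example: select a roughly 30-person panel from 500 willing respondents.
respondents = [
    {"id": i, "age_group": random.choice(list(CENSUS_SHARES))}
    for i in range(500)
]
panel = stratified_sample(respondents, panel_size=30, seed=42)
print(len(panel), [p["age_group"] for p in panel].count("18-34"))
```

In practice, the processes analysed in this report typically stratify across several criteria at once (such as gender, age, geography, and education) and combine this step with an initial stage of mass invitations to randomly selected citizens.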

  3. Impact (the deliberative process was commissioned by a public authority).

Impact means that decision makers agree to respond to and act on recommendations (see, for example, Farrell et al., 2019; Carson and Elstub, 2019).

The report excludes deliberative processes conducted purely for academic or experimental purposes without a direct link to public decisions. The link to an authority that will eventually decide on a policy issue has an impact on numerous factors, such as who decides to participate, the response rate, and the dropout rate. Removing the link to power makes participation less meaningful and makes it more likely that only those with a strong interest in the topic will choose to participate. It is also likely why experiments have lower response rates and higher dropout rates than the average. That does not mean that experiments are not useful for other purposes, such as research. However, including such cases in this study would skew the analysis and conclusions about their use for governance.


Data collection

Data for this report were collected through desk research, a targeted call for submissions to the OECD Innovative Citizen Participation Network (ICPN) and the international Democracy R&D Network of deliberative practitioners, and an open call through the OECD Toolkit and Case Navigator for Open Government platform. More details about each collection channel are provided below.

The case collection was not limited to OECD Member countries; however, only seven examples were found in non-Member countries. The analysis thus focuses on OECD Member countries for comparability reasons.

The data collection took place from 6th March to 31st October 2019. To be included, cases needed to have been completed by the end of October 2019. Cases that were still in progress at that time were omitted for comparability reasons (with an exception for ongoing permanent deliberative processes), since the criteria for analysis include the response by the public authority and the evaluation of the process and its impact.

Desk research

The first step involved extensive desk research to collect as many cases of deliberative processes as possible for this study. A wide range of academic literature was consulted, including previous overarching studies of deliberative processes, books, and articles analysing specific models or particular cases.

Guides, handbooks, and other documents related to principles and good practices of deliberative processes were consulted as well. Most of them were published by practitioners and organisers of multiple deliberative processes, as well as research organisations (including, but not limited to, Mass LBP, United Nations Democracy Fund, newDemocracy Foundation, Jefferson Center, and the Democracy R&D network).

Project archives of key organisations that have delivered deliberative processes provided extensive documentation of certain cases. These often include online reports of deliberative processes that explain the random selection recruitment method, number of participants and their demographics, experts and stakeholders involved, and other details.

In addition, online news articles and other media sources were used to identify potential deliberative processes for the database.

Online databases were also consulted and filtered to identify cases that matched the criteria of the study.

Targeted call to OECD ICPN and Democracy R&D Networks

In tandem with the desk research, a call for cases was targeted at the members of the OECD Innovative Citizen Participation Network, which consists of innovators and practitioners of innovative citizen participation practices. The full list of network members can be found at the end of the Annex.

A similar targeted call for cases was opened to the members of the Democracy R&D Network, an international network of organisations, associations, and individuals helping decision makers take hard decisions and build public trust through deliberative processes.

More about the Democracy R&D network: https://democracyrd.org/.

Qualitative interviews were conducted with several members of both networks, with the goal of gathering more details about the cases of deliberative processes they facilitated. These were particularly important in situations where details were not readily available online. The interviewees included representatives of The Danish Board of Technology Foundation, Healthy Democracy, Missions Publiques, G1000, the Nexus institute, Tokyo Metropolitan University, as well as organisers of the Polish Citizens’ Juries/Panels and those of the Ostbelgien Model.

Open call through OECD Toolkit Navigator

In addition to the targeted call, a public call for cases was open on the OECD Toolkit and Case Navigator for Open Government platform from 4th July to 31st August 2019. The aim of the call was to open up the data collection to input from the wider public.

The platform is available here: https://www.oecd.org/gov/open-government-toolkit-navigator.htm.


Data cleaning and validation process

The collected data went through a cleaning and validation process. Since the collected cases date back to 1986, and the individuals who commissioned or organised the earliest cases could not be identified or were no longer in their positions, validation efforts were concentrated on the most recent cases. All cases that took place in 2018-2019 were validated by contacting the organisations responsible for their implementation to verify the accuracy of each data point. Some earlier cases were also validated where they were organised by the same organisations that conducted and validated cases for 2018-2019. In total, the data for 81 out of 282 cases has been validated.

For variables where qualitative data was collected, especially where a textual description was provided, the key information that recurred across most cases was identified and used for analysis. For example, variable 26 is a description of the details of the random selection process of the participants. From the overall responses, several factors, such as the number of citizens who received invitations to participate, the stratification criteria, and the database used for contacting citizens, were identified as recurring and important. Hence, these elements were used for further analysis.
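To illustrate how such free-text responses can be turned into analysable variables, the sketch below codes a hypothetical description of a selection process into the recurring elements mentioned above (number of invitations, stratification criteria, contact database). The field names, keywords, and regular expression are assumptions of this example, not the actual codebook used for the report.

```python
import re

def code_selection_description(text):
    """Extract recurring elements from a free-text description of the
    random selection process (illustrative heuristics only)."""
    coded = {"invitations_sent": None,
             "stratification_criteria": [],
             "contact_database": None}

    # Number of invitations, e.g. "30,000 invitations were sent by post".
    match = re.search(r"(\d[\d\s,\.]*)\s+invitations", text, re.IGNORECASE)
    if match:
        coded["invitations_sent"] = int(re.sub(r"[^\d]", "", match.group(1)))

    # Common stratification criteria mentioned in the descriptions.
    for criterion in ("gender", "age", "geography", "education", "ethnicity"):
        if criterion in text.lower():
            coded["stratification_criteria"].append(criterion)

    # Source database used for contacting citizens.
    for source in ("electoral register", "postal address", "civil registry", "phone"):
        if source in text.lower():
            coded["contact_database"] = source
            break

    return coded

example = ("30,000 invitations were sent via the electoral register, "
           "stratified by gender, age and geography.")
print(code_selection_description(example))
```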


Variables used for analysis

For each deliberative process that met the three criteria for inclusion in the study, the OECD attempted to collect data pertaining to 60 different variables, based on availability. The variables were set with the intention of gathering detailed data on the process of organising and preparing deliberative processes, their participants, organisers, commissioners, funders, outcomes, and lessons learned. The full list of variables can be found in Table A B.1.

Table A B.1. Variables of the OECD Database of Representative Deliberative Processes and Institutions (2020)

Variable: Description

1. Project title: The title of the deliberative process.
2. Deliberative model (categorised by OECD): The model of the deliberative process, categorised as one of the 12 models introduced in the study.
3. Deliberative model (named by organisers): The model of the deliberative process, as indicated by the organisers.
4. Ad hoc or institutionalised?: The nature of the deliberative process (an ad hoc initiative or a permanent institutionalised process).
5. If institutionalised, is there a legal document establishing its functioning? (i.e. terms of reference): For institutionalised processes, the existence of a legal document establishing the functioning of the deliberative process.
6. Institutionalisation regulations URL: A web link to the legal document establishing the functioning of the deliberative process.
7. Project name: The original name of the deliberative process (in the original language, or the title of a broader project that the deliberative process pertains to).
8. Project description: The goal of the deliberative process.
9. Was there a dedicated committee/group set up in relation to the deliberative process? (i.e. expert group, advisory committee): Whether a dedicated committee/group (e.g. expert group, advisory committee) was set up in relation to the deliberative process.
10. Advisory committee members: The members of the dedicated committee/group set up in relation to the deliberative process (public officials, experts, civil society organisations, academics, business, and citizens).
11. The role of the advisory committee: The role of the dedicated committee/group set up in relation to the deliberative process (oversight, design and facilitation, ensuring balanced information, providing expert knowledge).
12. Project URL: The web link to the description of the deliberative process (on the website of either the commissioning public authority or the implementing organisation).
13. Year(s) of project: The year(s) over which the deliberative process took place.
14. Country: The country in which the deliberative process took place.
15. OECD member?: Whether the country was an OECD Member at the time of data collection.
16. Level of government: The level of government at which the deliberative process took place.
17. Place (Country/State/Region/City): Depending on the level of government, the country, state/region, or city where the deliberative process took place.
18. Implementing organisation: The organisation commissioned/assigned by the public authority to implement the deliberative process.
19. Organisation URL: The web link to the organisation commissioned/assigned by the public authority to implement the deliberative process.
20. Organisation type: The type of organisation commissioned/assigned by the public authority to implement the deliberative process (academia, civil society organisation, private company, or public organisation).
21. Issue category: The topic of the policy issue addressed through the deliberative process.
22. Was the Jury/Assembly/Panel independent with a mandate to set its rules of procedure?: Whether the deliberative process was independent, with a mandate to set its own rules of procedure.
23. Number of panels of the deliberative process: The number of deliberative panels in the deliberative process. A panel is considered separate if it is comprised of different people who did not participate in previous panels of the same deliberative process, with the exception of when some participants of different local-level panels are brought together for a regional or national-level panel, which is also considered a separate panel.
24. Total number of participants: The total number of participants across all panels of a single deliberative process.
25. Participant selection method: The method used for random selection of participants (one-stage random selection, two-stage random selection, three-stage random selection, targeted selection, random selection (when the exact random selection procedure is not clear), or other).
26. Participant selection methodology details: The detailed description of how random participant selection took place (stages, number of citizens invited, stratification criteria, etc.).
27. What was the method for participant selection?: The channel used for inviting randomly selected participants (post, phone, email, leaflets, survey, in person, other).
28. Who was the invitation to participate from?: The person from whom the invitation to participate was sent (minister, member of parliament, mayor, prime minister, president, local councillor, premier, head of public institution, specific government department, other).
29. Response rate to invitation: The percentage of randomly selected invited citizens who agreed to participate in the deliberative process.
30. Duration of participant selection process: The length of the process of random selection of participants (in weeks).
31. Duration of preparation/planning/agenda-setting phase before 1st participant meeting: The length of the preparation/planning/agenda-setting for the deliberative process, excluding participant selection (in weeks).
32. Remuneration of participants: Whether and how participants of the deliberative process were remunerated (remunerated, non-remunerated, transport expenses compensated, expenses covered).
33. Which stakeholders were involved in the process design?: The stakeholders involved in designing the deliberative process (academics, citizens, civil society organisations, government officials, private companies, none).
34. What did the stakeholders involved in the process design bring to the table?: The contribution of different stakeholders to the design of the deliberative process.
35. Was a dedicated online platform/tool used to keep participants up to date, informed, and connected during the process?: The use of a dedicated online platform or tool to keep participants up to date, informed, and connected throughout the process (yes/no).
36. Name of the platform for participant communication: The name of the online platform used.
37. How has the process been communicated?: The communication efforts deployed to communicate about the deliberative process to the broader public.
38. Elected officials part of the panel: Whether some of the participants of the deliberative process were public officials.
39. How many elected officials took part in the panel?: The number of public officials among the participants of the deliberative process.
40. Total duration of face-to-face meetings (in days): The duration of the face-to-face meetings of the participants during the deliberative process (in days).
41. Total duration between 1st participant meeting date and last meeting date (in weeks): The duration of the deliberative process (from the first participant meeting to the last, in weeks).
42. Was there an initial survey to measure the beliefs of participants?: Whether a survey was conducted to measure participant opinions at the start of the deliberative process (yes/no).
43. Learning component of the process: The learning components of the deliberative process (introductory learning material before the first meeting, reading material between meetings, experts available during meetings for presentations and questions, participants could request information, specific learning sessions).
44. Was there a connection to other forms of engagement? If so, what were they?: Whether there were other forms of citizen engagement in relation to the deliberative process and what they were (surveys, consultations, roundtable discussions, other).
45. Please provide further details on other forms of engagement: A detailed description of other forms of engagement in relation to the deliberative process.
46. Outcome: The outcome of the deliberative process (vote, recommendations, etc.).
47. Outcome (file number): The file number, in the database of outcome documents, of the report/article/other document outlining the recommendations produced or collective opinions formed during the deliberative process.
48. Were final recommendations discussed face-to-face with the public authority?: Whether participants of the deliberative process discussed their recommendations face-to-face with the public authority that commissioned them.
49. Response and follow-up by public authority: The response of the government authority to the recommendations (implementation of recommendations, response to the participants or broader public).
50. Was there a change in administration during the period when the deliberative process took place?: Whether there was a transition of power in the public authority that commissioned the deliberative process while the process was taking place.
51. Implementation of recommendations is being monitored: Whether the implementation of the recommendations produced during the deliberative process has been monitored.
52. If yes, how is the implementation of recommendations being monitored?: The ways in which the implementation of the recommendations has been monitored.
53. Has the process been evaluated?: Whether the deliberative process has been evaluated.
54. If the process has been evaluated, how?: The kind of evaluation conducted (academic analysis, participant exit survey, other).
55. Evaluation of the process URL: The web link to an evaluation report/study/survey results/article about the deliberative process.
56. Challenges encountered: The challenges the organisers of the deliberative process encountered while designing, implementing, and evaluating the deliberative process, and afterwards.
57. Lessons learned: The lessons the organisers of the deliberative process learned from the experience.
58. Total cost (not mandatory to fill in): The total cost of the deliberative process.
59. Currency: The currency in which costs have been indicated.
60. Funding source(s): The organisations that funded/commissioned the deliberative process and the funding sources used to pay for it.

Re-classifying the model of some cases

Initially, the representative deliberative process model (variable 3) was recorded for each case as either the model indicated by the process organisers or the name appearing in the process title (e.g. a Citizens’ Jury on Climate would be categorised as a ‘citizens’ jury’). Drawing on the complete dataset, the OECD identified 12 models of deliberative processes (Chapter 2), each characterised by common features shared across different cases.

After the 12 models of deliberative processes were defined, all deliberative processes in the database were reclassified to fall into one of the 12 categories based on their characteristics. Hence, variable 2 indicates the model of deliberative process that corresponds to one of the 12 models identified in this study. For example, community panels, reference panels, citizens’ panels and citizens’ juries have been brought together under the umbrella term Citizens’ Juries/Panels. Below is the table used for reclassification.

Table A B.2. Classification of models of deliberative processes

 

Model: Includes

1. Citizens' Assembly: Citizens' Assembly
2. Citizens' Jury/Panel: Citizens' Jury, Citizens' Panel, Reference Panel, Community Panel
3. Consensus Conference: Consensus Conference
4. Planning Cell: Planning Cell, Citizen Deliberation Meeting
5. G1000: G1000
6. Citizens' Council: Citizens' Council
7. Citizens' Dialogues: Citizens' Summit, Citizens' Forum, Citizens' Dialogues, Citizens' Workshop, Citizens' Hearing, Deliberative event
8. Deliberative Poll/Survey: Deliberative Poll, Deliberative Survey
9. World Wide Views: World Wide Views, Europe Wide Views
10. Citizens' Initiative Review: Citizens' Initiative Review
11. The Ostbelgien Model: The Ostbelgien Model
12. City Observatory: City Observatory

Source: OECD Database of Representative Deliberative Processes and Institutions (2020).
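For illustration, the reclassification in Table A B.2 amounts to a simple lookup from the model name reported by organisers (variable 3) to the OECD model category (variable 2). The sketch below reproduces the table as a dictionary; the function name, the lower-case normalisation of input labels, and the "Unclassified" fallback are assumptions of this example rather than part of the OECD's actual tooling.

```python
# Mapping from the model name reported by organisers (variable 3) to the
# OECD model category (variable 2), as set out in Table A B.2.
MODEL_MAP = {
    "citizens' assembly": "Citizens' Assembly",
    "citizens' jury": "Citizens' Jury/Panel",
    "citizens' panel": "Citizens' Jury/Panel",
    "reference panel": "Citizens' Jury/Panel",
    "community panel": "Citizens' Jury/Panel",
    "consensus conference": "Consensus Conference",
    "planning cell": "Planning Cell",
    "citizen deliberation meeting": "Planning Cell",
    "g1000": "G1000",
    "citizens' council": "Citizens' Council",
    "citizens' summit": "Citizens' Dialogues",
    "citizens' forum": "Citizens' Dialogues",
    "citizens' dialogues": "Citizens' Dialogues",
    "citizens' workshop": "Citizens' Dialogues",
    "citizens' hearing": "Citizens' Dialogues",
    "deliberative event": "Citizens' Dialogues",
    "deliberative poll": "Deliberative Poll/Survey",
    "deliberative survey": "Deliberative Poll/Survey",
    "world wide views": "World Wide Views",
    "europe wide views": "World Wide Views",
    "citizens' initiative review": "Citizens' Initiative Review",
    "the ostbelgien model": "The Ostbelgien Model",
    "city observatory": "City Observatory",
}

def reclassify(named_model: str) -> str:
    """Return the OECD model category for a model name given by organisers."""
    # The fallback value is illustrative; unmatched names would be reviewed manually.
    return MODEL_MAP.get(named_model.strip().lower(), "Unclassified")

print(reclassify("Reference Panel"))  # -> Citizens' Jury/Panel
```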

In five cases, the model appearing in the title of the representative deliberative process did not match the defining characteristics of the corresponding model identified by the OECD Secretariat. For example, a process titled “Citizens’ Assembly on Social Care” did not meet the characteristics of the Citizens’ Assembly model identified by the Secretariat based on the data; in all but name, it fit the model of a Citizens’ Jury/Panel. This is partly due to ongoing debate and confusion about terminology among practitioners and academics, with the same terms being applied to different processes, largely driven by different political contexts. The OECD acknowledges these differences and has attempted to group processes with similar design characteristics, regardless of what they are called, for the purpose of international comparative analysis. For this reason, five processes that were titled "Citizens' Assemblies" (three in the UK and two in Canada) have been reclassified as Citizens' Juries/Panels for the analysis of deliberative models in this study, to allow for a more accurate comparative analysis.1


Members of the OECD Innovative Citizen Participation Network

As part of this study, the OECD has been engaging with a network of practitioners, civil servants, academics, researchers, and designers to frame the topic and scope of research, to gather feedback and inputs to the research in an ongoing manner, and to strengthen the ties between these important groups of actors. From the OECD Secretariat, Claudia Chwalisz, Ieva Česnulaitytė, and Alessandro Bellantoni co-ordinate the network.

The ICPN convened for full-day meetings in June 2019, where members helped identify the research questions and suggested sources for the data collection, and in January 2020, where they provided rich comments and feedback on the report’s preliminary findings. These meetings were made possible thanks to support from the Royal Society of Arts, Manufactures, and Commerce (RSA), the Electoral Reform Society (ERS), and the Open Society Foundations (OSF).

Members:

  • Yago Bermejo Abati, Co-founder, Deliberativa Spain

  • Eddy Adams, Thematic Pole Manager, Social Innovation and Human Capital, URBACT

  • Alberto Alemanno, Founder, The Good Lobby and Jean Monnet Professor, HEC Paris

  • Jon Alexander, Co-founder, New Citizenship Project

  • Sarah Allan, Head of Engagement, Involve

  • Graham Allen, Co-ordinator, Citizens’ Convention on UK Democracy

  • Theo Bass, Programme Manager, UK Research and Innovation

  • Tonu Basu, Lead of Thematic Engagement, Open Government Partnership

  • Luca Belgiorno-Nettis, Founder, newDemocracy Foundation

  • Javier Bikandi, Head of Innovation, Basque government

  • Jessica Blair, Director, Electoral Reform Society in Wales

  • Jan Boelen, Rector, Karlsruhe University of Art & Design, Director, Atelier Luma

  • Stephen Boucher, Founder, Political Creativity

  • Éric Buge, Officer, French Parliament

  • Didier Caluwaerts, Assistant Professor, Vrije Universiteit Brussel

  • Elizabeth Canovan, Assistant Secretary General, Department of the Taoiseach

  • Damian Carmichael, Open Government Lead, Department of Industry, Science, Energy, and Resources

  • Lyn Carson, Director of Research, newDemocracy Foundation

  • Ed Cox, Director, Royal Society of the Arts, Manufactures, and Commerce (RSA)

  • Nicole Curato, Associate Professor, Centre for Deliberative Democracy & Global Governance, University of Canberra

  • Fiona Curran, Social Policy and Public Service Reform Officer, Department of the Taoiseach

  • Yves Dejaeghere, Director, G1000 Organisation

  • Natalia Domagala, Head of Data Ethics Policy, UK Department for Digital, Culture, Media, and Sport

  • Laurie Drake, Director of Research and Learning, MASS LBP

  • Kezia Dugdale, Director, John Smith Centre

  • Zakia Elvang, Co-founder, We Do Democracy

  • Oliver Escobar, Professor, University of Edinburgh

  • Gorka Espiau Idoiaga, CRIEM Professor of Practice 2016-2019, McGill

  • David Farrell, Professor, University College Dublin

  • Jessica Feldman, Assistant Professor, American University of Paris

  • Jim Fishkin, Professor, Stanford University

  • Frances Foley, Project Director, Citizens' Convention on UK Democracy

  • Paulina Fröhlich, Head of “Future of Democracy” Program, Das Progressive Zentrum

  • Karin Fuller, Outreach and Engagement Lead, Treasury Board of Canada Secretariat

  • Jessica Garland, Director of Policy and Research, Electoral Reform Society

  • Marcin Gerwin, Center for Climate Assemblies

  • Doreen Grove, Head of Open Government, Scottish Government

  • Dominik Hierlemann, Senior Expert, Bertelsmann Stiftung

  • Lauren Howard, Outreach and Engagement Specialist, Treasury Board of Canada Secretariat

  • Tim Hughes, Director, Involve

  • Darren Hughes, Chief Executive, Electoral Reform Society

  • Amelie Klein, Curator, Vitra Design Museum

  • Hélène Landemore, Professor, Yale University

  • Aline Lara Rezende, Assistant Curator, Ljubljana Biennial of Design

  • Panthea Lee, Principal, Reboot

  • Dimitri Lemaire, Director, Particitiz

  • Josef Lentsch, Managing Partner, Innovation in Politics Institute

  • Juha Leppänen, Chief Executive, Demos Helsinki

  • Miriam Levin, UK Department of Digital, Culture, Media, and Sport

  • Rose Longhurst, Program Officer, Open Society Foundations

  • Peter MacLeod, Principal, MASS LBP

  • Arantxa Mendiharat, Co-founder, Deliberativa Spain

  • Geoff Mulgan, Professor of Collective Intelligence, Public Policy and Social Innovation, University College London

  • Paul Natorp, Co-founder, Sager der Samler (Citizen Change) and Founder, Rethink Activism Festival

  • Beth Noveck, Co-founder and Director, GovLab and Chief Innovation Officer, New Jersey Government

  • Arild Ohren, PhD Candidate, Norwegian University of Science and Technology

  • Reema Patel, Head of Public Engagement, Ada Lovelace Institute and Nuffield Foundation

  • Lex Paulson, Founding Director, UM6P School of Collective Intelligence

  • Teele Pehk, Estonian democracy artist & urbanist

  • Tiago Peixoto, Tech & Citizen Engagement Lead, World Bank

  • Sophie Pornschlegel, Senior Policy Analyst, European Policy Centre

  • Alice Rawsthorn, Design critic and author of Design as an Attitude

  • Kyle Redman, Programme Manager, newDemocracy Foundation

  • Gaëtane Ricard-Nihoul, Deputy Head of Citizen Dialogues Unit, European Commission

  • Sam Roberts, Head of Open Data and Open Government Policy, UK Department for Digital, Culture, Media, and Sport

  • Cassie Robinson, Senior Head, UK Portfolio, The National Lottery Community Fund and Co-founder, The Point People

  • Stefan Roch, Program Manager, Bertelsmann Stiftung

  • Matt Ryan, Non-resident fellow, GovLab

  • Vera Sacchetti, Co-creator, TEOK Basel

  • David Schecter, Co-ordinator, Democracy R&D

  • Typhanie Scognamiglio, Director of Participation, Centre de la participation citoyenne, French Inter-ministerial Department for Public Sector Reform

  • Graham Smith, Professor, University of Westminster

  • Paolo Spada, Researcher, Universidade de Coimbra

  • Ellen Stewart, Social Policy and Public Service Reform Officer, Department of the Taoiseach

  • Jane Suiter, Director, Institute for Future Media and Journalism

  • John Tasioulas, Director, Yeoh Tiong Lay Centre for Philosophy, Politics, and Law at King’s College London

  • Matthew Taylor, Chief Executive, RSA

  • Riley Thorold, Global Programme Manager, RSA

  • Clifton Van der Linden, Founder, VoxPopLabs

  • David Van Reybrouck, Author and Founder, G1000

  • Stefaan Verhulst, Co-founder and Chief Research and Development Officer, GovLab

  • Kitty Von Bertele, Europe Officer, Luminate

  • Iain Walker, Director, newDemocracy Foundation

  • Alex Way, Managing Director, MASS LBP

  • Niamh Webster, Digital Lead, Scottish Government

  • Richard Youngs, Senior Fellow, Carnegie Europe

  • Anthony Zacharzewski, Director, Democratic Society

  • Katharina Zuegel, Co-director, Décider Ensemble

Note

1. Lethbridge Citizens' Assembly on Councillor Employment and Compensation, Prince Edward County Citizens’ Assembly, Citizens' Assembly on Social Care, Camden's Citizens' Assembly on the Climate Crisis, and National Assembly for Wales
