FAQs
Q: What is a citizens’ jury?
A: A citizens’ jury is a type of long-form deliberative process. Long-form deliberative processes can be distinguished by the following characteristics (Chwalisz, 2017):
- Citizens are tasked with helping to resolve a pressing problem that requires navigating multiple trade-offs and considering more than one possible and realistic solution (and this solution is not pre-determined by those commissioning or facilitating the process).
- The group is small (typically between 24 and 48 citizens) and is randomly selected from a local, regional or national community.
- The group spends an extended period of time (ie a few days spread over the course of two to three months) learning about and discussing a policy issue from different angles.
- Citizens are not asked for their individual opinion on an issue, but to deliberate on behalf of their community with the aim of reaching a consensus or compromise.
- The group produces concrete recommendations for decision-makers, who then respond directly and publicly to the proposals.
Crucially, long-form deliberative processes should not be confused with focus groups or consultations. They are not ‘one-way’ exercises in which citizens are only asked for their own opinions on an issue; rather, they are ‘two-way’ conversations between experts, decision-makers and the public in which ideas are exchanged (and often respectfully challenged) in order to reach a conclusion in a collaborative manner.
Q: How is the jury selected?
A: The jury is made up of 29 citizens, who were selected using a range of criteria. We worked with a recruitment agency, which independently recruited members of the public to meet our specifications. Our primary specifications included recruiting across the nine regions of England and Wales, with slightly more jurors from the more densely populated regions; a 50:50 gender split; a minimum of 20 percent BAME jurors; a minimum of 20 percent of jurors with a disability; as well as recruiting for a range of ages, socio-economic backgrounds, and attitudes towards AI (positive, neutral, and negative).
Q: What is the citizens’ jury doing?
A: Jurors are asked to give their verdict, or answer, in response to a question, much like in a court of law. In this case, the jurors will be answering a specific question that poses a problem, in order to inform state and corporate policies. The question they will be asked is, ‘Under what conditions, if any, is it appropriate to use an automated decision system?’
As part of the process, citizens spend a period of time learning about and discussing the problem within many different contexts. Similar to a traditional jury, expert witnesses are summoned to enhance citizens’ understanding of the different elements of the problem.
Citizens are then asked to enter into an open dialogue, commit to listening to others, and provide responses with consideration for the wider community (in contrast to focus groups and most consultations where individuals are asked for their own opinion). This is to encourage citizens to strive towards a consensus and/or a compromise in the best interests of society, rather than for themselves as individuals.
Finally, the jury draws its conclusions, providing an answer to the question set and a clear steer or recommendation(s) for government, businesses, and civil society organisations to take forward. This answer will take the form of a statement that they will directly present to key decision-makers and influencers.
Q: Why do the conclusions matter if only 25-30 citizens are involved in the process?
A: Some people may question how impactful a public dialogue can be given the scale of the groups assembled for citizens’ juries in particular. There may be concerns about whether such small groups are likely to be representative of one’s own views and values. It is thus important to clarify that there is a distinction between representation and representativeness. We are asking these citizens to represent their community to encourage them to consider more than their own, individual interests, but we are not claiming that they are statistically representative of that community. Rather, we are suggesting that there are relevant insights to be drawn from a diverse group of citizens who are given the opportunity to enter into an informed and deliberative dialogue. Similar logic underpins the use of juries for criminal trials, in which members of the public are chosen to reach a verdict which informs the judge’s decision on appropriate sentencing for the defendant.
Q: Why is the RSA partnering with DeepMind on this project?
A: The RSA is partnering with DeepMind on the Forum for Ethical AI because we share a commitment to encouraging and facilitating public engagement on the real-world impacts of AI.
This project is part of DeepMind’s Ethics & Society programme, which aims to help technologists put ethics into practice, and to help society anticipate and direct the impact of AI so that it works for the benefit of all.
The RSA chose to work with DeepMind because they are both a leader in the field of artificial intelligence and at the forefront of considering the ethical implications of this technology. In the spirit of collaboration, DeepMind is working with a range of partners, including the AI Now Institute at NYU, Oxford Internet Institute’s Digital Ethics Lab, and the Royal Society, to grapple with emerging ethical questions. As one of DeepMind’s partners, the RSA is in a position to make what we hope will be a valuable contribution that addresses some of these questions. As we are a charity, whatever contribution we make will be in the interests of the public good rather than for commercial benefit.
Q: How can the RSA demonstrate that its research is independent?
A: As an independent charity, the RSA relies on a range of private, public and voluntary organisations to fund our research. For complete transparency, we publish a full list of our funders as well as clearly stating our partnerships as part of project pages on our website. Our research would not be possible without the generosity of our partners. However, we are mindful of the risks associated with accepting funding from others and there are protocols in place to ensure that the integrity of our research is not compromised. For example, we retain editorial control over every project, including this one with DeepMind. Our partners are welcome to share their views with the RSA, but on the same basis as every other stakeholder we engage with, including civil servants, policymakers, community groups, NGOs, trade unions, and RSA Fellows.
This particular project has the support of an independent advisory group, drawn from a range of backgrounds, which will help us to maintain the rigour and impartiality of our research. For example, our advisory group assesses the materials and format of the citizens’ juries. We name the members of our advisory group (and a digital reference group who will input on specific outputs) on our website.
Q: This project aligns with my own research and/or interests – how can I get more involved?
A: Thank you for your interest in our project; we really appreciate the enthusiasm and support. This project has a small research team who are currently focused on delivering the citizens’ jury in addition to undertaking accompanying research. While there are currently no formal avenues for getting involved, the team is committed to widely sharing key updates from this project, including any public events or other opportunities to participate as they arise. If you would like to ensure that you are the first to hear from us then please do sign up to our project mailing list (link to sign up form on website).
Q: Is it possible to observe the citizens’ jury?
A: It is not possible to observe the citizens’ jury due to concerns about comfort and constraints of space. We hope to maintain a comfortable environment for the citizens in which they feel able to speak freely, and are not anxious about being judged or scrutinised by others in the room. However, most participants have agreed to be filmed for a short video that will capture the spirit of the process. There are also a number of RSA colleagues acting as note-takers, and we will be using their notes to produce a summary document to be released shortly after the conclusion of the citizens’ jury.
Q: What will happen after the citizens’ jury concludes?
A: The citizens’ jury will reach their conclusions in June 2018. These conclusions will then be tested during two workshops with citizens who may be disproportionately impacted by the use of these systems. There will be a final event in October 2018 involving the citizens and key stakeholders, and the programme will culminate with a report.
Q: Where can I learn more?
A: Please see the RSA’s report Artificial intelligence: Real public engagement for more information on the citizens’ jury. The report also provides an overview of public attitudes towards automated decision-making and delves into the substance of our dialogue with citizens.
Should you have any more questions, please feel free to contact Brhmie Balaram, the programme manager for the Forum for Ethical AI, at brhmie.balaram@rsa.org.uk.