By Diane Warburton, Sciencewise Evaluation Manager
"The result of a public dialogue process is a deep understanding of public views on an issue. It provides a source of evidence that can be used in the policy process. This means stronger policies that are more likely to be accepted by citizens." (Cabinet Office, Open Policy Making Toolkit, March 2015)
Government at the highest levels now recognises how public dialogue, done well, can help ordinary members of the public develop and express their views. "These techniques elicit people’s deeper views on issues and, crucially, what values and understandings underpin those views: they are rich discussions that produce valuable insights into tricky issues from diverse perspectives." (Cabinet Office OPM Toolkit).
The key to achieving all these benefits is in the two words: 'done well'. The new Quality in Public Dialogue assessment framework, published by Sciencewise in March 2015, aims to provide a checklist of all the main questions that need to be answered to build a high quality public dialogue, to stimulate thinking and open up options in design and delivery. It covers issues such as:
• how many participants are 'enough'? How many locations are needed to provide adequate 'coverage'?
• should the role of scientists and other specialists involved in dialogue events be limited to providing information, or should they be participants in the dialogue?
• what role do participants have in reporting dialogue results, and how much can reporting be based on agreements reached collectively with participants?
• what counts as a sufficiently robust process, so that decision makers know how and when to use dialogue results with confidence alongside other forms of evidence?
The framework does not aim to provide a template for all public dialogue; that would be neither feasible nor desirable. Form follows function in dialogue - the methods used always need to be tailored so they are appropriate for the purpose of the work. But we now have the experience and knowledge to be able to identify the most basic design and delivery elements which are common to many public dialogue projects. The framework is focused on those elements and is based on existing well-established quality standards as much as possible, drawing on the most appropriate sets of standards for the different elements of dialogue practice.
Public dialogue is still a relatively new field, and methods and approaches are developing all the time. We want our quality framework to develop with it, reflecting the wider development of the field. We hope it will be used to test the robustness of dialogue methods, for formal evaluations, as an introduction to the basic building blocks of dialogue, and as a contribution to developing ways of assessing the quality of participatory working more widely. We very much welcome feedback, comments and ideas on the content and use of the framework.
In the past there has been some resistance to any form of quality assurance for dialogue, for fear of it being prescriptive, limiting or bureaucratic. But if we want dialogue to play a real role in improving policy, and in strengthening the voice of ordinary citizens in the decisions that will affect life in the future, we need to recognise the need to assess and improve the quality of that dialogue - and that is what we hope the new quality framework will support.
Done badly, dialogue can be a waste of everyone's time, money and energy. It can be frustrating and meaningless. But done well, dialogue can be creative, transformative, inspirational and beautiful - alongside being rigorous, impartial, ethical, relevant, meaningful, practical and simply useful.
We hope that the new quality framework will support even more dialogue being done well. Initial responses have been very positive but please do send us feedback so that we can continue to review and improve the framework over the coming months. We are planning to produce a revised version later this year, which will take account of all comments received.
Please contact Diane Warburton, Sciencewise Evaluation Manager at Diane.Warburton@sciencewise-erc.org.uk.