Providing tech for effective crowdsourcing

An important idea in mySociety’s history and work is that combining people-power and technology is a powerful method to make change.

Recently we’ve had the chance to work with Climate Emergency UK on the council climate scorecards. As part of this we’ve been learning from how CE UK works with volunteers, and thinking about how best we can act as a technical partner to crowdsourcing projects.

But we’ve also been thinking about how we can better incorporate crowdsourcing into our core services. This has led us to improve our approach to crowdsourcing within WhatDoTheyKnow, and to develop a new platform (GRACE) to help volunteers crowdsource information for the scorecards.

Developing our technical approach

In the first round of the Council Climate Scorecards, back in 2021, the process was managed on a big Google Sheet. Each answer to a question had its own row, which recorded the answer from a dropdown alongside any notes, and macros populated and moved information around. This worked, and let us get started at the pace we wanted, but it introduced a lot of room for error, and meant more work was needed at the end of the process to validate the data.

For the second round of scorecards, with a more complicated set of questions, we wanted to move away from the spreadsheet approach to something that could validate the data as we went, and that integrated the right of reply process into the same platform.
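To give a flavour of what “validate the data as we went” means in practice, here is a minimal sketch of entry-time validation. The question IDs and answer options are made up for illustration; this is not GRACE’s actual code.

```python
# Minimal sketch of entry-time validation -- question IDs and options
# here are hypothetical, not taken from GRACE.

VALID_ANSWERS = {
    "q1_net_zero_target": {"Yes", "No", "Partial"},
    "q2_climate_officer": {"Yes", "No"},
}

def validate_answer(question_id: str, answer: str) -> None:
    """Reject an answer immediately if it isn't one of the allowed options."""
    allowed = VALID_ANSWERS.get(question_id)
    if allowed is None:
        raise ValueError(f"unknown question: {question_id}")
    if answer not in allowed:
        raise ValueError(
            f"{answer!r} is not a valid answer to {question_id}; "
            f"expected one of {sorted(allowed)}"
        )

validate_answer("q1_net_zero_target", "Yes")       # accepted
try:
    validate_answer("q1_net_zero_target", "yes!")  # rejected at entry
except ValueError as err:
    print(err)
```

A check like this catches the inconsistencies that a free-form spreadsheet cell quietly accepts, which is what removes the big validation job at the end of the process.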

Working with CE UK, we built a crowdsourcing platform, GRACE (named after CE UK’s first volunteer), to replace the old spreadsheet approach.

The overall process worked like this:

  • Working with their advisory group, CE UK created questions across a range of different areas, with different sections for different kinds of councils.
  • These questions and data from FOI responses were uploaded into the platform.
  • Volunteers were given access to the platform, and could answer questions in the topic areas assigned to them. Each answer included a response chosen from a dropdown, a public “evidence” field, and a private notes field shared between volunteers (there’s a rough sketch of this structure after this list).
  • Overall progress across all areas could be seen by the volunteer coordinators.
  • Local authorities were given a right of reply per question. Emails were sent to all local authorities, giving them access to the platform to review the scores and evidence for their section. This gave them the opportunity to submit new evidence where they felt the score did not reflect their actions. 74% submitted responses to the right of reply.
  • Responses from the right of reply were reconciled with the original answers by a group of volunteers and CE UK.
  • This data could then be reviewed and loaded into councilclimatescorecards.uk.
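To make the pieces above concrete, here is a rough sketch of the record that might sit behind each council/question pair. This is illustrative Python, not GRACE’s actual schema; all field names are assumptions.

```python
# Hypothetical sketch of the record behind each council/question pair --
# field names are illustrative, not GRACE's actual schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Answer:
    council: str
    question_id: str
    response: str                  # chosen from the question's dropdown
    evidence: str = ""             # public: links/justification for the score
    notes: str = ""                # private: shared between volunteers only
    # Right of reply, filled in by the council:
    council_agrees: Optional[bool] = None
    council_evidence: str = ""

answer = Answer(
    council="Example District Council",
    question_id="q1_net_zero_target",
    response="Yes",
    evidence="https://example.gov.uk/climate-action-plan",
    notes="Checked against the 2022 plan update",
)
```

Keeping each answer in a structured record like this is what makes the rest of the process possible: progress can be counted per volunteer, council or section, and a council’s right of reply sits alongside the original answer rather than in a separate spreadsheet.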

How did CE UK find the tool? Isaac Beevor, Co-director of CE UK:

GRACE – our online data collection system – was incredible. We would not have been able to collect the thousands of data points on local authorities’ climate action without it. It was simple to use and effective so all of our volunteers were able to use it, with minimal training. Furthermore, it allowed us to coordinate and track progress. This meant we were able to encourage volunteers who needed it and track progress in sections allowing us to plan ahead for staff and volunteer capacity. We then used it to allow access for councils in the Right of Reply and all of the feedback from officers, particularly its ease of use, was positive. The best thing is we can now use this for every Scorecards, which saves us so much time and energy.

Effective crowdsourcing

As we wrote in our report about fragmented public data, the issue with the idea of ‘armchair auditors’ is not that they do not exist, but that they have been thought about in the wrong way. People can and do use their time to support civic accountability by looking at spreadsheets. But they need to be given support and structure to work together effectively.

Reflecting on the CE UK process and other crowdsourcing approaches we admire, like Research for Action and Democracy Club, we see that what these projects have in common is knowledge sharing and collaboration across the country, with volunteers themselves contributing a local or specialist focus. “Armchair auditors” aren’t atomised individuals, but work together as a community.

The Effective Crowdsourcing diamond (below) describes the aspects we think are key to successful projects, and helps us shape our thinking about future crowdsourcing/citizen science projects.

A blue diamond with the text “Effective crowdsourcing: working together to answer big questions”. Its four points read: Volunteers – a pool of motivated supporters break the problem up; Expertise – experts and advisors to frame the questions to ask; Technology – specialist tools streamline information gathering and quality assurance steps; Impact – clear paths from gathering data to making change.

In this framework, an effective process will have four features that all interrelate and reinforce each other:

  • Expertise – Helping move from high level principles to concrete questions and approaches, which inform how work can be split up for volunteers and what technical tools are needed.
  • Volunteers – People who care about an issue, with the skills to contribute to answering questions.
  • Technology – Makes it easier for people to work together, while validation and cross-checking improve the accuracy of the process, enabling more complex approaches.
  • Impact – What is the path from the result of a project to changing the world? Clear routes to impact mean clearer benefits of engaging for experts and volunteers.

For Scorecards and other projects, we have provided the technical side of the diamond. But there is lots of potential for us to run crowdsourcing projects that improve our core Democracy and Transparency services. Starting with the tools we now have, we’re thinking about how we can strengthen our skills and approaches across the diamond – and how we might partner on projects to bring in other expertise and skills.

GRACE screenshots

Initial volunteer screen showing sections assigned:

Screenshot of a page showing the sections assigned to a specific individual and their completion rates, with a progress bar next to each section.

Screen showing councils and progress

Screenshot of a page showing the overall progress for a section. It shows, by council, the number of questions completed for this group of questions.

Marking screen

This shows the marking screen: a form capturing the answer to a question, with boxes and links for evidence.

Right of Reply screen

This is the right of reply screen visible to councils. It shows the answers to each question in uneditable boxes, while providing options to agree or disagree with the response and to submit further evidence.

Admin screen for checking how much has been assigned to volunteers

Volunteer management screen, showing how many sections are currently assigned to volunteers

Admin screen to show progress on an authority level

Progress screen, showing the number of questions completed by council, and the overall progress

Or on a section level

Progress screen by question section, showing the number of councils with a complete set of questions for that section.