1. WhatDoTheyKnow Transparency report

    In 2021 WhatDoTheyKnow users made 100,092 Freedom of Information requests.

    Those requests, and the responses they received, are public on the website for anyone to see. But what’s not quite so visible is the work the WhatDoTheyKnow team do behind the scenes — answering users’ questions, removing inappropriate content and keeping everything ticking over.

    Some of the team’s most difficult calls arise around the removal of information. WhatDoTheyKnow’s guiding principle is that it is a permanent, public archive of Freedom of Information requests and responses, open to all.

    For this reason, the default position is not to remove substantive public information requests and responses; however, we act quickly if problematic content is reported to us. And, to help everyone understand exactly what has been removed and why, where possible we record these details on the request page.

    This year, for the first time, we’re extending our efforts towards transparency even further, with this report in which we’ll summarise the information removal requests and actions taken during the last twelve months.

    To allow for a full 12 months of data, the date range used throughout this report is 1 November 2020 to 31 October 2021.

    Headline facts and figures

    • 20,714,033 visitors to WhatDoTheyKnow.com this year
    • 22,847 new WhatDoTheyKnow user accounts this year, taking the total to 222,694
    • 7,971 email threads in the support inbox in 2021
    • 822 requests hidden from WhatDoTheyKnow in 2021
      …in the context of 100,092 requests made in the year, and a total of 772,971 requests now published on the site
    • 196 published requests with some material redacted in 2021
      …usually due to the inappropriate inclusion of personal information, or defamation.
    • 126 of the users who created accounts this year were banned
      …that’s just 0.55% of new users.

    WhatDoTheyKnow is a project of mySociety, run by a small team of staff and dedicated volunteers.

    And in more detail…

    Requests flagged for our attention

    The table below shows the reasons that requests were reported for admin attention this year. Note that we also receive many reports directly by email, so this table is indicative rather than comprehensive.

    Reason for attention request Total number
    Contains personal information 143
    Not a valid request 108
    Vexatious 94
    Request for personal information 85
    Contains defamatory material 51
    Other 287
    Total 768

    Material removed from the site

    The following tables show where members of the support team have acted to remove or hide requests from WhatDoTheyKnow in the last year, and the reason why.

    There is a range of options available to moderators, from ‘hidden’ (the most extreme) to ‘discoverable with link’. This is in addition to the censor rules that are used to hide certain information within a request or response.

    Request visibility Total number
    Visible only to the request maker 805
    Discoverable only to those who have the link to the request 11
    Hidden 8

    Reason for removing from public view Total number
    Not a valid FOI request 701
    Vexatious use of FOI 29
    Other (reason not programmatically recorded*) 124

    *Current processes do not create an easily retrievable list of reasons beyond the two above, but we are hoping to improve our systems so future transparency reports can include a more detailed breakdown.

    Censor rules (programmatically hiding the problematic part(s) of a request) Total number
    Number of censor rules applied 881
    Number of requests with censor rules applied 196
    Number of requests with censor rules applied which are still publicly visible, but with problematic material hidden 188
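
    To make those two mechanisms concrete, here is a minimal illustrative sketch in Python. It is not the site’s actual code (WhatDoTheyKnow runs on mySociety’s open source Alaveteli platform, which is written in Ruby), and all names and patterns below are hypothetical. Each request carries a visibility level, and censor rules pattern-match and blank out problematic fragments before any text is displayed:

    ```python
    import re
    from enum import Enum

    class Visibility(Enum):
        """The moderation outcomes shown in the tables above."""
        PUBLIC = "visible to everyone"
        REQUESTER_ONLY = "visible only to the request maker"
        LINK_ONLY = "discoverable only with a link to the request"
        HIDDEN = "hidden"

    class CensorRule:
        """Hides one problematic fragment while the rest of the request
        stays public. A hypothetical stand-in, not Alaveteli's model."""
        def __init__(self, pattern, replacement="[redacted]"):
            self.pattern = re.compile(pattern)
            self.replacement = replacement

        def apply(self, text):
            return self.pattern.sub(self.replacement, text)

    def displayed_text(text, rules):
        """Run every censor rule over the text before it is shown."""
        for rule in rules:
            text = rule.apply(text)
        return text

    # Example: a phone number included in a request by mistake.
    rules = [CensorRule(r"\b07\d{9}\b", "[phone number removed]")]
    print(displayed_text("Please call me on 07700900123.", rules))
    # -> Please call me on [phone number removed].
    ```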

    Data protection issues raised to the WhatDoTheyKnow user support inbox 

    The following data shows the number of email threads* received into the WhatDoTheyKnow user support inbox regarding the most common types of concern around information published on the site. Not all issues raised resulted in material being removed from the site.

    GDPR: UK General Data Protection Regulation
    DPA: Data Protection Act

    Label Total number of threads
    GDPR Right to Erasure 317
    Defamation 130
    Data breach 96
    GDPR & DPA concerns (type not specified) 42
    GDPR Right to Rectification 33
    GDPR Right of Access 21
    Harassment 17
    GDPR Right to Object 12
    Data breach – internal** 2
    Impersonation 1
    Total 674

    * Email threads may be either automatically categorised by the system, or manually categorised by the WhatDoTheyKnow support team on the basis of the information given by the person reporting them.

    ** “Data Breach – internal” refers to cases where WhatDoTheyKnow has identified that a data breach may have been caused due to our own staff actions. We take our obligations seriously, and use such instances as a learning opportunity, so these are reported even if very minor, and often when they’re nothing more than a near miss — which both of these cases were.

    High-risk concerns raised for review

    Our policies ensure that certain issues can be escalated for review by the wider team and, where more complex, by a review panel that includes mySociety’s Chief Executive and the Chair of the Trustees. Escalation is typically prompted by threats of legal action, complaints, notifications of serious data breaches, complex GDPR cases, or cases that raise significant policy questions.

    Case type* Total number
    Defamation 66
    GDPR Right to Erasure 42
    Data breach 40
    Complaints 33
    GDPR & DPA concerns 11
    GDPR Right of Access 6
    Harassment 5
    Takedown 2
    GDPR Right to Object 2
    GDPR Right to Rectification 1
    Other 78
    Total 286

    * Email threads may be either automatically categorised by the system, or manually categorised by the WhatDoTheyKnow support team on the basis of the information given by the person reporting them.

    Users

    User accounts Total
    WhatDoTheyKnow users with activated accounts 222,694
    New user accounts activated in 2021 22,847

    Reason for banning users in 2021 Total
    Spam 3,936
    Other site misuse 166
    Total number of users banned in 2021 4,102

    Anonymisation* Total
    Accounts anonymised in 2021 170

    * Where accounts have been anonymised this is at the user’s request, generally to comply with GDPR Right to Erasure requests.

    Users are banned, and their accounts may be closed, when they misuse the site or breach the House Rules. Anonymised and banned users are no longer able to make requests or use their accounts.

    Thank you for reading

    This is the first time we’ve compiled a Transparency Report like this for WhatDoTheyKnow, but it’s something we’ve been wanting to do for some time. We demand transparency from public authorities, and it’s only right that we also practise it ourselves.

    Additionally, we hope that the report goes some way towards showing the kind of work the team do behind the scenes, and demonstrating that moderating a well-used site like WhatDoTheyKnow is not without its challenges.

    In future years, we hope to build on this initial report, ideally automating many of the stats so that they can be seen on a live dashboard. For now, we thought it was worthwhile making a manually compiled proof of concept.

    If there are specific statistics that you’d like to see in subsequent Transparency reports, or you’d like to know more about any of those above, do drop the team a line. They’ll get back to you as soon as the urgent moderation work is done!

    See mySociety’s 2021 annual review

    Image: Create & Bloom

  2. New functionality for FixMyStreet for Councils

    We’re pleased to announce new moderation features for clients of FixMyStreet for Councils.

    This new functionality enables nominated members of staff to edit user reports from within the FixMyStreet front end.

    It’s quick and easy, and allows you to react immediately to unwanted content on your site. Read on to find out more.

    Screenshot of a problematic report in FMS

    What’s wrong with this report?

    So what is wrong with the report in the screenshot above?

    If you run a site on the FixMyStreet platform, you’ll be familiar with this kind of report, and the chances are that you’ll already be twitching to edit it.

    User-generated content is wonderful in many ways – but it can also present problems on a public-facing site. Let’s look at a few of the potential issues in the report above:

    • The user has included his phone number in the report description, and now it’s available for anyone to see.
    • The user’s name is also public. While this is the default option on FixMyStreet, users often get in touch to say that they meant to make their report anonymously (an option on FixMyStreet, but one which the user can only access at the point of submission).
    • There’s an inappropriate photo. This one is a statue of Carl Jung, which obviously has nothing to do with the report. But even relevant photos can be problematic: imagine if it was a graphic depiction of a dead animal, or some rude graffiti.
    • Profanity: in the example above, we’ll imagine that “pesky” is a mild profanity, but experience tells us that users don’t always hold back on their language.

    There are other common problems too, not represented in this report. Users sometimes post potentially libellous information: naming someone they suspect of flytipping, for example, or giving an address where they believe planning permission has been flouted.

    In the run-up to local elections, councils have to be particularly sensitive to any content that might be construed as political – commonly they wish to remove any mention of any candidate.

    Moderation in all things

    New FMS Moderation panel

    Until now, we’ve edited reports for our council clients on request. However, this is clearly a long-winded way of getting sensitive material off the site, especially when time is of the essence.

    So we’ll shortly be introducing the ability for client moderation of sites. Councils or other bodies who run FixMyStreet will be able to nominate trusted users and give them the ability to edit problematic reports from within the report page.

    When logged in, these users will see a “moderate” button on every report – this feature will not be available to any user unless explicitly authorised.

    As you can see, this panel provides the ability to:

    • Hide the report completely
    • Hide the name of the poster
    • Hide or show a photo (if one was originally provided)
    • Edit the title and body of the report.

    For some reports, it might be necessary to make a number of edits, and finally submit the changes:

     

    FMS Moderation in Progress

    The moderator can also add a reason for the changes, so it’s recorded if a colleague needs to know the history of the report in the future.
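
    As a rough sketch of how those panel actions and this audit trail might fit together in code (illustrative only: the names below are hypothetical, and this is not FixMyStreet’s actual implementation), one moderation submission applies any combination of edits and records who made them, when, and why:

    ```python
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class Report:
        title: str
        body: str
        show_name: bool = True    # poster's name is public when True
        show_photo: bool = True
        hidden: bool = False      # True removes the report from public view
        moderation_log: list = field(default_factory=list)

    def moderate(report, moderator, reason, *, hide=False, hide_name=False,
                 hide_photo=False, new_title=None, new_body=None):
        """Apply one submission's worth of edits and record the history,
        so a colleague can later see what changed and why."""
        if hide:
            report.hidden = True
        if hide_name:
            report.show_name = False
        if hide_photo:
            report.show_photo = False
        if new_title is not None:
            report.title = new_title
        if new_body is not None:
            report.body = new_body
        report.moderation_log.append({
            "by": moderator,
            "at": datetime.now(timezone.utc),
            "reason": reason,
        })

    # Example: hide the poster's name and strip a phone number in one go.
    r = Report(title="Pesky pothole", body="Ring me on 07700 900123.")
    moderate(r, moderator="staff_user", reason="personal data in body",
             hide_name=True, new_body="Large pothole outside the library.")
    ```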

    This functionality gives a lot of power to admins to remove inappropriate information – but the user took the time to submit their report, and it’s only fair to let them know it’s been changed. So the system sends them an automatic email, as below:

    FMS Moderation Email

     

    Finally, the system automatically updates the report to show that it has been moderated. As well as a timestamp, it signals where any information has been removed in the title or body of the report.

    FMS Moderation Displayed on Report

     

    Updates can be just as problematic as reports, so the same functionality will apply to them.
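
    Purely as an illustration of the “signals where information has been removed” behaviour described above (a guess at one plausible approach, not FixMyStreet’s actual code), a diff between the original and moderated text can place a marker wherever material was cut. Python’s standard difflib is enough for a sketch:

    ```python
    import difflib

    MARKER = "[…]"  # shown wherever original material was removed

    def mark_removals(original, moderated):
        """Return the moderated text with a marker at each point
        where text from the original was deleted."""
        out = []
        ops = difflib.SequenceMatcher(a=original, b=moderated).get_opcodes()
        for tag, a1, a2, b1, b2 in ops:
            if tag in ("delete", "replace"):
                out.append(MARKER)            # something was cut here
            if tag != "delete":
                out.append(moderated[b1:b2])  # keep surviving/edited text
        return "".join(out)

    before = "Pesky pothole. Ring me on 07700 900123 for directions."
    after = "Pesky pothole."
    print(mark_removals(before, after))
    # -> Pesky pothole.[…]
    ```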

    We’d welcome feedback on this mechanism, so please let us know if you think we’ve missed any features.

    Note: These screenshots are from our work in progress and do not yet display the slick design that we habitually apply right at the end of the build process. Please regard them as preview shots only!