1. Election reform: a solid start with work ahead

    We don’t talk a lot about elections at mySociety: we see our unique contribution as being about the democracy between elections, with a focus on how democratic institutions work (sometimes how they can work better), and the connections between the public and those institutions.

    But whatever part of the democratic system you care about, elections shape everything. Rules about elections not only decide the winners, but set the incentives that all players operate by. In our work on money in politics, we started with broken forms and worked our way back to problems in how elections are financed. There’s a lot of good work to be done on small problems, but we need to keep our eye on the big picture too.

    This blog post looks at the new government’s election strategy (which will be the basis of the forthcoming Elections Bill).

    This strategy provides some certainty around large-scale changes in how UK elections will work in this Parliament, which is good as we’re at the point in the election cycle when the practical work to make these changes happen needs to begin.

    The strategy is a solid foundation that needs to be followed up with complementary support and legislative action through the rest of the Parliament. There are also areas where the plan doesn’t go far enough – and from the outside we need to be building better evidence and campaigns for change.

    Making it easier to use your vote

    The big headline changes are:

    • automatic voter registration
    • expansion of valid Voter ID
    • votes at 16.

    These are all changes that make it easier for people to vote if they want to. 

    It is important that these changes are enabled now to facilitate the substantial work that will take years — both within the electoral administration system, and for civil society. 

    One of the significant hopes for votes at 16 is that it will give more people a first election while they’re in an educational setting that helps them understand and use their new right.  But this doesn’t happen on its own: the Democracy Classroom have outlined the work that is required over the next few years to create the environment where young people not only have the right to vote, but the knowledge and willingness to use it. 

    Election finance and donor caps

    There are good foundations on improved electoral finance laws in the strategy. A key improvement is much stronger regulation of unincorporated associations: creating Know Your Donor requirements, and bringing the transparency thresholds into line with donations to parties and candidates. These transparency thresholds are too high (and have risen over time), but this is an area that could (and should) be addressed through a statutory instrument at a later point.

    What this strategy does not go near is the idea of donation caps to limit the power of large donors. As we explored in our Beyond Transparency report, a key blocker in this area is the comparative lack of public funding of elections in the UK relative to other European and Anglophone countries, combined with a belief that public opinion is an obstacle to a move in that direction (public funding is not popular on its own, but the status quo is also deeply unpopular).

    In the spirit of “what is the work needed now to enable change by the end of the Parliament”, our key recommendation is that either government or civil society actors convene a citizens’ forum/assembly to further unblock this line of argument by creating a better understanding of how the public approach trade-offs in this area. There is good reason to believe that, while exactly what it looks like might be open to debate, increased levels of public funding are possible, enabling donor caps and restricting the uneven influence of wealth on elections. 

    In the absence of this, we are unlikely to make substantial progress within this parliament. We need well-developed answers to the “how are we going to pay for elections then?” question  — and to be building those answers now. 

    The Electoral Commission needs better data infrastructure

    A key part of the new strategy is creating much greater enforcement powers for the Electoral Commission, increasing maximum fines and expanding to cover financial offences by candidates and local third-party campaigners. 

    This implies more resourcing for the Electoral Commission, but also improved data infrastructure.

    For instance, basic infrastructure (such as the Electoral Commission having a list of all candidates covered) does not exist in a formal centralised way. This is currently provided indirectly through Democracy Club’s work using volunteers to source hundreds of different statements. 

    The strategy provides both a need for a better solution, and the framework of how it can happen – through new requirements for Electoral Registration Officers to provide election information to the government and Electoral Commission. Building on this legal foundation will require important work over the next few years to create reliable flows of data to enable both better public participation in elections, and an effective regulator of candidates and parties.

    For investigation of campaign finance offences, the flow of financial information currently has significant problems, not only in terms of external accessibility but in terms of usable data for a regulator to enforce the rules. Our forthcoming report, Leaky Pipes, will explore this problem in more depth, and propose solutions that help support both public transparency and effective regulation.

    Abuse and intimidation

    The strategy includes a number of elements aimed at reducing the incidence and impact of abuse and intimidation of candidates and election staff. These range from privacy measures, such as removing the need for candidates’ home addresses to be published, through making intimidation related to an election an aggravating factor in other offences, to requiring candidate ID checks to avoid sham candidates.

    A key aspect of this in the strategy is better guidance on the different parts of the system, creating a clearer understanding of roles, available protections, thresholds for action, and ways of accessing police support.

    A throughline of the Jo Cox Civility Commission report is the potential high rewards of better joining up and signposting existing systems of support.  A consequence of creating the guidance may be discovering further opportunities for better joined-up procedures. 

    This is another area where improved data infrastructure would enable a range of goals, for instance making it possible for consistent updates to guidance to be sent centrally from the Electoral Commission.

    Managing scope

    A running theme through the strategy is trying to close loopholes that can be abused. However, some loopholes reflect genuine ambiguity that is hard to address without either creating a chilling effect or creating enforcement problems by covering a large number of organisations. One area of concern is the expansion of the requirement for a digital imprint (the name and address of the organisation creating/promoting a viral post) to cover unregistered campaigners, which might bring a huge number of organisations into scope. We would want to see more examination of this as the bill progresses through Parliament.

    Character and truth

    One interesting element of the strategy is support for the recommendation that the current Speaker’s Conference on the security of MPs, candidates and elections establish a code of conduct for campaigning and coordinating cross-party discussions. 

    Given the Speaker’s Conference’s specific interest in s106 (the rule where defaming a candidate’s character can be an election-invalidating offence), discussion of conduct inevitably enters grounds around truthfulness. The problems in agreeing a code of conduct were one of the reasons that the Advertising Standards Authority stopped policing election adverts in 1999 (the lack of regulation here is generally not known by the public).

    Progress here could work as an enabling measure for the ASA to adopt New Zealand style rules on policing political ads. But it could also run into problems finding consensus. It is worth campaigners in this area paying attention to where practical progress is possible, and could be enabled by outside research and campaigning. 

    Electoral Commission independence remains at risk

    This strategy intends to use the government’s power to set the Electoral Commission’s strategy and policy, rather than abolish that power. Labour opposed this power while in opposition. All electoral offences are undermined if a party can take a viable “win at any cost” approach to electoral compliance and then deprioritise enforcement of the rules against it once in government.

    As the Chair of the Electoral Commission said, they “remain opposed to the principle of a strategy and policy statement, by which a government can guide our work. The independence and impartiality of an electoral commission must be clear for voters and campaigners to see, and this form of influence from a government is inconsistent with that role.”

    This is a change that needs to be made through primary legislation. Failing to do this within this parliament will set a norm where two major parties have agreed that electoral enforcement priorities are set by the winner, rather than reflecting an agreed democratic understanding of the rules. 

    Improving candidate/voter privacy

    In the 19th century, when the franchise was heavily restricted, the list of electors was public so that people’s eligibility could be challenged. This has continued to the present day, with a large dataset of all registered electors commercially available to buy from local authorities.

    This is out of sync with modern ideas of privacy, and especially with the risks of these kinds of massive datasets. The opt-out to the open register was added in 2002, and by December 2018 56% of the register had opted out. Its usefulness as a universal dataset has now been broken, and it serves little purpose.

    The election strategy says that for automatic voter registration, opt-out will become the default. We would recommend going further and fully removing the open register.

    This is just one way that electoral transparency is out of step with our current democratic needs. In our forthcoming Leaky Pipes report, we explore how GDPR is used as an obstacle to useful donor transparency, while at the same time a surprising amount of information on small donors is available for public access.

    Image: Ben Allen

  2. Our recipe for cookieless Google Analytics 4

    We use Google Analytics to understand how many people visit our sites and services, though we’re careful not to collect personal information about our users, or to ‘follow’ them around the web.

    With the old version of Google Analytics, we had an approach where we turned off storage (with ‘client_storage: none’) and so did not need to display cookie banners to visitors the first time they visited each of our sites. The downside of our approach was losing access to any figures on unique ‘users’, but we still had pretty good access to generalised page view and event statistics, plus a simpler, banner-free experience for our visitors.
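
    For reference, disabling storage under Universal Analytics looked roughly like this (a sketch based on the gtag.js snippet; ‘UA-XXXXXXX-X’ is a placeholder, not a real property ID):

    ```javascript
    // Universal Analytics config via gtag.js, with all client-side
    // storage disabled: no cookies are set, so no banner is needed.
    gtag('config', 'UA-XXXXXXX-X', { 'client_storage': 'none' });
    ```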

    Unfortunately, this year we’ve had to migrate to Google Analytics 4, because they’ll be turning off the previous version (“Universal Analytics”) in 2024.

    The old ‘client_storage: none’ setting doesn’t work in GA4, so this blog post outlines our new, slightly more complicated approach to running our websites without cookies.

    If you just want the code, you can get it here.

    Backstory

    GA4 represents Google trying to work better with requirements for people to consent to cookies before analytics cookies are turned on. The expected flow is that someone will arrive at a site, see a cookie banner, and hit ‘allow’ or not.

    Because of this, GA4 has more built-in approaches for working with consent settings that control when it sets cookies. Behind the scenes GA4 communicates some stuff anonymously when activated (with enough opt-ins, it starts using this and machine learning to give you fake data for your non-opt-in visitors), and holds back other stuff (eg: custom events) until the fully opted-in version is turned on.

    This fiddly stuff makes it more complicated than before to have a cookieless approach. Running with the ‘analytics_storage: denied’ consent setting doesn’t store cookies, which makes sense. But it also turns off other features—like real time statistics, custom events, and page_views—that could work fine, but don’t, because GA4 assumes you’ll eventually enable ‘analytics_storage’ once the user gives consent. GA4 is effectively useless in this mode.

    How we fix this

    To get the benefits of Google Analytics without letting it set any cookies, we do two things:

    • Feed GA4 a randomly generated client_id with each page load.
    • Intercept and prevent GA4 setting cookies by overriding ‘document.cookie’.

    The GA4 config option accepts a manual ‘client_id’ that lets you effectively make each page view anonymous. The client_id is two 32-bit integers separated by a dot, and generating a random one each time makes each page view independent so we’re not tracking anyone around the site.
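
    As a rough sketch of the client_id format described above (the function name here is ours, just for illustration, and the measurement ID in the comment is a placeholder):

    ```javascript
    // Build a random GA4 client_id: two unsigned 32-bit integers
    // joined by a dot, so every page load looks like a brand new visitor.
    function randomClientId() {
      var left = Math.floor(Math.random() * 4294967296);   // 0 .. 2^32 - 1
      var right = Math.floor(Math.random() * 4294967296);
      return left + "." + right;
    }

    // In the page, pass it to the GA4 config (placeholder measurement ID):
    // gtag('config', 'G-XXXXXXXXXX', { 'client_id': randomClientId() });
    ```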

    Setting a random ‘client_id’ on each page load is philosophically compliant with our privacy stance that we don’t want to track users moving around our sites. But GA4 is still setting cookies without consent (even if they never carry information between views, and they expire quickly), which runs afoul of PECR.

    So, to go further, we monkey patch ‘document.cookie’ so that when GA4 tries to write or read cookies, it’s writing nowhere and reading a random client_id that hasn’t been stored anywhere. A similar example can be seen in this StackOverflow comment. Google thinks it’s reading and writing cookies, but no cookies ever get stored on the user’s device.
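
    Here’s a minimal sketch of that patch (a simplified illustration rather than our exact template code; the helper name and the fake ‘_ga’ cookie value are ours):

    ```javascript
    // Override the 'cookie' property so cookie writes are discarded and
    // reads only ever see a fake _ga cookie carrying a random client_id
    // that was never actually stored anywhere.
    function blockAnalyticsCookies(doc, randomId) {
      var fakeGaCookie = "_ga=GA1.1." + randomId;
      Object.defineProperty(doc, "cookie", {
        get: function () { return fakeGaCookie; },
        set: function () { /* silently drop every write */ },
        configurable: true
      });
    }

    // In the page: blockAnalyticsCookies(document, someRandomClientId);
    ```

    Because the getter always returns the same in-memory string, GA4’s reads stay consistent within a single page view, but nothing survives navigation.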

    The final template can be found in our GitHub repo. Feel free to use it too!

    Replacing ‘{{ GA4_MEASUREMENT_ID }}’ in this template with your own measurement ID gives an approach that still has the ‘users means page views’ issue, but real time statistics, custom events, and page_views work as expected.

    Photo by Vyshnavi Bisani on Unsplash

  3. FOI and Chinese CCTV cameras

    Big Brother Watch is a UK organisation that campaigns to defend civil liberties and privacy. As such, they keep a close eye on the ‘surveillance state’: the degree to which authorities are monitoring citizens, and the ever-more sophisticated technology that enables them to do so.  

    They have recently been investigating the use of Chinese-manufactured CCTV cameras in the UK, submitting Freedom of Information (FOI) requests via WhatDoTheyKnow Pro as one part of their research.

    In doing so they’ve uncovered the full extent to which this technology has permeated our schools and colleges, local and central government and beyond. Linked with information about both the capabilities of the tech, and the ways in which these companies support the human rights violations of the Chinese state, Big Brother Watch’s recent report makes for disturbing reading.

    The campaign has caught the attention of government, and driven a change in policy over their internal usage of cameras. Hearing of this positive outcome, we were keen to talk to Big Brother Watch and find out more about their use of FOI, and their Head of Research & Investigations Jake Hurfurt filled us in.

    What can they see?

    Big Brother Watch opposes the rise of surveillance in this country, pointing out that the UK is one of the most surveilled countries in the world with an estimated 6 million CCTV cameras across the nation. 

    As a society we may have become more accustomed to CCTV in public areas, but most people aren’t aware of recent rapid advances in surveillance functionality — such as facial, gender and even race recognition. And until there is widespread understanding of exactly what these cameras can do, there can’t be informed consent. 

    Because of its affordability, Hikvision is the most common provider of CCTV equipment in this country, with Dahua in second place. Both companies are known to supply surveillance equipment that has been used to target ethnic Uyghur minorities in the Chinese province of Xinjiang.

    Finally, Hikvision software has known vulnerabilities that could be exploited by hackers — and in countries like the US and Italy, breaches have been noted where cameras were found to be ‘communicating with China’.

    That’s the background: so what prompted Big Brother Watch’s investigation?

    Taking a closer look

    Jake says, “We’d become aware of Hikvision’s dominance of the UK market. We had also seen the US and Australia move to restrict Hikvision and Dahua, given their links to the atrocities in Xinjiang and their cybersecurity flaws, and were concerned of the risks this posed to privacy and civil liberties in the UK, so decided to dig deeper.”

    And, given the methods by which other countries had brought about change in this area, they knew they’d need to be looking at public authorities. That meant that FOI, which gives everyone the right to ask public bodies for the information they hold, was an obvious choice: 

    “From the start the project was designed as a public-sector focused investigation. We could see that successful restrictions on state-owned firms in other countries had originated in the public sector. It was clear from the beginning that we would ask questions using FOI.”

    As Big Brother Watch’s report states, “Over 5 months, [we] submitted more than 4,500 Freedom of Information requests to a range of public bodies to establish where Hikvision and Dahua equipment is in use and what advanced capabilities this equipment has. Some subsequent focused requests were submitted to public bodies who confirmed they do use Hikvision cameras.

    “The public bodies included all secondary schools and FE colleges in England, all UK universities, police forces, NHS Trusts, all Oxbridge Colleges, all central government departments, the House of Commons, the three devolved administrations and the Greater London Assembly.”

    A survey of surveillance

    Our WhatDoTheyKnow Pro service is designed for people making large scale investigative FOI requests: it makes it simpler to send FOI requests to specific types of authority in a single batch, and keeps all correspondence private until you are ready to go public.

    Jake says, “We used WhatDoTheyKnow Pro to send an initial wave of requests to police forces, government departments and the largest local authorities in the country, which acted as a scoping exercise to see how widespread Hikvision & Dahua are from a selection of public bodies. 

    “Batch requesting and WhatDoTheyKnow’s built-in reminders of when it’s time to chase up a response have been useful. It was also useful in sending requests to public bodies who are hard to contact, as the contact address database is amazing. 

    “The other helpful feature was the ability to look across requests others have made in the past, to compare responses and see what kind of wording worked and which did not – it’s something I always do when conducting large FOI projects as I can learn from other’s experience in the kind of questions that are successful, or start by knowing more than I would have done otherwise.”

    Getting results

    You can read Big Brother Watch’s full report for all the details, but their investigation revealed that three out of five schools, 60% of hospitals, 31% of police forces, 53% of universities, and 73% of councils use Hikvision and Dahua CCTV cameras, representing, in total, 60.8% of our public bodies.

    Once the full extent of their usage had been quantified, it was time to get the word out to the people capable of bringing about change — the UK’s elected representatives.

    Jake says, “We sent the report to MPs and held an event in Parliament to complement the launch of the report, and we had a good reception.

    “Around publication we also got dozens of Parliamentarians to back a pledge to ban Hikvision, and working alongside other NGOs we have been heavily involved in advocacy around a number of bills to push for this. Big Brother Watch does a lot of advocacy in Parliament and the report has supported these efforts.”

    Subsequently, with Oliver Dowden describing them as “current and future possible security risks”, the Chinese cameras were banned from being installed in or on government buildings.

    Still campaigning

    The government may have cracked down on their own use of these cameras, but that still leaves thousands of institutions and private companies who may not even be aware of the capabilities or implications of the cameras they’ve installed.

    Big Brother Watch are pressing for further action and you can find their campaign page here. If you are concerned, you can add your name to a petition to request that Parliament pass a law to ban this type of CCTV; and you can add sightings of the distinctive cameras to a collaborative Google map.

    Thanks very much to Jake for filling us in on the details of how WhatDoTheyKnow played a part in Big Brother Watch’s wider investigation.

    You might also be interested to read our 2019 post on Privacy International’s investigation into police use of facial recognition.


    Our services help people investigate the vital issues that affect us all. If you're able, please donate to help us keep them running.
  4. That’s how the cookies crumble

    mySociety staff are a varied and talented bunch, and it’s always interesting to find out more about what they get up to outside working hours. While some are tangoing, singing in choirs or foraging, others are sitting on the boards of organisations, or (busman’s holiday alert) developing their own tech.

    But in a new one on us, Rebecca Rumbul, our Head of Research, is now leading a £multi-billion class action-style lawsuit against Oracle and Salesforce, concerning alleged breaches of the GDPR in their advertising technology businesses.

    We are fully behind Bec in her endeavour; but did want to stress that this is something she’s taking forward as a private individual, and not part of her professional role with mySociety. The claim is very interesting, and you can find out more on her personal blog here.

    While we’re talking about privacy, we should restate our commitment to GDPR and protecting the privacy of all our users. We made the decision last year to stop using tracking cookies on our sites and in our newsletters, and we regularly conduct privacy and data protection audits to ensure we are not collecting a scrap more information than we need to.

    That’s all! Now best of luck to Bec and her legal team in their endeavours.

    Image: ev

  5. Who’s checking your Facebook profile?

    If you were putting in a claim for benefits, challenging an accusation in court or phoning in sick to your employer, would you expect your local authority to be checking your social media presence?

    How do you think a stranger might assess you as a parent, were they to skim over any public posts on your Facebook page? If you’ve been on a protest recently, would you be comfortable knowing that your local council was combing through any photos you’ve shared?

    A Freedom of Information investigation by Privacy International, using WhatDoTheyKnow Pro, has discovered that a significant number of local authorities — 62.5% of those responding to their FOI requests — habitually monitor citizens’ Facebook or other social media profiles to gather intelligence.

    What’s more, the majority have no policy in place or measurement of how often and to what extent these investigations occur.

    If this concerns you, the first thing you should do is check that your social media privacy controls are up to date. Then you might like to go and read Privacy International’s full report, as well as checking how (or whether) your own local authority has responded to their requests for information.

    And finally, you can join Privacy International’s call for stronger guidelines from the Investigatory Powers Commissioner.

    Just… maybe think twice about putting it in a public Facebook post?

    We’re only joking, of course. Or half joking.

    Issues like this need to be shared far and wide. But as Privacy International point out, there are already sobering instances from abroad of threats to those following anti-government accounts. With so many completely unexpected changes to the status quo recently, can we say for certain that it could never happen here?

    Image: John Schnobrich


  6. We’re turning off newsletter tracking

    Last month, we sent some of our newsletter subscribers an email to say that we’d noticed they weren’t opening our newsletters and, unless we heard otherwise, we’d be removing them from our mailing list.

    A number of people replied to that email to say, ‘Actually, I do open your emails — I just have image loading turned off so you don’t track my activity’. This was a welcome reminder to examine our own practices.

    Such responses triggered a conversation within mySociety, resulting in a session at our team meeting last week, with the upshot being a decision to turn off personally identifiable tracking on all our newsletters.

    We’ll be able to see how many people visit our blog posts from the newsletter, but that’s all — we won’t be able to associate visits with particular accounts, nor will we know how many of the views are from repeat visits rather than different users.

    This decision reflects mySociety’s position as an organisation concerned about matters of privacy and also feels like it’s part of a wider movement within our sector: we’re not the only ones having this conversation.

    But why did you track opens at all?

    The obvious question, given this statement, is why mySociety were tracking activity in the first place.

    In part, tracking was in place to help manage the financial cost of our newsletters. We use Mailchimp, whose costing structure is based on how many subscribers you have at any one time: you pay more for each subscriber, even if they are not active.

    Sending an email to people who hadn’t opened — or who appeared not to have opened — our emails for over a year was a way of cutting the costs of holding a large database on Mailchimp, and you can only do that if you know who they are.

    For purposes of understanding the impact of our communications, too, it is helpful to be able to see how many people have opened a mailout, what stories have been clicked on, and which have not.

    However, this can be done in a way that doesn’t intrude on the users’ privacy – and, as mySociety team members posited during our team meeting, a cost saving on our side doesn’t justify infringing on privacy.

    Mailchimp’s defaults assume that you want as much information in as granular a form as possible, but they do give you the option to turn tracking off.

    As far as we can see, one has to remember to do this each time a mailout is sent, but we’ll also be contacting Mailchimp to ask them if the relevant boxes could be unchecked by default, or managed at a global level.

    So from now on 

    After our team discussion, we’ve made the immediate decision to turn all tracking off on our newsletters. Thank you if you were one of the people whose feedback spurred us to have this conversation and arrive at this decision.

    And if you’d like to subscribe to our newsletters, you can do so here.

    Image: Jehyun Sung

  7. Help Privacy International discover how the police are accessing your online activity

    You may remember our recent post on the surveillance techniques in use by police forces, as investigated by the campaign group Privacy International.

    Several of you tweeted or commented that you were concerned to read of these new technologies. Well, here’s a way that you can get involved in finding out more.

    Privacy International are asking people to submit an FOI request to their local police force, to enquire whether they are using cloud extraction technology.

    Sounds fluffy? The reality is a bit more chilling. Cloud extraction technology allows the police to use a citizen’s mobile phone to gain access to cloud-based services such as their email, browser activity, and social media. So, if you are stopped and your phone is examined then handed back, surveillance might not stop there. Even after the phone is returned, using this tech police can monitor your online activity on an ongoing basis, seeing what you search for, trawling through your social media posts, and even accessing your location data.

    Whether or not you’ve ever been detained by the police, you might like to know whether this sort of surveillance is in action in your own local neighbourhood. And that’s where FOI comes in.

    To make everything as easy as possible for you, Privacy International have set up pre-filled FOI requests* containing the wording you should include. You can also see which forces have already been contacted, so as not to waste time making duplicate requests. Here’s where to get started.

    Camilla Graham Wood, a Legal Officer at Privacy International, is clear about the benefits WhatDoTheyKnow has brought to their campaigning: “Using WhatDoTheyKnow we have created a way for members of the public to quickly and easily contact their local police force and ask them about intrusive surveillance tech. We were able to embed this on our own website and to pre-fill certain boxes as well as adding a tag so we can follow the progress of the campaign.

    “Engaging the public in this way shows the level of public interest in policing technologies and introduces those who might not have used Freedom of Information request before to this valuable transparency tool”.

    *If you’re running a campaign and you’d like to know how to set up something similar, take a look at this blog post where Gemma explained it all, back in 2016.

    Image: Gilles Lambert

  8. Using WhatDoTheyKnow to uncover how schoolchildren’s data was used to support the Hostile Environment

    Freedom of Information forms the basis of many a campaign that seeks to expose hidden facts, or stories which should be in the public eye.

    We spoke to Jen Persson, Director of defenddigitalme, about that organisation’s tireless campaign to get to the truth on the collection, handling and re-use of schoolchildren’s personal data in England.

    What emerged was a timeline of requests and responses — sometimes hard fought for — which, when pieced together, reveals secrecy, bad practice and some outright falsehoods from the authorities to whom we entrust our children’s data. Perhaps the most striking of the findings was the sharing of data with the Home Office in support of its Hostile Environment policy.

    As Jen describes defenddigitalme’s campaign, “It began with trying to understand how my daughter’s personal information is used by the Department for Education; it became a campaign to get the use of 23 million records made safe”.

    It’s a long tale, but definitely worth the read.

    December 2012: consultations and changes

    The story begins here, although it would still be a couple of years before Jen became aware of the issues around children’s data, “despite — or perhaps because of — having three young children in school at the time”.

    Why did no one at all seem to know where millions of children’s personal data was being sent out to, or why, or for how long?

    As Jen explains, “During the Christmas holidays, the Department for Education (DfE) announced a consultation about changing data laws on how nationally stored school pupil records could be used, proposing that individual pupil-level records could be given away to third parties, including commercial companies, journalists, charities, and researchers. Campaigners raised alarm bells, pointing out that the personal data would be highly identifying, sensitive, and insecure — but the changes went through nonetheless.”

    2014: discovering the power of FOI

    Jen came across that change in law for herself when reading about a later, similar data issue in the press: there were plans to also make available medical records from GP practices. This prompted her first foray into FOI, “to answer some of the questions I had about the plans, which weren’t being published”.

    And that first step got her thinking:

    “At around the same time I asked the DfE a simple question, albeit through a Subject Access rather than FOI request: What personal data do you hold about my own child?

    “My Subject Access request was refused. The Department for Education would not tell me what data they held about my children, and as importantly, could not tell me who they had given it to.

    “There was nothing at all in the public domain about this database the DfE held, beyond what the campaigners in 2012 had exposed. It wasn’t even clear how big it was. How was it governed? Who decided where data could be sent out to and why? How was it audited and what were the accountability mechanisms? And why was the DfE refusing its lawful obligations to tell me what they held about my daughter, let me correct errors, and know where it had gone? Why did no one at all seem to know where millions of children’s personal data was being sent out to, or why, or for how long?

    “Prior to all this, I’d never even heard of Freedom of Information. But I knew that there was something wrong and unjust about commercial companies and journalists being able to access more personal data about our children than we could ourselves.

    “I needed to understand how the database operated in order to challenge it. I needed to be able to offer an evidenced and alternative view of what could be better, and why. FOI was the only way to start to obtain information that was in the public interest.

    “I believed it should be published in the public domain. WhatDoTheyKnow is brilliant at that. I feel strongly that if I am going to ask for information which has a cost in time and capacity in the public sector, then it should mean the answers become available to everyone.”

    And so Jen went on a crash course to learn about FOI, reading books by Heather Brooke and Matthew Burgess, and WhatDoTheyKnow’s own guidance pages.

    “I tried to ask for information I knew existed or should exist, that would support the reasons for the changes we needed in data handling. I worded some questions badly. I learned how to write them better. And I’m still learning.”

    2015: sharing children’s personal data with newspapers

    That was just the beginning: at the time of writing, Jen has made over 80 FOI requests in public via WhatDoTheyKnow.com.

    Through FOI, defenddigitalme has discovered who has had access to the data about millions of individuals, and on what grounds, finding such astonishing rationales as: “The Daily Telegraph requested pupil-level data and so suppression was not applicable.” The publication “wished to factor in the different types of pupil” attending different schools.

    Jen explains: “This covered information on pupil characteristics related to prior attainment: gender, ethnic group, language group, free school meal eligibility (often used as a proxy for poverty indicators) and SEN (Special Educational Needs and disability) status, which were deemed by the Department to be appropriate as these are seen as important factors in levels of pupil attainment.”

    But with such granular detail, anonymity would be lost and the DfE were relying only on “cast iron assurances” that the Telegraph would not use the data to identify individuals.

    2016: sharing children’s nationality data with the Home Office

    In a Written Question put by Caroline Lucas in Parliament in July 2016, the Minister for Education was asked whether the Home Office would access this newly collected nationality data. He stated: “the data will be collected solely for the Department’s internal use […]. There are currently no plans to share the data with other government departments unless we are legally required to do so.”

    But on the contrary: defenddigitalme’s subsequent requests would disclose that there was already a data sharing agreement to hand over data on nationality to the Home Office, for the purposes of immigration enforcement and to support the Hostile Environment policy.

    Jen says: “As part of our ongoing questions about the types of users of the school census data, we’d asked whether the Home Office or police were recipients of pupil data, because it wasn’t recorded in the public registry of data recipients.

    “In August 2016, an FOI response did confirm that the Home Office was indeed accessing national pupil data; but to get to the full extent of the issue, we had to ask follow-up questions. They had said that “since April 2012, the Home Office has submitted 20 requests for information to the National Pupil Database. Of these 18 were granted and 2 were refused as the NPD did not contain the information requested.”

    “But the reply did not indicate how many people each request was for. And sure enough, when we asked for the detail, we found the requests were for hundreds of people at a time. Only later again did we get told that each request could be for an agreed maximum of 1,500 individuals, a policy set out in an agreement between the Departments which had started, in secret, in 2015.

    “On the afternoon of the very same October day that the school census collected nationality data for the first time, this response confirmed that the Home Office had access to previously collected school census pupil data including name, home and school address: “The nature of all requests from the Police and the Home Office is to search for specific individuals in the National Pupil Database and to return the latest address and/or school information held where a match is found in the NPD.”

    The Home Office had requested data about dependents of parents or guardians suspected of being in the country without leave to remain.

    “In December 2016, after much intervention by MPs, including leaked letters, and FOI requests by both us and — we later learned — by journalists at Schools Week, the government published the data sharing agreement that they had in place and that was being used”.

    The agreement had been amended in October 2016 to remove the line on nationality data, and it allowed the data to be matched with Home Office information. It had also been planned to deprioritise the children of those without leave to remain when allocating school places, shocking opposition MPs who described the plan variously as “a grubby little idea” and, simply, “disgusting”.

    Other campaigners joined the efforts as facts started to come into the public domain. A coalition of charities and child rights advocates formed under the umbrella organisation of Against Borders for Children, and Liberty would go on to support them in preparing a judicial review. ABC organised a successful public boycott, and parents and teachers supplied samples of forms that schools were using, some asking for only non-white British pupils to provide information.

    Overall, nationality was not returned for more than a quarter of pupils.

    2017: behind the policy making

    Through further requests defenddigitalme learned that the highly controversial decision to collect nationality and country of birth from children in schools — which came into effect from the autumn of 2016 — had been made in 2015. Furthermore, it had been signed off by a little-known board which, crucially, had been kept in the dark.

    “I’d been told by attendees of the Star Chamber Scrutiny Board meeting that they had not been informed that the Home Office was already getting access to pupil data when they were asked to sign off the new nationality data collection, and they were not told that this new data would be passed on for Home Office purposes, either. That matters in my opinion, because law-making relies on accountability to ensure that decisions are just. It can’t be built on lies”, says Jen.

    The process of getting hold of the minutes from that significant meeting took a year.

    Jen says, “We went all the way through the appeals process, from the first Internal Review, then a complaint to the Information Commissioner. The ICO had issued a Decision Notice that meant the DfE should provide the information, but when they still refused the next step was the Information Rights First Tier Tribunal.

    “Two weeks before the court hearing was due, the DfE eventually withdrew its appeal and provided some of the information in November 2017. Volunteers helped us with the preparation of the paperwork, including folk from the Campaign for Freedom of Information. It was important that the ICO’s decision was respected.”

    2018: raised awareness

    In April last year, the Department confirmed that Nationality and Country of Birth must no longer be collected for school census purposes.

    However, Jen says, “Children’s data, collected for the purposes of education, are still being shared monthly for the purposes of the Hostile Environment. There’s a verbal promise that the nationality data won’t be passed over, but since the government’s recent introduction of the Immigration Bill 2018 and the immigration exemption in the Data Protection Act, I have little trust in the department’s ability, in the face of Home Office pressure, to keep those promises.

    “The Bill includes a blanket sweeping away of privacy rights, highlighted by the 3 Million campaign, again thanks to FOI: every EU citizen applying for Settled Status must accept a Privacy Policy that allows their data to be shared with “public and private sector organisations in the UK and overseas”.

    “Disappointingly”, says Jen, “the government has decided, instead of respecting human rights to data protection and privacy, to create new laws to work around them.

    “It’s wrong to misuse data collected for one purpose and on one legal basis entrusted for children’s education, for something punitive. We need children in education, it’s in their best interests and those of our wider society. Everyone needs to be able to trust the system.

    “That’s why we support Against Borders for Children’s call to delete the nationality data.

    “A positive overall outcome, however”, she continues, “is that in May 2018, the Department for Education put the sharing of all pupil level data on hold while they moved towards a new Secure Access model, based on the so-called ‘5-Safes’. The intention is to give third parties access to the data, rather than distributing the data itself. The Department resumed data sharing in September but with new policies on data governance, working hard to make pupil data safer and meet ‘user needs’. The direction of travel, to manage data for good, is the right one.”

    2019: Defenddigitalme continues to campaign

    Defenddigitalme has come a long way, but they won’t stop campaigning yet.

    Jen says, “Raw data is still distributed to third parties, and Subject Access, where I started, is still a real challenge.

    “The Department is handing out sensitive data, but can’t easily let you see all of it, or make corrections, or tell you for sure which bodies it was given to. Still, that shouldn’t put people off asking about their own or their child’s record, opting out of the use of their individual record (for over-14s and adult learners), and demanding respect for their rights and better policy and practice. The biggest change needed is that people should be told where their data goes, who uses it, for how long, and why.

    “Access to how government functions and the freedom of the press to be able to reveal and report on that is vital to keep the checks and balances on systems we cannot see. We rely on a strong civil service to work in the best interests of the country and all its people and uphold human rights and the rule of law, regardless of the colour of government or their own beliefs. People working with FOI is really important, even and perhaps especially when it doesn’t make the press, but provides better facts, knowledge, and understanding.

    “FOI can bring about greater transparency and accountability of policy and decision making. It’s then up to all of us to decide how to use that information, and to act on it if the public are being misled, if decisions are unjust, or if hidden policy and practice will be harmful to the public; it shouldn’t be left only to those in power to decide what the public interest is.

    “WhatDoTheyKnow is a really useful tool in that. Long may it flourish.”

  9. Now you’re even more secure on FixMyStreet

    All mySociety websites have strong security: when you think about some of the data we’re entrusted with (people’s private correspondence with their MPs, through WriteToThem, is perhaps the most extreme example, but many of our websites also rely on us storing your email address and other personal information) then you’ll easily understand why robust privacy and security measures are built into all our systems from the very beginning.

    We’ve recently upped these even more for FixMyStreet. Like everyone else, we’ve been checking our systems and policies ahead of the implementation of the new General Data Protection Regulation in May, and this helped us see a few areas where we could tighten things up.

    Privacy

    A common request from our users is that we remove their name from a report they made on FixMyStreet: either they didn’t realise that it would be published on the site, or they’ve changed their mind about it. When you submit your report, there’s a box you can untick if you would like your report to be anonymous:

    FixMyStreet remembers your preference and applies it the next time you make a report.

    In any case, now users can anonymise their own reports, either singly or all at once. When you’re logged in, just go to any of your reports and click ‘hide my name’. You’ll see both options:

    [Image: a report of puddles on the undercliff walk on FixMyStreet]

    Security

    Security for users was already very good, but with the following improvements it can be considered excellent!

    • All passwords are now checked against a list of the 577,000 most common choices, and any that appear in this list are not allowed.
    • Passwords must now also be of a minimum length.
    • If you change your password, you have to input the previous one in order to authorise the change. Those who haven’t previously used a password (since it is possible to make a report without creating an account) will receive a confirmation email to ensure the request has come from the email address given.
    • FixMyStreet passwords are hashed with an algorithm called bcrypt, which has a built-in ‘work factor’ that can be increased as computers get faster. We’ve bumped this up.
    • Admins can now log a user out of all their sessions. This could be useful for example in the case of a user who has logged in via a public computer and is concerned that others may be able to access their account; or for staff admin who share devices.
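    The password rules above can be sketched in code. This is an illustrative Python sketch, not FixMyStreet’s actual implementation: the `COMMON_PASSWORDS` set, the `MIN_LENGTH` value and the iteration count are all stand-ins for the real values, and stdlib PBKDF2 is used in place of bcrypt purely because it demonstrates the same adjustable work-factor idea without a third-party dependency.

```python
import hashlib
import os

# Stand-in for the real list of ~577,000 common passwords.
COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

MIN_LENGTH = 8  # hypothetical minimum; the real value may differ


def validate_password(password: str) -> bool:
    """Reject passwords that are too short or too common."""
    if len(password) < MIN_LENGTH:
        return False
    if password.lower() in COMMON_PASSWORDS:
        return False
    return True


def hash_password(password: str, iterations: int = 200_000):
    """Hash with an adjustable work factor.

    FixMyStreet uses bcrypt; PBKDF2 is used here only to show the
    same principle: the iteration count can be raised over time as
    hardware gets faster, keeping stored hashes expensive to crack.
    """
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest, iterations
```

For example, `validate_password("letmein")` is rejected because it appears in the common-password list, while a long unique passphrase passes both checks.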


    Image: Timothy Muza (Unsplash)

  10. UK public bodies accidentally release private data at least once a fortnight

    Private data, containing personal details of the general public, is accidentally released by public authorities at least once a fortnight, say mySociety.

    The volunteer team behind WhatDoTheyKnow, mySociety’s freedom of information website, have dealt with 154 accidental data leaks made by bodies such as councils, government departments and other public authorities since 2009, and these are likely to represent only the tip of the iceberg.

    On the basis of this evidence, we are again issuing an urgent call for public authorities everywhere to tighten up their procedures.

    How WhatDoTheyKnow works

    Under the Freedom of Information Act, anyone in the UK may request information from a public body.

    WhatDoTheyKnow makes the process of filing an FOI request very easy: users can do so online. The site publishes the requests and their responses, creating a public archive of information.

    Public authorities operate under a code of conduct that requires that personal information be removed or anonymised before data is released: for example, while the number of people on a council housing waiting list may be derived from a list including names, addresses and reasons for housing need, the information provided should not include those details.
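    As a rough illustration of that principle, the sketch below (in Python, with entirely invented field names and records) releases only the aggregate figure, never the underlying personal fields:

```python
# Hypothetical records as a council might hold them internally.
waiting_list = [
    {"name": "A. Example", "address": "1 High St", "need": "overcrowding"},
    {"name": "B. Example", "address": "2 Low Rd", "need": "homelessness"},
]


def safe_response(records: list) -> dict:
    """Release only the count derived from the records.

    None of the personal fields (name, address, reason for housing
    need) appear anywhere in the returned response.
    """
    return {"people_on_waiting_list": len(records)}
```

The point is that the derivation happens before release: the requester receives the number, not the list it was counted from.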

    Accidental data releases become particularly problematic when the data requested concerns the details of potentially vulnerable people.

    Hidden data is not always hidden

    When users request information through WhatDoTheyKnow, it’s often provided in the form of an Excel spreadsheet. But unfortunately, private data is sometimes included on those spreadsheets, usually because the staff member who provides it doesn’t understand how to anonymise it effectively.

    For example, data which is in hidden tabs, or pivot tables, can be revealed by anyone who has basic spreadsheet knowledge, with just a couple of clicks.
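    To see how little protection ‘hidden’ offers, note that an .xlsx file is simply a zip archive of XML files, and a sheet’s hidden state is a single attribute in its workbook.xml. The following illustrative sketch (Python standard library only; the example workbook is invented) lists the sheets that are marked as hidden:

```python
import xml.etree.ElementTree as ET

# Namespace used by workbook.xml inside any .xlsx archive.
NS = {"s": "http://schemas.openxmlformats.org/spreadsheetml/2006/main"}


def hidden_sheets(workbook_xml: str) -> list:
    """Return names of sheets marked hidden or veryHidden.

    In a real file, workbook_xml would be read from the
    xl/workbook.xml entry of the .xlsx zip archive.
    """
    root = ET.fromstring(workbook_xml)
    return [sheet.get("name")
            for sheet in root.findall("s:sheets/s:sheet", NS)
            if sheet.get("state") in ("hidden", "veryHidden")]


# A minimal invented workbook: a published summary tab plus a
# hidden tab still containing the source data.
EXAMPLE = """<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">
  <sheets>
    <sheet name="Summary" sheetId="1"/>
    <sheet name="Source data" sheetId="2" state="hidden"/>
  </sheets>
</workbook>"""
```

Here `hidden_sheets(EXAMPLE)` finds the "Source data" tab: anyone who receives the file can do the same, which is why hiding a tab is not anonymisation.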

    By its very nature, data held by our public authorities can be extremely sensitive: imagine, for example, lists of people on a child protection register, lists of people who receive benefits, or as happened back in 2012, a list of all council housing applicants, including each person’s name and sexuality.

    Our latest warning is triggered by an incident earlier this month, in which Northamptonshire County Council accidentally published data on over 1,400 children, including their names, addresses, religion and SEN status. Thanks to the exceptionally fast work of both the requester and the WhatDoTheyKnow volunteers, it was removed within just a few hours of publication, and the incident has been reported to the Information Commissioner’s Office. Concerned residents should contact the ICO or the council itself.

    Advice for FOI officers

    Back in June 2013, we set out the advice that we think every FOI officer should know. That advice still stands:

    • Don’t release Excel pivot tables created from spreadsheets containing personal information, as the source data is likely to be still present in the Excel file.
    • Ensure those within an organisation who are responsible for anonymising data for release have the technical competence to fulfil their roles.
    • Check the file sizes. If a file is a lot bigger than it ought to be, it could be that there are thousands of rows of data still present in it that you don’t want to release.
    • Consider preparing information in a plain text format, e.g. CSV, so you can review the contents of the file before release.
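    As a rough sketch of the last two bullets, a hypothetical pre-release check might flag suspiciously large files and verify the number of rows actually present in a CSV before it goes out. The `SIZE_LIMIT` threshold and the function itself are invented for illustration, not official guidance:

```python
import csv
import io

SIZE_LIMIT = 100_000  # bytes; arbitrary example threshold


def check_release(contents: bytes, expected_rows: int) -> list:
    """Return a list of warnings about a CSV due for release.

    `contents` is the raw file body; real code would read it from
    disk. A file much larger than expected, or containing more rows
    than the answer should need, may still hold source data.
    """
    warnings = []
    if len(contents) > SIZE_LIMIT:
        warnings.append(
            f"file is {len(contents)} bytes; check for leftover source data")
    reader = csv.reader(io.StringIO(contents.decode("utf-8")))
    rows = sum(1 for _ in reader) - 1  # minus the header row
    if rows != expected_rows:
        warnings.append(f"expected {expected_rows} data rows, found {rows}")
    return warnings
```

A two-row summary that comes back clean passes; the same file checked against a different expectation produces a warning that prompts a human review before anything is published.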

    Part of a larger picture

    Not every FOI request is made through WhatDoTheyKnow—many people will send their requests directly to the public authority. Moreover, we can only react to the breaches that we are aware of: there are, in all probability, far more which remain undiscovered.

    But because WhatDoTheyKnow publishes responses on the site, making information accessible to all, it’s now possible to see how endemic this kind of mishandling of personal data is.

    When we come across incidents like these, we act very rapidly to remove the personal information. We then inform the public authority who provided the response. We encourage them to self-report to the Information Commissioner’s Office, and where the data loss is very serious, we may make an additional report ourselves.

    Image: Iain Hinchliffe (CC)