1. Parents for Inclusive Education are on a mission — with the help of FOI

    How do you bring about systemic change within structures that are embedded into the national culture? That’s a big question, but it’s one that users of our Freedom of Information site WhatDoTheyKnow are often tussling with.

    One place to start is with data that helps you map the current state of affairs, and FOI can be the perfect medium for getting hold of that. When we spoke to Jack Russell from Parents for Inclusive Education (PfIE), a grassroots organisation of primary school parents in Northern Ireland, he explained the value of data very well: “it means you can start a conversation”.

    So, what are PfIE trying to achieve?

    “We came together because we want to see a more inclusive primary education for every child” – and they’re starting with religious education.

    “We realised that, for many parents, there was a lack of clarity around how RE is delivered in Northern Ireland, and what rights parents have in this area.”

    PfIE wanted to gather data on who comes into schools to deliver RE lessons, collective worship and assemblies. Their aim was to achieve an accurate, representative picture of practices across Northern Ireland, as opposed to their baseline assumptions which, as they admit, had up until then been based on anecdotal evidence.

    From small beginnings

    And so began a large-scale FOI project — although initially the team had much more modest plans: 

    “At first, we were only going to contact our own schools to ask them who was given access and how this was communicated. 

    “But then we realised that other parents might want to be informed about these practices at their schools — and they were entitled to answers too. So we decided to send a Freedom of Information request to every publicly funded primary school in Northern Ireland, apart from special schools: that was 772 in total.”

    The organisation had some tech expertise amongst its members, and, as they explained, at first it seemed that WhatDoTheyKnow wouldn’t quite be suitable for their needs:

    “One of our team — Laura — had successfully used WhatDoTheyKnow in the past to query hospitals about their waitlist times for outpatient appointments, so she suggested using it. But after some initial research, we decided not to, as we’d wanted to include attachments and links in our requests. 

    “I’d written a script to batch send them all, but it turned out that these were heavily spam filtered by the schools’ email server, so we fell back on WhatDoTheyKnow.

    “I’m really glad we did, as the fact that all correspondence will be public is a huge plus for us.”

    Managing batch FOI requests

    So, how did PfIE manage their 772 FOI requests? They signed up for our WhatDoTheyKnow Pro service, which is designed specifically to help keep track of large batches like this, and also allows users to keep their requests and responses private until they’re ready to release their findings.

    “We focused our questions around two areas. First, access: which churches and religious organisations were being given access to schools, and how that access was managed via processes and/or controls. Secondly, communication: whether and how parents were made aware of religious visitors, and whether they were informed about the options to withdraw their children from religious practices.

    “We asked 14 questions in total, some of which were yes/no or multiple choice, others which required free-form answers.”

    FOI allows the request-maker to specify the format they’d like to receive their responses in, which can save a lot of data-cleansing further down the line. As Jack acknowledges, “we received submissions back from schools in varied formats, including Word and PDF attachments, and also as plain or rich text email replies.”

    It was all useful, though. “The data we collected provides us with an objective, fully representative sample — we had a 99% reply rate — to gain an accurate understanding of RE practices in Northern Ireland primary schools. 

    “We understand this response level to be unprecedented, according to academics we’ve spoken to who have conducted similar research. Our project is primarily focused on making data transparently available to parents, so from this perspective the 99% figure is hugely encouraging. It also means that any aggregate conclusions we draw are as close to being unbiased as possible — we actually have a response rate that is higher than the NI Census 2021 (97%), which people were legally required to complete.”

    Tenacious in the face of challenges

    Getting to this gratifying result wasn’t all plain sailing, though. Jack explained the issues they encountered along the way:

    “Some schools initially mistrusted the FOI request email that came through WhatDoTheyKnow, and didn’t know whether they had to reply. However, a couple of weeks after we sent the request out, the Northern Irish Education Authority issued guidance instructing schools to reply, providing an information document and template response.”

    In any large batch of FOI requests there will be a variety of levels of response, and PfIE came across this too. 

    “There were non-responses, partial responses, and responses showing an incorrect understanding of the question. Our first technique to remedy these was to follow up via WhatDoTheyKnow, which provided alerts and tools that made this very easy to do — another reason I’m very glad we went with the platform!”

    Fortunately, the FOI Act has a provision for dealing with non-responders: referring them to the Information Commissioner’s Office.

    “For persistent non-repliers, we contacted the ICO, who very diligently helped us further encourage schools to respond.

    “But several of the schools that responded late, following an ICO decision notice, sent their responses to our own email account, meaning that the responses didn’t appear on WhatDoTheyKnow. The team at WhatDoTheyKnow were very helpful in adding these: I sent through several batches of .eml files and they made sure they appeared within the conversation.”

    On a mission

    So how will PfIE be sharing their findings? They are launching a report today, On A Mission, with an event at Stormont. They’ve also created an online map to help people explore the data.

    But they’re not stopping there: “After releasing the findings of our report, we plan to create resources and a set of best practices for schools to achieve a more inclusive RE experience for all students. We also plan to engage and empower parents, hopefully promoting a sense of transparency and open dialogue between the school and parental community.

    “Beyond this, we have several other plans to empower parents, increase transparency and improve the education system in Northern Ireland”.

    And that’s how you start to make change

    PfIE have used the mechanism available to them to produce exactly the outcome they were after.

    “The tools provided by mySociety, together with help from the ICO in chasing up the late responders, and the cooperation of the NI Education Authority in doing the same have been invaluable in achieving this level of response,” says Jack.

    “We would definitely recommend WhatDoTheyKnow. The tools have been really useful in managing a large scale request, and the fact that all correspondence will be publicly searchable and visible is invaluable: it adds a great deal of credibility to our research by effectively underwriting our findings with an auditable trail of evidence. 

    “And on top of this, the team have been super-helpful and a pleasure to work with!”

    We’re glad to have been of service. Thanks very much to Jack for talking us through the project. If you’d like to know more, visit the PfIE website, where you can also sign up to their newsletter to be kept informed.


    Image: Priscilla Du Preez

  2. Statement on the proposed ICO fine to PSNI

    The ICO have today announced that they intend to fine the Police Service of Northern Ireland (PSNI) for their accidental release of staff’s personal information in August 2023. This data was released in response to a Freedom of Information request made using WhatDoTheyKnow.

    mySociety is a charity; we run WhatDoTheyKnow as a vital tool to help anyone exercise their right to information held by public authorities. We understand the repercussions of a breach like this, which underlines that public authorities must handle personal information carefully. We welcome the ICO’s emphasis on the importance of robust release processes in ensuring that information that is important to the public interest can be released safely.

    We take the responsibilities that come with operating a large platform extremely seriously, especially around the personal data breaches that can occur when authorities’ release processes fail. Following this breach, we’ve undertaken a significant programme of technical and process work to play our part in reducing the risks of this kind of incident.

    We’ve developed a new piece of code which analyses spreadsheets as they arrive in response to FOI requests on WhatDoTheyKnow, and holds them for review if hidden data is detected. The deployment of this code has proven successful and we will continue to improve it. In its first three months, the spreadsheet analyser has screened 3,064 files and prevented the release of 21 spreadsheets confirmed to contain data breaches, plus 53 which were likely to contain data breaches (around 2% of the files screened in total).
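The decision step can be pictured with a minimal sketch. This is not mySociety’s actual analyser, which inspects real spreadsheet files; here a file’s properties are modelled as a plain dictionary, and the function name and the specific checks are illustrative assumptions:

```python
def screen_attachment(sheet_meta):
    """Return ('hold', reasons) if the sheet shows signs of hidden data,
    else ('release', [])."""
    reasons = []
    # Hidden sheets, rows and columns are classic sources of accidental release
    if sheet_meta.get("hidden_sheets"):
        reasons.append("hidden sheets present")
    if sheet_meta.get("hidden_rows") or sheet_meta.get("hidden_columns"):
        reasons.append("hidden rows or columns present")
    # Pivot caches can embed a copy of the underlying source data
    if sheet_meta.get("pivot_caches"):
        reasons.append("pivot cache may embed source data")
    return ("hold", reasons) if reasons else ("release", [])

decision, why = screen_attachment({"hidden_sheets": ["Staff"], "hidden_rows": []})
print(decision, why)
```

A real implementation would extract these properties from the spreadsheet file itself (for example, from the workbook and sheet metadata in an .xlsx archive) before applying rules like these.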

    In an ideal world, such measures would not be necessary; we continue to work with authorities making such releases to help them understand the reasons for data breaches, the potential severity of their impact, and how to avoid them.

    This blog post was updated at 10:04 on 23 May to correct the figures around the number of spreadsheets screened.


    Image: Pietro Jeng

  3. Off by one: How Parliament counts votes is out of date

    Last night there was a vote to allow MPs to be excluded from Parliament (after a risk assessment) if arrested on suspicion of a serious offence. This vote passed by a single vote.

    The problem is that, looking across several sources of voting information, the totals don’t agree. Ultimately the tellers’ count is authoritative, but the discrepancy reflects the complicated way that MPs vote.

    The result(?)

    Source | Described result | Count of names
    votes.parliament.uk | 170 Ayes, 169 Noes | 169 Ayes, 169 Noes
    hansard.parliament.uk | 170 Ayes, 169 Noes (teller result) | 169 Ayes, 168 Noes
    hansard.parliament.uk | 170 Ayes, 169 Noes (speech with teller result) | 169 Ayes, 168 Noes

    What’s going on here?

    In the voting lobby, there are two different systems going on to record votes:

    • An electronic pass-based voting system, run by the clerks, which feeds into votes.parliament.uk and Hansard.
    • A counting system run by the tellers: an MP for each side is in each lobby, and if they agree on the count, that’s the count used to make the decision.

    Meanwhile, at TheyWorkForYou, we use the tidied-up division names created by votes.parliament.uk, but the division lists from Hansard, adding up the names to get the number of people on each side.

    Votes.parliament.uk is quickest to publish who voted – this feeds into the Hansard list, but the two can get out of sync if one is updated and the other is not.

    In this case, Rebecca Harris is counted in votes.parliament.uk but not in Hansard. This could be for a few reasons: for instance, she may not have been able to use the pass system, and was recorded manually and added as a correction after the data had been fed into Hansard. We’ve queried this with her. In any case, what the tellers counted is the authoritative result for the vote. The tellers could also have been right, with someone who should have tapped in forgetting or being unable to do so.
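The cross-check described above comes down to comparing the name lists published by different systems. A hedged sketch, using invented MP names and simplified data rather than the real division lists:

```python
def compare_division(source_a, source_b):
    """Each source maps side -> set of MP names.
    Return, per side, the names that appear in only one source."""
    diffs = {}
    for side in ("aye", "no"):
        only_a = source_a[side] - source_b[side]
        only_b = source_b[side] - source_a[side]
        if only_a or only_b:
            diffs[side] = {"only_in_a": sorted(only_a),
                           "only_in_b": sorted(only_b)}
    return diffs

# Illustrative data: one name present in the first source but not the second
votes_site = {"aye": {"MP One", "MP Two", "MP Three"}, "no": {"MP Four"}}
hansard = {"aye": {"MP One", "MP Two"}, "no": {"MP Four"}}
print(compare_division(votes_site, hansard))
```

Running a comparison like this against each division is how a single missing or extra name, like the one above, can be surfaced automatically.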

    But if the votes.parliament.uk count was right, it would mean the tellers in the Aye lobby overcounted by one. That would make it a draw, and in a draw the Speaker casts a deciding vote against the motion (as there isn’t a majority for it). When it comes down to one vote, you want to have faith that the system got the right answer.

    Better ways are possible

    We think it should be easier for MPs to vote, and have previously recommended that:

    • The House of Commons should, in normal circumstances, defer votes to a standardised voting time (within ‘core hours’), where multiple votes are held in succession.
    • These votes should be held through a fast electronic means – whether through terminals, voting pass systems, or apps.
    • Current proxy voting schemes should be extended to allow personal discretion to designate a proxy – e.g. a set number of days a year when a proxy vote can be allocated, no questions asked.

    Electronic voting and a standardised voting time would bring back good practice from the devolved parliaments, and help MPs make better use of their time than standing in division lobbies. And as well as being slow, the current approach clearly raises questions about accuracy.

    How MPs vote has big impacts on how our country works – getting it right matters.

  4. Update: some APPGs are back

    WhoFundsThem is our new project looking to uncover the influence of money in politics. You can donate or volunteer to support this project.

    Last month, we asked “What happened to all the APPGs?” because between March and April over a third of All Party Parliamentary Groups were deregistered, from 722 down to just 445. This story was covered in the Byline Times and the Parliament Matters podcast.

    On Monday, we got a partial answer to our question. 

    The May register shows an increase of 90 groups – up to 535.

    We’ve crunched the numbers, and found that 86 of the 277 groups that were removed in April have been re-registered for the May edition. We can’t know for sure why this happened, but we know that Parliamentary authorities did an audit of compliance ahead of the April register, which might have contributed to lots of groups being removed. It’s possible that these groups have since passed the necessary requirements to be re-registered in time for the May edition.
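Number-crunching of this kind reduces to simple set arithmetic over the monthly registers. A sketch with invented group names and counts, purely for illustration:

```python
# Each monthly register is treated as a set of APPG names (invented here)
march = {"Africa", "Japan", "Sport", "Taxation", "Microplastics"}
april = {"Africa", "Japan"}                        # many groups deregistered
may = {"Africa", "Japan", "Sport", "Taxation"}     # some groups return

removed_in_april = march - april                   # deregistered groups
re_registered_in_may = removed_in_april & may      # of those, which came back
print(sorted(re_registered_in_may), len(re_registered_in_may))
```

With the real registers in place of these toy sets, the same two operations yield the 277 groups removed in April and the 86 of them back in the May edition.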

    Taking into account the last three registers, we found:

    Dive into the data yourself

    We’ve updated our public spreadsheet with the new register and an ‘All groups’ tab that shows which groups fall into the six categories above.

    What next?

    We’ve launched our WhoFundsThem project which is requesting information from all APPGs (yes, our job just got a bit bigger!).

    We need your help – please consider volunteering, or donating £10 to help make this work happen. 

    Photo by Zetong Li on Unsplash

  5. TICTeC schedule now online!

    Yes, it’s that marvellous time for the Civic Tech community: the full TICTeC schedule is now online and you can browse it to your heart’s content, picking which sessions you’ll attend — not always an easy decision when there’s so much to choose from!

    As usual, TICTeC promises access to civic tech around the world with insights you won’t get elsewhere, presented by a truly amazing roster of international speakers. This year we have a focus on threats to democracy and climate, and the tools that are working to counter them.

    You’ll find grassroots NGOs, making a difference through their on-the-ground technology; representatives of governments; tech giants; and of course the academic researchers that make sense of everything we do in the civic tech world.

    • Hear from Mevan Babakar, News and Information Credibility Lead at Google;
    • Learn how tech has shaped citizen-government communication from the Taiwan Ministry of Digital Affairs;
    • See what happens when you wake up and realise your civic tech project is now critical national infrastructure, with Alex Blandford of the University of Oxford.

    These are just a few of the 60+ sessions from an international range of perspectives that you can dip into across TICTeC’s two days. Which will you choose?

    Come along in person, or tune in from home

    This year, most of TICTeC’s sessions will be livestreamed, so you can tune in no matter where you are (the workshops won’t be broadcast, as they don’t lend themselves to online participation). If you’d like to attend virtually, you can book a ticket via Eventbrite for just £50.

    Or, if you’d prefer to join the conference in person, enjoying all that a real-life meet-up entails, with sessions interspersed with networking, nibbles, and socialising, make sure you snap up one of the limited slots. But hurry – TICTeC always sells out, and this year is looking like no exception.

    Register for TICTeC now.

  6. And we’re off: gathering information on financial interests and APPGs

    We’ve kickstarted the WhoFundsThem project, and now we have a (tight!) timeline of work

    WhoFundsThem is our new project looking to uncover the influence of money in politics. You can donate or volunteer to support this project.  

    On Friday, we sent our first batch of requests for information to 25 All Party Parliamentary Groups (APPGs) as part of our WhoFundsThem work. 

    This is a test batch to see how well the template we’ve made works as a method for getting information back from APPGs. The new rules require them to make quite a lot of different kinds of information available, and there are 445 APPGs — so we want to ask in a way that makes sense for them, and for us.

    We’re asking for this information because we think it’s important to have it openly available for the public benefit. There are loads of possible uses for it: for example, we’d like to improve the APPG membership information we include on the Local Intelligence Hub, but once the information is public, it will be available for all sorts of other projects and individuals to use.

    To select the lucky 25 APPGs who would make up our test batch, we took Parliament’s A-Z list of all of the APPGs, numbered them, and then randomly generated 25 numbers. The selected APPGs were:

    1. Africa
    2. Denmark
    3. Japan
    4. Poland
    5. South Africa
    6. Tibet
    7. Artificial Intelligence
    8. Arts and Heritage
    9. Biodiversity in the UK Overseas Territories and Crown Dependencies
    10. Children of Alcoholics
    11. Deafness
    12. Disability
    13. Ethnic Minority Business Owners
    14. First Do No Harm
    15. Future of Work
    16. Human-Relevant Science
    17. Internet, Communications and Technology
    18. Life Sciences
    19. Microplastics
    20. Packaging Manufacturing Industry
    21. Responsible Vaping
    22. SME (Small and Medium-sized Enterprises) House Builders
    23. Sport
    24. Taxation
    25. United Nations Global Goals for Sustainable Development
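The selection method described above can be sketched with Python’s standard library. The truncated list and a sample size of 3 are purely illustrative; the real draw was 25 groups from the full A-Z list of 445:

```python
import random

# A truncated, illustrative stand-in for Parliament's A-Z list of APPGs
appg_list = ["Africa", "Denmark", "Japan", "Poland", "South Africa",
             "Tibet", "Artificial Intelligence", "Arts and Heritage"]

random.seed(0)                        # fixed seed so the example is repeatable
chosen = random.sample(appg_list, 3)  # the real project drew 25 of 445
print(chosen)
```

`random.sample` draws without replacement, so it is equivalent to numbering the list and generating distinct random numbers, as described.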

    On Friday, we emailed these groups a copy of the template, and informed them that, as per the rules, they have 28 days to get back to us, making the deadline Friday 7 June 2024. After this deadline we’ll review the feedback and responses, make any adjustments necessary, and then email the template to all of the remaining 420 APPGs. This should give us responses from every APPG by the middle of July.

    Don’t forget, this is just one of the two parts of the WhoFundsThem project. While we’re waiting for APPG responses, we’ll spend the month of May recruiting volunteers, and then in June we’ll begin answering questions for the other stream of the project which looks at the Register of Members’ Financial Interests (RMFI). By mid-July, we’re hoping to have turned those answers into individual summaries for each MP. Then the right of reply process begins: MPs will have a month to respond to our summary of their financial interests.

    All being well, as we send off these summaries to MPs, we’ll be able to switch back to looking at APPGs, as the returns from the second batch should be back, ready for us to clean and analyse. By the end of August, we should have both clean APPG data and RMFI summaries with MP feedback. We’ll then spend some time auditing this data ready for publication in the autumn.

    Well, that’s the plan at least!

    If you’re interested in being one of the volunteers who will work on this exciting new project, you have until 28 May to fill in our short application form! On Tuesday evening (14th), we’re hosting a Q&A event to explain more about the project and answer any questions about volunteering. We know not everyone can give up their time, though, so if you want to support projects like these in another way, please consider financially supporting us.

    Want to find out more about APPGs? I wrote a blog post last month explaining what APPGs are, how the rules changed, and the impact that change had. 

    As ever, if you’re interested in the work we do, make sure you’re signed up to our newsletter. Thanks!

  7. TICTeC keynote speaker announcement: Nick Mabey OBE

    Hot on the heels of our last big announcement, we’re very happy to confirm our second keynote for TICTeC, The Impacts of Civic Technology conference 2024: Nick Mabey OBE.

    If you’d like to hear from one of the big players, really making a difference to the UK’s climate change response, you’ll want to make sure you’re at TICTeC this year. 

    Nick is a founder of E3G (Third Generation Environmentalism), an independent climate change think tank with a goal to translate climate politics, economics and policies into action — and is now its co-CEO.

    He has previously worked in the UK Prime Minister’s Strategy Unit, UK Foreign Office, WWF-UK, London Business School and the UK electricity industry. As an academic he was lead author of Argument in the Greenhouse, examining the economics of climate change. 

    He also founded London Climate Action Week, one of the world’s largest climate festivals, which takes place on 22-30 June — so if you’re in London for TICTeC and you have an interest in climate, it might be worth sticking around for that! 

    Nick will open the second day of proceedings at TICTeC, setting the scene for presentations and workshop sessions that strive to examine the central question: What is needed to make civic tech tackling problems around climate change more successful and impactful on a global scale?

    Few people are better equipped to bring such a broad spectrum of knowledge and experience to this complex issue. If you’d like to tap into some of that, then make sure to snap up your tickets to TICTeC.


    The TICTeC 2024 schedule will be published very soon, so watch this space.

  8. We’re putting more ‘local’ into the Local Intelligence Hub

    Tl;dr: We’ve added lots of local council data to the Local Intelligence Hub.

    In February, we launched the Local Intelligence Hub, and today we’ve released a huge new update. 

    We designed the Local Intelligence Hub — in collaboration with The Climate Coalition and supported by Green Alliance — to provide all the data you need, either about one constituency or across the whole country, on issues around climate. It helps you gain a deep understanding of public opinion, demographics, political considerations, and much, much more. In short, it’s an extremely powerful tool, free to use, and invaluable for anyone pushing for better climate action.  

    At launch, we divided the data by UK Parliamentary constituency — but with this huge new update, you can now also explore data at the local council level.

    As ever, there are several different ways to view this data:

    • by individual authority, so you can deep dive into your local area
    • as a table, so you can compare councils by metrics that matter to you
    • plotted onto a map, so you can see where to find hot- and cold-spots of action

    And it can all be downloaded as a spreadsheet for use on your own desktop.

    What kind of data are we talking about?

    We’re pulling together data from multiple different sources. What does it all have in common? We reckon that it provides new insights for climate campaigners, researchers, journalists and organisations — especially when it’s combined in new ways, as Local Intelligence Hub allows you to do quickly and simply.

    Sources include national polling data, information from our services CAPE and Scorecards, and other Climate Coalition member organisations, like the National Trust and the RSPB. 

    And we’re always looking for more data, so do get in touch if you know of a useful source we haven’t yet included! 

    What can I do with it?

    You will know best how this rich data could inform your work, but here are a few ideas to get you started.

    1. Build a profile of your local council

    Dip into the local council page and see what data awaits you! Here’s an example of the top-level stats you can find for Leeds City Council:

    • The area has a strong mandate for climate action. MRP polling suggests that 88% of Leeds City residents support onshore wind, compared to an 83.5% national average, and just 10% oppose net zero, compared to a 12% national average.
    • Leeds City Council is doing better than most councils, but could be doing more. It scored 53% on the Climate Action Scorecards, gaining its highest scores in Planning and Land Use, but with the biggest room for improvement on Transport. 
    • Emissions are huge, but so is the population. Leeds City Council serves 798,786 residents compared to the average of 307,712. According to BEIS data, Leeds City Council has influence over 2,822 kilotons of CO2 emissions, which is more than twice the national average of 1,168.3.
    • There’s an active climate movement. In Leeds city there were more Great Big Green Week events than average in both 2022 and 2023.


    2. Design a national campaign strategy 

    If you’re a campaigning organisation looking to work out where and how to allocate resources, the table-builder and CSV download could form an essential part of your planning process. Here we’ve generated a list of the single-tier councils with Net Zero target dates that fall within the coming decade, sorted by their Action Scorecards overall score, alongside useful data about public opinion and emissions.

    Council Name | Action Scorecards overall score | Net Zero target date | Population | Oppose Net Zero % | Total emissions (ktCO2) | IMD | Trussell Trust foodbanks | Support onshore wind %
    Wolverhampton City Council | 21 | 2028 | 264407 | 12 | 854 | 1 | 0 | 82.0
    Middlesbrough Council | 21 | 2029 | 141285 | 12 | 558 | 1 | 7 | 78.0
    Bromley Council | 26 | 2027 | 332752 | 12 | 938 | 5 | 4 | 88.1
    Dumfries and Galloway Council | 28 | 2025 | 148290 | 15 | 864 | 3 | 3 | 80.0
    Oldham Borough Council | 32 | 2025 | 237628 | 12 | 690 | 1 | 2 | 80.1
    Cheshire East Council | 33 | 2025 | 386667 | 13 | 1860 | 4 | 2 | 87.9
    Highland Council | 35 | 2025 | 235430 | 13 | 1268 | 4 | 7 | 82.6
    Nottingham City Council | 42 | 2028 | 337098 | 9 | 1038 | 1 | 10 | 78.0
    Haringey Borough Council | 52 | 2027 | 266357 | 7 | 617 | 2 | 1 | 79.3
    Tower Hamlets Borough Council | 53 | 2025 | 331969 | 6 | 1019 | 2 | 0 | 79.8
    Bristol City Council | 55 | 2025 | 465866 | 8 | 1295 | 2 | 13 | 86.5
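Once downloaded as a CSV, a shortlist like the one above can be rebuilt with a few lines of standard-library Python. The column names and sample rows below are assumptions for illustration, not the Hub’s actual export format:

```python
import csv
import io

# A small invented sample standing in for a Local Intelligence Hub CSV download
raw = """council,score,net_zero_target
Wolverhampton City Council,21,2028
Example County Council,40,2045
Bristol City Council,55,2025
"""

rows = list(csv.DictReader(io.StringIO(raw)))

# Keep councils targeting net zero within the coming decade (relative to 2024),
# lowest Action Scorecards score first
shortlist = sorted(
    (r for r in rows if int(r["net_zero_target"]) <= 2034),
    key=lambda r: int(r["score"]),
)
print([r["council"] for r in shortlist])
```

The same filter-then-sort pattern extends to any of the columns in the table, such as emissions or polling figures.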

    3. Visualise your goals

    Local Intelligence Hub helps you zero in on the areas of the country that meet specific criteria. For example, where are the district councils who have declared a climate emergency but haven’t published a climate action plan? Here’s a map that shows you — just one of hundreds of maps that you can generate with a few clicks, and no expertise required:



    What to do with all this lovely local data?

    Thanks to this update, it’s now easier than ever to push for local climate action. With these rich new insights, you now have a number of talking points with which to engage your local councillors or council climate officers — and a wealth of facts and figures to back them up.

    What next?

    We need you to use the Hub and tell us what works, and what doesn’t! Give us your feedback — and if you’d like to know whenever we add something new, sign up to updates and we’ll let you know when there’s new data to play with.


    Photo by Daniil Korbut on Unsplash

  9. Improving TheyWorkForYou’s voting summaries

    If you value the work mySociety and TheyWorkForYou do, please consider whether you can make a donation.

    We have a good track record of making Parliament more open, and we provide essential tools to civil society and small charities; with our platform, a little support can go a long way. If you would like to make a larger donation to support specific work, or to match-fund other donations, please get in touch.

    Our MPs in Parliament have many roles, but one of the most important is that they make decisions on the laws that govern us, and these decisions can affect every aspect of how we live our lives. 

    TheyWorkForYou’s voting record summaries are part of a number of different arguments about what the role of MPs is, and how Parliament should work.

    As well as listing individual votes in Parliament, our voting summaries give an overview of how MPs have voted on policies that come up in multiple votes. We strongly stand by the principle of our summaries, but don’t think there’s only one way of doing it. Using a grant from the Newby Trust, we’ve been reviewing our methods and refining our approaches to voting records. 

    The key headline is that we’re going to sharpen the focus in our approach. The main changes are:

    We have written a longer document explaining how these changes achieve our goals.

    This change is being applied alongside a backlog of new policy lines that we’ve been reviewing with our new criteria for inclusion. While these may be big shifts in principle, in practice most existing summaries stay exactly the same. It’s a progression and simplification rather than a revolution.

    To see our voting summaries for your MP, search for your postcode on TheyWorkForYou and click ‘Voting Summaries’.

    What we want to achieve with our summaries

    In thinking about our voting summaries, we wanted to clearly define what we’re trying to accomplish. This has led to two headline goals:

    We want to present clear and accurate summaries of how individual MPs have voted, for use by the public.

    • As a point of principle, it should be possible and straightforward to find out how MPs have acted on behalf of their constituents. 
    • The top-line display of information should be a good reflection of the data that was used to create it – balancing clarity and accuracy.  We should provide options for people to learn or explore more, with the expectation that most won’t, and so the clarity of the summary matters.
    • While we aspire to produce information that is also of use to people with a professional interest in Parliament, this need might be better met through other tools or summaries. For instance, while it is possible to compare different MPs through voting records, it is not the main purpose of these summaries.  

    In line with our general approach, we want to align with and amplify citizen perspectives of how MPs should work, as voiced by Citizens’ Assemblies (in particular the Democracy in the UK Citizens Assembly) and polling.

    • Historically, we’ve seen that making the actions of MPs more visible changes their behaviour.
    • We need to be conscious of the likely effects of our summaries, and ensure they reflect our values, democratic principles and approach. We want to anchor our approach in wider ideas of how our democracy works rather than our own opinions.
    • We also need to be aware of when pressure on individual MPs is not the best way to achieve systemic change. As such, we need to consider where our work reinforces, rather than changes, parliamentary systems that are hostile to MPs from groups historically excluded from Parliament (e.g. women, ethnic minorities, disabled MPs).

    A longer document explaining how these changes achieve these goals can be read here.

    The impact of this change

    Most of the top-level summaries on the site (73%) are completely unaffected by these changes. 82% of MP ‘scores’ are either the same, or have a stronger/weaker version of the same alignment (i.e. the adjustment has not affected our assessment of whether the MP is for or against a policy). About 14% of connections between MPs and policies are removed: a combination of removing seven policy lines that were made up entirely of votes that did not directly use Parliamentary power, and confining longer-running policies to a narrower time frame. The remaining changes are “a mix of for and against” assignments becoming clearer (or the reverse), and a small group (about 120 out of 80,000) where the direction of the score has changed (i.e. where someone was seen as voting for and is now seen as voting against – this is mostly concentrated in two policies). You can read more about this in our longer summary.

    This is good because we don’t generally want these summaries to be too sensitive to the exact formula used: the kind of broad points we’re making should be reachable no matter which method is applied. Ultimately only a small group of votes have been removed from policies, and the positions we were displaying before were mostly driven by votes that already passed the “use of powers” criteria. The goal of this process is to simplify how we work and enable clearer explanations of what we’re doing – but the general end product isn’t massively changed by adopting these new rules.

    A process, not a destination

    This isn’t where we stop. This update is a step in the journey.

    There is a growing clarity issue: for long-serving MPs there are now quite a lot of policies, and part of our work creating summaries should be helping people find the relevant information they’re looking for. 

    There is also a pending question about presenting a retrospective on the current Parliament during the next election. With the technical work we’ve done, it is now much easier to explore alternate approaches to displaying this data. 

    We are considering how we can best do this, and how we work with others to ensure we are capturing the important issues of the last Parliament. 

    Making other tools available 

    One common complaint about voting summaries is that they do not provide an easy way of drawing out small differences in how two MPs voted. This is true – we might say two MPs voted a mixture of for and against a policy, when in practice they took opposite positions on different votes. 

    In our voting summaries we’ve made the decision to focus on providing information that makes sense for a constituent looking at their MP – we produce better summaries by focusing on the specific kinds of users we want the site to work for. But for our own work, as well as to support others, we want to provide a wider range of tools and information for both citizens and specialists. 

    Our previous approach to voting was technically intertwined with the Public Whip (originally a companion project to TheyWorkForYou, but not run by mySociety). This limited our ability to take big swings in our approach: while we indirectly maintain it through the data feed, we can’t change the basic functioning of the Public Whip.  

    To implement the changes described above, we have internally created a Public Whip replacement (TheyWorkForYou Votes) that we’re using to update voting records. It also provides new analysis tools that give us a quick understanding of the parliamentary dynamics of a vote and a basic analysis of the motion. In the next year, we want to talk to more people who want better tools for working with raw voting information, to help shape this tool for a public release. 

    Supporting our work

    In our voting records work we have an approach that has public support, and that we think serves an important purpose. But we don’t think this is the only or best way of creating voting summaries: we want to be able to react to how Parliament is changing, and to keep improving our coverage and approach. We also want to encourage better transparency and public understanding at the source, through improving how Parliament works.

    If you value the work we do, please consider whether you can support us financially. We have a good track record, and with our platform a little support can go a long way.  If you would like to make a larger donation to support specific work, or to match-fund other donations, please get in touch. 

    In the next few weeks we will be announcing a new project involving volunteers and the register of members’ interests. If you’re interested in hearing more about that, please sign up to our volunteer mailing list.

    Header image: Photo by Aaron Du on Unsplash

  10. The Council Climate Scorecards project is having international impact

    Canada differs from the UK in many ways: obviously it’s vastly bigger, extending across many more latitudes; its climate, nature and terrains vary hugely; its cities are more dispersed and diverse — and accordingly, the challenges the two countries face around tackling the climate emergency are different, too. 

    But there are some significant ways in which we are alike, too, as we learned when we chatted with Hannah Muhajarine, National Campaign Manager at the Climate Reality Project Canada.

    Climate Reality, like mySociety and Climate Emergency UK, has identified local councils — or municipal governments, as they’re called in Canada — as crucial contributors to our respective countries’ decarbonisation. Both organisations run projects that monitor the climate action of these authorities, helping citizens to keep an eye on their progress.

    “Scorecards helped us reimagine both our content and project design.”

    Hannah first heard about the Council Climate Action Scorecards on the Local Zero podcast, and immediately saw the parallels between our two projects. Not only that — she understood that Climate Reality could learn from our project, adopting some of the Scorecards’ approaches. 

    We were keen to hear how the Scorecards have encouraged Climate Reality to enrich and broaden their own work in monitoring climate action at the local level. So, first of all, what is Climate Reality?

    “We’re the Canadian branch of The Climate Reality Project, which is an international climate organisation,” explained Hannah. “We work by training citizens in climate advocacy, education, and communication.  

    “This includes supporting a network of Community Climate Hubs across Canada, which mobilise citizens to get involved in local climate advocacy targeting city and town councils. 

    “These grassroots groups get involved in a variety of projects, including making sure that climate is a priority during municipal elections and budget-setting; participating in public consultations relating to climate; campaigning to get their council to declare a climate emergency; organising campaigns; hosting community-facing public events on climate, and more.”

    All good stuff, but a lot more hands-on advocacy than we’ve been doing over here. So, where are the links with the Scorecards project?

    “To support the Hubs and their local advocacy work, since 2018 Climate Reality has led a project called the National Climate League (NCL), where we trained volunteers to collect data every year on a set of climate-related indicators measuring climate progress at the local level across Canada”.

    Ah yes, the overlaps certainly begin to become obvious — in fact, that’s exactly what Climate Emergency UK did for the UK Scorecards. So, what sort of data were Climate Reality collecting?

    “For example, the number of Passive Certified buildings within municipal limits, the number of transit trips per year, household waste per year… we also tracked a smaller number of policies, like climate plans and climate targets, adaptation plans, green building policies and so on.”

    And, like the Scorecards, all this information was a useful way of letting the public know how their local government was getting on: “Climate Reality staff would pull together the volunteer-collected data each year and publish it in a report, with data visualisations comparing municipalities across the range of indicators, the results of our policy scan, and case studies of top-performing municipalities.”

    “Learning about the Scorecards provided a great inspiration, and a specific model for us to work towards.”

    Hannah goes on, “In the spring of 2023, we’d just launched the fifth edition. We were interested in re-evaluating the design of the project, and especially expanding the policy aspect.”

    This was great timing: “It was around then that I heard about the Scorecards on an episode of the Local Zero podcast. I was really excited, since the project had many parallels to ours, but featured more detailed and extensive criteria — plus it was a bit larger scale in terms of volunteer participants and the number of councils covered, and it used the scoring method to compare councils with one another, which we hadn’t previously considered.”

    What great synchronicity. So, what changes did Climate Reality make, inspired by the Scorecards?

    Hannah explains, “Scorecards helped us reimagine both our content and project design. For example, we introduced more extensive and in-depth training for volunteers. Inspired by the way that Climate Emergency UK work, we identified volunteers with key skills, and harnessed them to help with data verification. 

    “You also influenced us to add questions around retrofit programmes, support for low-income homeowners and rental housing; renewable energy targets; and community climate action funding.”

    And so, what were the outcomes of these changes?

    “We were able to recruit 51 volunteers to participate in data collection this year, and collect data for 53 municipalities across Canada, which is a great expansion of the project compared to last year.

    “Plus, the new version of the NCL includes 21 policy questions, each with several sub-questions. So we’re now tracking things like climate plans, community greenhouse gas reduction targets, citizens’ climate advisory committees, mode-share targets, curbside composting programmes, and more. 

    “We’re hoping the new Scorecards-informed version of the NCL will provide a great boost in terms of the data and information available to our network of climate advocates, and give them a new tool they can use to engage in local climate advocacy, targeting city councils, towns, and even other jurisdictions perhaps — as well as communicating with their community about how their city/town compares to others on climate. 

    “My hope is that expanding and strengthening the policy element of the NCL — which the Scorecards helped us do — will really help boost local advocates’ policy literacy, help them identify specific policies, targets and programmes that other municipalities have implemented and which they might like to build a campaign around and encourage their municipality to adopt. They’ll be able to evaluate their council’s climate plans and targets against what has actually been implemented and what the outcomes have been — in other words, they’ll be empowered to draw the connection between policy and action, just as the Scorecards have done.”

    Climate Reality won’t actually be scoring the municipalities this year, though Hannah says it’s a consideration for future iterations. “As you all know, there are challenges with designing an objective, properly weighted scoring system, so we decided we didn’t have the capacity to go all in on trying to design something this year, but it would be something we’d like to do in the future. Obviously it is a really good method for translating a lot of detailed, diverse policy information into something that can provide an at-a-glance comparison.

    “Overall, learning about the Scorecards and connecting with Climate Emergency UK provided a great inspiration, and a specific model for us to work towards, as well as really helpful advice on specific shared challenges.”

    We are very gratified to hear that. It is always wonderful to connect with other projects around the world that are working towards similar aims by similar means, and to exchange ideas. Thanks very much to Hannah for telling us all about it.

    Image: Will Clewis