This recording is from a TICTeC gathering on the latest techniques to help civic and pro-democracy tech projects protect their services, data and users.
Across the world, websites and apps that promote democratic transparency and citizen participation are experiencing more sophisticated cyber attacks.
We hear some of the latest techniques to help civic and pro-democracy tech projects protect their services, data and users, covering specific internet freedom tools, how they’re being tested and integrated amongst pro-democracy actors globally, as well as the challenges of sustaining protection.
Speakers
- Patricia Musomba, The Engine Room. Patricia is passionate about empowering at-risk communities to enhance their digital resilience through capacity building and tailored support. Patricia talks about The Engine Room’s Cybersecurity Assessment Tool (CAT), which is designed to measure the maturity, resiliency, and strength of an organisation’s cybersecurity efforts.
- Hui Hui Ooi, Advisor, Technology & Democracy at the International Republican Institute (IRI). Hui Hui shares lessons from IRI’s internet freedom user testing programme and how they could be integrated into the civic tech community. She also emphasises the importance of internet freedom tool developers working hand in hand with users at risk to conduct tool testing to ensure tool adoption and improve trust.
- Jocelyn Woolbright, Program Manager at Cloudflare Impact, and part of Cloudflare’s Project Galileo team. Since 2014, Project Galileo has been providing free cybersecurity protection to more than 3,000 at-risk websites from around the world, defending them against attacks such as DDoS, web exploits, phishing, and automated bot traffic. Jocelyn discusses the full range of support provided and explains exactly how civic tech organisations can get involved and apply for this critical protection.
Links
- The Engine Room’s Cyber CAT project
- Cloudflare’s Project Galileo
- TICTeC’s Democratic Transparency community of practice
- Find our podcasts useful? Please donate!
Transcript
0:04 Gemma Moulder Hello everybody. I’m Gemma. I’m mySociety’s Events and Engagement Manager. And thank you for joining us today for this TICTeC community gathering. And thank you to NED for supporting TICTeC this year.
0:18 So little reminder, TICTeC stands for The Impacts of Civic Technology. TICTeC started life as a conference 10 years ago, back in 2015, but since 2020 we’ve also been running year-round activities between conferences, to connect people building, using and researching tech to strengthen democracy and civic power, so we can all learn from each other and boost our collective impact.
0:43 We’re currently fundraising for future TICTeC activities. So do get in touch if you’d like to support these in any way, there’s an email address there on the slide, and do get in touch with us anytime about ideas of the type of content that you’d like to see covered at our TICTeC events and activities.
1:00 The topic today is cyber security. Across the world, websites and apps that promote democratic transparency and citizen participation are experiencing more sophisticated cyber attacks, including ours at mySociety.
1:14 So therefore it’s vital, more vital than ever for civic and pro democracy projects to understand the latest techniques to protect our services, users and data and evaluate our current practices.
1:25 So to help us do that, today, we’re delighted to be joined by three fantastic speakers. First, we’ve got Patricia Musomba. She’s the Associate for Engagement and Support in Sub Saharan Africa at the Engine Room.
1:40 We’re also joined by Hui Hui Ooi from the International Republican Institute, and also Jocelyn Woolbright, who’s a Program Manager at Cloudflare Impact.
1:51 We’ll hear from our speakers in a moment. They’ll each have 15 minutes to talk to us, and then we’ll have time for questions and discussion afterwards. So yeah, firstly, we hear from Patricia Musomba, who will talk about the Engine Room’s work empowering at-risk communities to enhance their digital resilience through capacity building and tailored support.
2:13 Patricia Musomba: All right, hello, everyone. My name is Patricia Musomba, and I’m joining you from Nairobi, Kenya, and I’m really excited to be here today to share a little bit about our work at the Engine Room and to share a little bit about the Cybersecurity Assessment Tool, which is one of the projects that we are currently maintaining.
2:33 I work at the Engine Room as an Associate for Engagement and Support, specifically for Sub Saharan Africa, but also across the globe, and I mainly work to support organisations and individuals to use technology and data in safe and responsible ways. And my expertise mainly lies around digital security and digital resilience.
2:56 I’m excited to answer any questions you may have around those areas and around the work that we do.
3:04 I’ll start by briefly introducing what we do at the Engine Room. And our mission is to strengthen the fight for social justice, and we do this by supporting civil society organisations and individuals to use technology and data in strategic, effective and responsible ways.
3:24 Our work spans research, especially at the intersection of technology and society, and also direct support, where we have various support platforms, for example light touch support, where we provide tailored recommendations based on community needs.
3:44 But for today, I will mainly focus on the Cybersecurity Assessment Tool, which is a project that we’ve been maintaining for the last year, and we will be maintaining this particular tool for the next two years as well.
4:01 Through our work, we’ve worked with community partners across the global majority for the last decade, and through our work with partners we are seeing, especially recently, that the civic space is shrinking.
4:21 We are seeing threats of intimidation, especially of activists and human rights defenders, which in most cases, has a chilling effect and leads to limited participation.
4:35 I think all of us, this year, have seen the rising funding challenges across the sector, and also rising digital restrictions. We’ve seen a lot of harmful legislation being used against civil society organisations.
4:54 We’ve seen interruptions of digital spaces, for example through internet shutdowns, like what we are seeing in Tanzania recently through the elections, which is something that has become a lot more common.
5:11 Recently, we’ve also seen an increased number of threats targeting civil society organisations. I think I’ve seen a few people share in the chat around how this is also a concern that they’ve seen, and they’ve seen a few more civil society organisations and nonprofits being targeted, especially around their domains and even their social media accounts.
5:36 So some of the tactics we are seeing include phishing to compromise different accounts. We are seeing a rise in spyware cases. I think every quarter we see a new report around spyware being used against civil society organisations. And this is definitely something that we have to defend against.
6:00 All these different things have definitely been exacerbated by a constraint in terms of resources. So for example, when we look at capacity issues, when we look at infrastructure issues, all these different things definitely increase the risk for organisations.
6:23 I don’t know whether these are patterns that you’re seeing within your community or within your network, or if there’s a pattern that I’ve not mentioned here; please feel free to add it in the chat.
6:37 So to continue: what can you do? I know I’ve mentioned all these different things that are happening, and it can make us feel demoralised. But I do like to emphasise agency, and focusing on the steps, however small, that you can take to prepare for and respond to these different threats. Whatever little steps you can begin taking to prepare yourself against some of these patterns that I’ve mentioned will go a long way in protecting yourself, protecting the work that you do, and also the community members that you have.
7:22 So one of the things that organisations can do is improve and focus on their cybersecurity and make it a part of the day to day.
7:33 We like to frame cybersecurity as a form of resistance, because that framing puts the agency on us: there is something we can do to resist these different threats that I’ve mentioned, by taking steps to protect yourself and to protect the work that you do.
8:02 It is also a form of care. It is a form of care for yourself. For example, I think someone mentioned in the chat that one of the things that they worry about is, you know, waking up in the morning and finding someone was trying to get into your account, and that can elicit emotional reactions.
8:21 So a lot of the time, our digital lives and our offline lives are starting to collide, in that our digital lives are starting to affect our offline lives as well. And so protecting ourselves digitally also means that we are collectively caring for ourselves, and also caring for the people that we work with and the community partners that we work with as well.
8:47 And so when I speak about putting cybersecurity into your strategy, I mean taking steps and introducing cybersecurity into your way of working, into the day-to-day activities that you have, putting it in place as a mindset as you’re doing different activities, asking yourself: how can I do this more securely?
9:13 How can I do this more safely? How can I protect this? Are there steps I can take to do this a little bit more safely? So adding those different questions is definitely what we mean by adding it into your strategy and into your ways of working.
9:32 So by building a cybersecurity culture where everyone feels a part of it, and it’s not just the people who are in charge of it, or the people in charge of technology who care about it, but all of us collectively play a part in it.
9:49 The other thing is that we do acknowledge that it can be difficult to start, because of the competing priorities we have; we have a lot of programmatic work to do. So having tools and resources that can help you in this journey is very important, hence why we are introducing this tool to you: the Cybersecurity Assessment Tool.
10:10 It is a tool that was created with civil society organisations, nonprofits and non-technical people in mind.
10:20 And so it’s a very simple tool. It’s a very simple tool where you answer questions about your current practices, you answer questions about your organisation and what you do, and then based on the answers that you provide, then it gives you recommendations on how to improve on these different areas.
10:44 So it tests you on four key areas, and that is operational security – that is how you work on a day to day basis.
10:53 It also asks you questions around risks such as doxing and hacking and harassment and identity theft, things that civil society organisations are faced with every single day, and it also has questions around device security and account security as well.
11:11 So how do you currently protect the devices that you use, and how do you protect the accounts that you have as an organisation? I have talked about it, but I also want to show you what this particular tool looks like.
11:26 So I will add the link of this tool to the chat if I can find it, let me see. I have added that link to the tool. I think someone else already added it as well. Thank you. Thank you for doing that.
11:51 So once you visit this particular link, you will be able to click on ‘Start an assessment’, and this is where it takes you.
12:06 It is a questionnaire where you get to answer all these different questions about the way you work currently, not as an ideal of what you would like to be seeing, but what are you currently doing within the organisation.
12:19 And so you bring together the entire team, if possible, to fill in this questionnaire together, because, again, this is a part of building that culture. So we involve everyone; we involve people within this particular space.
12:35 So we bring in people, we answer the different questions around the areas that I mentioned about risk, different risks that you may be facing, you also answer questions about operational security.
12:47 So right here, if you click on understanding risk, you can see that you’ll get questions about risk. If you go to operational security, you’ll see different areas around how you operate within the organisation.
13:04 And lastly, when you go to the device and account security section, you’ll also get questions around how you collaborate and how you install different software within your organisation. And so all this will collectively be used to determine the recommendations that you’re given.
13:24 So it’s a tailored tool in that you will not get the same recommendations as someone else, because you might be on a different part of the journey. You might have already have implemented some things, but other people haven’t. And so depending on your answers, you will get recommendations that fit that specific profile that you have shared with the tool.
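[Editor’s note: the mechanism Patricia describes, answers about current practice mapped to per-section scores and tailored recommendations, can be sketched in miniature. This is purely illustrative; it is not the CAT implementation, and all section names, score thresholds and recommendation texts below are invented.]

```python
# Hypothetical sketch of a questionnaire-based assessment like CAT:
# each section's answers (scored 0-3 per question) are averaged into a
# maturity band, and each band maps to a tailored recommendation.

SECTIONS = ("understanding_risk", "operational_security", "device_account_security")

# Invented recommendations keyed by (section, maturity band).
RECOMMENDATIONS = {
    ("understanding_risk", "low"): "Run a team workshop to map likely threats such as phishing and doxing.",
    ("understanding_risk", "ok"): "Revisit your threat model after any incident.",
    ("operational_security", "low"): "Adopt a written policy for handling credentials and sensitive data.",
    ("operational_security", "ok"): "Review your operational security policy annually.",
    ("device_account_security", "low"): "Enable two-factor authentication on all organisational accounts.",
    ("device_account_security", "ok"): "Audit account access whenever staff roles change.",
}

def assess(answers: dict[str, list[int]], threshold: float = 2.0) -> dict[str, dict]:
    """Score each section and attach the recommendation for its band."""
    report = {}
    for section in SECTIONS:
        scores = answers.get(section, [])
        avg = sum(scores) / len(scores) if scores else 0.0
        band = "low" if avg < threshold else "ok"
        report[section] = {
            "score": round(avg, 2),
            "band": band,
            "recommendation": RECOMMENDATIONS[(section, band)],
        }
    return report

# Two organisations with different answers get different recommendations,
# which is the "tailored" property described above.
report = assess({
    "understanding_risk": [1, 2, 0],
    "operational_security": [3, 2, 3],
    "device_account_security": [1, 1, 2],
})
```

Because the recommendation depends only on each organisation’s own answers, two teams at different points in the journey naturally receive different roadmaps.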
13:50 We do know that sometimes you’re not able to commit to it for an extended period of time, because it takes one to two hours to fill in the survey. And so we do have a save-and-resume feature: you can answer a specific section, and then later on come back to it, so you can take as much time as you need.
14:21 We do understand that this does take time, and security is a journey, and so that’s why the tool also provides a way of coming back later. It’s also available in six languages: English, Arabic, Indonesian, Spanish, Portuguese and Mandarin.
14:50 If there’s a language you would like to see on the tool, you can also share that with us; we’re still hoping to increase the accessibility of the tool, and we’ll look into expanding the number of languages that are available.
15:10 And once you’ve completed the entire survey, you will get recommendations such as these. So I did a mock assessment this morning, just preparing for this call, and I filled it in using a fictitious organisation, and it gave me these recommendations based on my responses.
15:31 So as you can see, it gives you a brief summary of how you performed in the different sections, but then it gives you specific information about each of the sections, so it will tell you what you scored and whether that is the recommended security level or not.
15:51 And it also gives you the recommendation and a roadmap on how to improve on the different areas. It gives you examples.
16:01 It gives you links to different tools that you can use to implement anything that is mentioned within the tool. And so it is a really good starting point for how to begin your cybersecurity journey.
16:15 One of the things that we are very keen on is meeting people where they are. And so you might have just started the journey. You might not have any formalised practices yet.
16:30 And so the CAT tool accounts for that. In a lot of the questions, you will see an answer like “we don’t know”. Why? Because sometimes you’re aware and sometimes you’re not aware of what you should be doing.
16:49 You might have recently faced an attack, or people you know might have faced an attack, and you have similar profiles, so you’re worried about it.
16:59 CAT also takes that into account, in that there are questions around whether you faced cyber attacks before and how you responded to them, so that, again, it can guide you in the process.
17:12 Lastly, you might already be on the journey. You might already have implemented a few things, but you’re curious around, how are you doing? So this is a tool that can also be used to establish a baseline of where you are so far.
17:27 So we have also created spaces to support you in this particular journey, because we understand that it is a complicated journey. It is a journey that needs support.
17:37 And so we’ve created support spaces to be able to support you in using the CAT tool, but also support you in the way that you use technology and data within your work.
17:49 And so the two support spaces that we have are light touch support and Catio spaces. Light touch support I mentioned at the beginning: this is a pro bono space that we have where you can book time with our team and ask questions you may have about cybersecurity.
18:08 It could be about how you’re using data; it could be about resources, or even networks that you would like to be connected to. And from that conversation, we are able to provide you with recommendations, or even connect you with different people.
18:26 And lastly, we have the Catio spaces, which is a friendly space that we wanted to create to make cybersecurity approachable, to make it friendly as much as possible. And so Catio space is our space that is dedicated to CAT itself.
18:42 And so if you have any questions around how to use CAT within your organisation or within your collective, this is the space that you can reach out to us through.
18:55 I will post these links to the chat as soon as I am done. And so if you need any support, or you have questions about the tool that I’ve just introduced or how you can use it within your organisations, you can reach out to us through cybercat@theengineroom.org and we will be able to provide you with tailored guidance on how to use the CAT tool, but also generally on how to approach cybersecurity within your spaces. So thank you, Gemma. I will hand it back over to you.
19:31 Gemma: Thank you so much, Patricia. Great work, and a great tool that will hopefully be useful to the people here. And that’s only a little bit of what the Engine Room does; they do loads of awesome stuff. OK, right, we’ll pass on. So as I said, if you’ve got any questions for Patricia, do put them in the chat, and we should get time at the end to ask those.
19:50 But I’m going to pass over to Hui Hui, who will share lessons from IRI’s Internet Freedom user testing program.
19:58 Hui Hui Ooi: Thank you all for joining us today. OK, for those that don’t know, IRI is a global international development organisation that focuses on promoting and advancing democracy.
20:11 We are a nonpartisan, non governmental institute, and we support civic groups, journalists, national and local governments and also many other types of democratic actors globally.
20:22 We work in the Americas, Africa, Asia, Europe and the Middle East. We just don’t do any work in the US.
20:28 And I am the advisor for technology and democracy. And my portfolio basically covers promoting digital democracy, civic and government technology, and also internet freedom globally.
20:41 My presentation today will focus on IRI’s efforts to mainstream internet freedom through user testing.
20:49 So just very quickly, the technology and democracy programming at IRI works on two fronts. First of all it is defending against digital threats, and then the second portion is more on advancing a positive vision for digitally empowered democracy.
21:04 So, you know, we do work on countering digital repression, helping our partners build resilience to online threats, while also recognising that it’s equally important to leverage the positive sides of technology to continue to advance our partners’ work.
21:21 We have four pillars of work here. The two main ones of importance today are the second and the third, which are defending internet freedom and also strengthening cybersecurity.
21:31 So why does that matter, and how is it tied to cybersecurity, right?
21:36 So internet freedom strengthens cybersecurity, because an open, rights-respecting digital environment encourages all of the above: transparency, accountability, and the free flow of information.
21:48 When a lot of our partners or users can access uncensored security tools, share their vulnerabilities, report attacks without fear, and also being able to communicate securely, they’re able to build collective resilience against cyber risk.
22:02 Obviously, globally, we are seeing an alarming ongoing trend where internet shutdowns have been used as a tool of control, often during politically sensitive moments like elections or protests, right?
22:14 And then this isn’t just happening in authoritarian states. We are seeing just an overall democratic backsliding, you know, meaning that these tools are spreading to countries that once protected digital rights.
22:25 Why does this matter? To IRI, internet freedom really isn’t just a side issue for our work. It’s fundamental infrastructure for how we proceed with our work.
22:35 When the internet goes dark or becomes unsafe, our programming kind of grinds to a halt; our partners cannot communicate, they can’t organise, and the movements that they’ve built over many years lose their most powerful tool for coordination and advocacy. And open collaboration requires trust, and trust requires security.
22:54 If many of our partners are worried about surveillance and censorship, then they can’t communicate safely, and then the entire foundation of our work is affected.
23:04 So ultimately, it isn’t just about protecting activists and journalists in high risk environments. Everyone in our ecosystem needs these types of baseline digital security knowledge.
23:15 It’s not just for those who are working to protect digital rights. It’s everybody in the DRG space.
23:22 Through our work, we did identify quite a few gaps. So basically: why are we implementing internet freedom programmes, and the user testing that I’m going to talk about later?
23:33 We realised that many internet freedom solutions and tools remain inaccessible to those who need them the most.
23:39 It could be due to cost, could be technical barriers, language limitations, or just simply not knowing that they exist.
23:45 So there’s really a gap in terms of the accessibility and knowledge. Many of our at-risk users also simply don’t know what’s out there.
23:53 They’re unaware that solutions exist, multiple solutions, for their specific threats, right?
24:01 There’s also a lack of timely support to access tools to circumvent censorship and surveillance. As we all know, internet freedom threats move very fast.
24:09 A government can implement new censorship technologies very quickly and launch a crackdown overnight, but our support systems tend to move a lot slower than that.
24:19 So we are in need of rapid response mechanisms. There’s also a gap in terms of coordination between funders, implementers and technologists in the internet freedom community.
24:29 There are funders with resources, implementers like IRI with the relationships and contextual knowledge, and then the technologists with the tools and solutions.
24:38 But we are often working in silos, so this is where IRI comes in, to try to bridge those silos.
24:45 And now I’m going to move on to talk about our core Internet Freedom programme approach. We first began our Internet Freedom work in 2020, when we provided emergency support to groups within the internet freedom community.
25:00 We actually supported and funded up to 10 tools and research projects so that they could remain operational.
25:06 A lot of these tools are very important for our partners on the ground: civil society activists, independent media. They’re all working to promote rights, especially at the height of Covid.
25:19 Back then, through this first programme that we had, we funded accessibility and usability audits. We supported localisation efforts for these tools, and also helped distribute these tools to our partners, basically making our partners aware that these things exist through tech demos of tools, briefings and all that.
25:43 And then through this specific project, we basically started to bridge the gap between the technologists in the internet freedom space and our partners in the DRG space.
25:52 So it helped the tool developers better understand the needs of our partners on the ground, like the actual threats that they’re facing, and then allowing, on the other side, our partners to basically know more about these tools that are available to them.
26:06 And this is what I mean here by the tools-first approach, and this is how we started this portfolio. So by the end of this programme, we had built up a repository of internet freedom tools.
26:18 So we upskilled ourselves internally, and in turn were able to share a lot of these different solutions and tools with our partners, to help them circumvent censorship and surveillance, promote secure communications, and also provide website security.
26:38 However, this initial approach, as I said, was very tool-centric, because our programme was very much focused on increasing the uptake and awareness of tools among our partners.
26:47 We supported the improvement and localisation of the tools, in the hope that they would better fit the needs of our partners. And then we told our partners: here are all the tools, go use them.
26:59 It may solve your problem, but then again, you know, maybe not. We’re not really encouraging people to use a tool just by giving it to them. There’s a lot in between.
27:13 And hence, our approach shifted to more of a user-first, user-centric approach, and this is where our user testing work comes in.
27:21 Obviously, user testing is nothing groundbreaking, right? Nor is it anything innovative, but it’s quite unique to us in the DRG space, where most of our partners and implementers just aren’t technical people.
27:32 We don’t dabble in these types of tools in the ways that the technologists and developers do. So in turn, by providing the opportunity to our partners to participate in user testing, it’s kind of a unique way for them to be more involved in this space, be more aware, and in turn, right, trust these tools and then want to use them.
27:58 Again, as I said earlier, a lot of the internet freedom tools are not properly tested and integrated into DRG work. So this is IRI’s more intentional approach to achieving that.
28:09 We decided that without proper testing among the communities that are most in need of these tools, their feedback isn’t properly reflected in the tool, and hence it limits the tool’s applicability, also leading to a lack of trust among users, who then will not want to use it.
28:26 Also, user testing allows us to surface usability issues and ensure that the tools meet users’ needs. The developers, in turn, benefit from the direct feedback they’re getting from our partners. And ultimately, to be responsive to our partners’ needs and their gaps in knowledge of the threat landscape, we complemented user testing with a series of digital and cybersecurity trainings. So those two things go hand in hand.
28:54 So I will now talk about the testing methodology. So before I go into the methodology overall, I wanted to explain briefly how we matched our programme’s cohort members to specific internet freedom tools.
29:07 We did a very structured assessment and pairing process for this by first letting our cohort members complete a very extensive digital security assessment that helps us identify their threat landscape, their challenges, what technical expertise they have, their needs, what sort of tools have they used or dabbled with, what works, what didn’t work.
29:29 And then we reviewed it, and had more follow-up calls to understand some of the details they shared more deeply. And then, based on these insights, IRI, together with some of our external partners, helped select relevant tools and match them with each member. So each member got matched with one to two tools specific to their needs, that could hopefully address their security gaps, their operational environment and also their threat environment.
30:01 We also obviously sought out our cohort members’ feedback before the matchings were fully finalised, just to make sure that they felt like, oh yeah, you know, this would be useful for me, versus, oh, I don’t need this, I don’t use this.
30:13 So a lot of feedback overall before we finalised the matchings. And after that, we began coordinating user testing sessions between the members we have in our cohort and the developers of the tools directly. We actually partnered with five internet freedom tools for this specific programme, and we came up with the methodology.
30:34 So the main objectives of our user testing for the five tools were to evaluate the UX/UI and usability of the tools, and specifically to, you know, test onboarding processes for new users, assess access to and usage of core features within the tools and apps, help identify friction points and barriers in user flows, and also understand features and navigational patterns within the tools and apps.
31:03 We ran both individual testing sessions and also more group-focused sessions. For the individual sessions, we asked our members to, first of all, download and access the app, either on their phone or computer, set up the device, establish a connection, find one specific feature in the tool, and then explore that feature’s possibilities.
31:24 This is just a very watered down version of what we asked them, and then after that, we hosted group focus sessions. So it’s more about discussing the user’s experiences and use cases as well.
31:37 We asked our members questions such as how they would use a tool in their context, the ease of navigation and what they would change about the tool, any friction areas or kind of confusion that they’ve experienced as they navigated the app, what they liked about it, what they disliked about it, and then also how their community might use the tool.
32:00 And then the methodology overall measures four key areas. First is time on task, which means the duration needed to complete a task.
32:07 There’s also flow efficiency: how easy is it to navigate the app? Friction areas: any errors that came up, and whether our members were able to recover from the error or got stuck there. And then any pain points: any user interface issues specific to their user journey as they went through it.
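[Editor’s note: the four measures described, time on task, flow efficiency, friction areas and pain points, lend themselves to a simple per-task aggregation. The sketch below is illustrative only; the session record format, field names and numbers are hypothetical, not IRI’s actual instrumentation.]

```python
# Hypothetical aggregation of user-testing session records into
# per-task summaries: average time on task, error counts, how often
# users recovered from errors, and collected pain points.
from statistics import mean

sessions = [
    {"task": "set up device", "seconds": 95, "errors": 1, "recovered": 1,
     "pain_points": ["unclear setup wording"]},
    {"task": "set up device", "seconds": 140, "errors": 2, "recovered": 1,
     "pain_points": []},
    {"task": "start a room", "seconds": 60, "errors": 0, "recovered": 0,
     "pain_points": ["hard-to-find invite button"]},
]

def summarise(sessions: list[dict]) -> dict[str, dict]:
    """Group sessions by task and compute the four measures."""
    by_task: dict[str, list[dict]] = {}
    for s in sessions:
        by_task.setdefault(s["task"], []).append(s)
    summary = {}
    for task, runs in by_task.items():
        errors = sum(r["errors"] for r in runs)
        recovered = sum(r["recovered"] for r in runs)
        summary[task] = {
            "avg_seconds": mean(r["seconds"] for r in runs),   # time on task
            "errors": errors,                                   # friction areas
            "recovery_rate": recovered / errors if errors else 1.0,
            "pain_points": [p for r in runs for p in r["pain_points"]],
        }
    return summary

summary = summarise(sessions)
```

Running the same aggregation after developers ship fixes (the second testing round mentioned below) makes it easy to see whether times drop and friction areas shrink.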
32:28 And then, after the sessions, we worked to collate all the findings and observations, and we made recommendations to the tool developer teams on how they could improve or localise the tools. And then we gave them time to do all that.
32:41 And then after that, we moved on to a second round of user testing again to see if, you know, are some of these pain points fixed, like, are there less friction areas and all that?
32:53 The two main points I want to emphasise here at the end are, first, that we worked very closely with each of the tools and their developers on what their goals were.
33:03 So for instance, for one of the tools that’s more on communications, they want to find out how easy it is, how fast it is for a user to start a communication room using their app.
33:14 Another one is more focused on establishing a secure connection between two people in two separate locations, because they want to use it for data collection, and so on and so forth.
33:24 So we are very targeted in terms of our approach. We are not just testing everything, because it’s not possible to do that within the one-year programme, and it would be a lot to ask of our partners on the ground to participate in that.
33:37 So we made it very targeted and focused. And the methodology also emphasises having users think aloud. So when they’re doing these user testing sessions, we are not asking them to be silent and just do it; we want them to speak out loud about what’s going on and what they’re feeling, to capture real-time reactions.
33:58 So a few successes and challenges here. First of all, this allowed for direct user-developer engagement. Typically, tool developers work in isolation from end users.
34:09 Especially end users who live in high-risk or more repressive environments. Most of the user testing that developers do comes from technical experts or intermediaries, not directly from actual frontline users who face the threats of censorship, surveillance and shutdowns themselves. A lot of times, security concerns, language barriers and lack of trust also prevent direct communication.
34:33 So our approach here enabled the at-risk users, our partners, to work directly with the tool developers, and this helped lead to more authentic feedback and more real-world insights from our partners, and tools can be quickly adapted to meet actual needs.
34:51 It empowers at-risk users. Users feel heard and they feel valued, and it increases their confidence in using a lot of these tools, and such.
35:00 But then, reliance on external funding makes long term sustainability more uncertain. And then obviously we know any kind of follow up, ongoing digital security support, tool support and all that requires a programme to be funded continuously.
35:16 But typically these programmes are not. And then there’s also a communication gap between the tool developers and the non-technical partners that we have.
35:26 So the implementers usually have to help translate between both worlds, to facilitate a lot of these engagements and improve understanding and collaboration.
35:39 So why does this matter for civic and pro-democracy tech projects? I feel like many of the learnings from this programme, and from the user testing approach, translate over to civic and pro-democracy tech projects.
35:53 First of all, direct engagement builds trust and adoption. Like I said earlier, when at-risk users collaborate directly with developers, tools become more intuitive and more context-specific, and that trust translates into higher adoption rates.
36:07 Localisation and accessibility protect services. Adapting tools to local languages and threat environments ensures that civic organisations can deploy solutions with fewer technical barriers, and localised tools minimise misconfigurations and errors that could potentially expose sensitive data or disrupt services.
36:27 Capacity building enhances data protection. So trainings on digital security, safe tool usage equips civic actors to safeguard sensitive information and also prevent breaches.
36:38 All civic tech actors should continue learning best practices for encryption, secure storage and anonymous browsing, and incorporate these elements and this knowledge into their tech projects and solutions, especially when they roll out a certain tool or an app.
36:53 Bridging technical and non-technical communities improves resilience: developers gain insight into real-world challenges, while civic actors understand the capabilities and also the limitations of tools.
37:08 And the last point here is that continuous feedback and iteration counter ongoing threats. Regular user testing, bringing in your end users, and updating them about how the tools work allows the tools to continuously evolve as needs and threat environments change.
37:31 So civic tech projects benefit from tools that remain effective against new censorship methods and surveillance. That is it. Thank you.
37:43 Gemma: Thank you so much. Super interesting work. Thanks for sharing it with us. I’m going to pass over quickly to Jocelyn Woolbright.
39:19 Jocelyn Woolbright: Great. Hi everybody, thanks for having me in this presentation. Hui Hui and Patricia are so great, they work at such great organisations.
39:28 I’ve worked with them in the past in trying to figure out how we provide services to civil society organisations. So it was really great to get started on that side.
39:36 But my name is Jocelyn. I am a Programme Manager at Cloudflare, and my main job is managing a bunch of different projects where we provide free cybersecurity services to nonprofit organisations, civil society organisations and human rights defenders. Anything in the realm of providing free services to vulnerable communities is under my purview.
40:00 For this presentation, I’m going to give you a quick overview of what Cloudflare is, how our technology works, give you an overview of Project Galileo and what services are available for your civil society organisation, and then go from there.
40:15 And really, the goal of my presentation is to tell you that you don’t have to be a technical wizard to get the cybersecurity protection you need for your civil society organisation, to make sure that your external-facing website is protected while your internal data and sensitive information are also secure.
40:37 So let’s get started. I’m going to give you a brief overview of exactly how Cloudflare works.
40:45 So we are a company that helps keep the web secure and safe, and really, at its core, we are a globally distributed network spanning about 340 cities around the world.
41:01 And one of the ways that I like to think about Cloudflare is really as this protective shield in front of your website, a gateway that allows legitimate visitors to reach your site while blocking malicious ones.
41:14 And the way that we do this is that we sit between clients, such as web browsers, and back-end servers, where your website lives, as this buffer for security. So we are essentially the first hop that users have when they’re visiting a website that’s on Cloudflare.
41:35 And because we’re this first hop, we have the ability to shield your website, to make sure that if you are getting spikes of legitimate or malicious traffic, we’re able to keep your website online during these high-traffic periods.
41:53 Since we operate this large global network, we also see a ton of traffic go through it, and see tons of IP addresses, and we also provide free services. So you know, anybody can go to cloudflare.com right now and get a really great level of protection.
42:09 And because we see so much of this internet traffic, we’re able to learn from the different types of attacks that we see hit many of our users and automatically implement these security precautions for you. And I know, especially in the civil society space, it’s really hard to manage.
42:29 You know, if you’re managing your volunteer base, you’re managing the website, you’re working on stories, if you’re a journalist, it can be really difficult to figure out how to be protected against the worst of the worst types of attacks until it’s too late.
42:43 But when it comes to Cloudflare, we really want to make sure that you have these protections in place so you’re not struggling to stay online during a cyber attack.
42:52 So really, the two main points that I want you to get from this are that at Cloudflare we have a huge network. The idea is we store a version of your website at all of these places, and then deliver it to users based on where they are.
43:04 This helps with those influxes of legitimate or malicious type of traffic.
43:09 And then the second is, because we have this huge network, we have millions of websites that use us every day, we are able to learn from these large scale attacks and make sure you’re protected against the worst of the worst when it comes to cybersecurity attacks, and that’s really where I get into Project Galileo.
43:28 So we started this project about 11 years ago because we were seeing this trend of cyber attacks against really important and vulnerable voices on the internet.
43:39 They were getting hit with these DDoS attacks, or ransomware attacks where attackers were trying to steal really sensitive internal data. And at Cloudflare, like I said, we’ve always had a free plan, which can give you a great level of protection. But when it comes to those working in the civil society space, as Patricia and Hui Hui said, they need another level of service because of the work that they’re doing, and also because, when it comes to attacks, you’re not a technical expert in how to protect your website, because you have so many other hats that you’re wearing within your organisation.
44:17 So the idea of Galileo was: let’s provide an upgraded level of service, and also expertise in making sure you know how to be protected and use our services, to organisations that are working in democracy building, journalism, media or the nonprofit space.
44:35 So right now, 11 years later, we have 3,000 organisations from around the world that are protected under Project Galileo, and we work with more than 56 civil society partner organisations that really help us provide this protection to the organisations that need it.
44:55 And so, I’ve given you an overview of Cloudflare and a little bit about Project Galileo. I want to show you a couple of examples of threats that we see against civil society organisations, and then show you a little bit about how Cloudflare can help keep your website and internal data secure.
45:10 So unfortunately, this is a lot of the threat landscape and a lot of the stories that I hear working with organisations every day when it comes to Project Galileo, and we’re really only seeing these types of cyber attacks grow and become more sophisticated.
45:28 So when it comes to civil society and human rights, we have about 800 organisations protected under Project Galileo that fall into this group. And because we have so many of these organisations around the world working in this space, our goal is really to identify the types of threats that we’re seeing against these civil society organisations, to help you better understand what these threats look like and how to use our products to mitigate against them.
45:59 So for example, a lot of what we do every year is try to understand what types of threats you’re seeing and what products actually block those threats.
46:08 We’re seeing a lot of these threats against civil society organisations, and a lot of them were mitigated by our web application firewall.
46:16 So when it comes to the WAF, what we’re doing is mitigating against SQL injections, which are hackers trying to steal really sensitive data from your internal applications.
46:28 These are very common and easy attacks to mitigate against if you have something like a web application firewall in place, which is what we provide under Project Galileo.
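A WAF blocks these attacks at the network edge, but the same class of attack is also neutralised in application code by parameterised queries. As a minimal illustration, using Python’s standard sqlite3 module (the table and data here are hypothetical, not from the talk):

```python
import sqlite3

def find_user(conn, username):
    # Parameterised query: the driver binds `username` as data,
    # so input like "x' OR '1'='1" cannot change the SQL itself.
    cur = conn.execute(
        "SELECT id, username FROM users WHERE username = ?",
        (username,),
    )
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.execute("INSERT INTO users (username) VALUES ('alice'), ('bob')")

print(find_user(conn, "alice"))         # matches only the real row
print(find_user(conn, "x' OR '1'='1"))  # injection attempt matches nothing
```

A WAF adds a second layer of the same defence for sites whose code you cannot easily audit or change.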
46:39 Now, one of the questions I probably get twice a week from organisations protected under the project is, what types of firewall rules should we be enabling for our civil society organisation?
46:53 And as I mentioned earlier, because we operate this huge global network and are really the first hop for users when they’re browsing a website on Cloudflare, this means that we can learn a lot about malicious actors that are on our network and be able to automatically implement these firewall rules for you, so you don’t have to be a technical wizard to be really protected against the worst of the worst that we see on the internet.
47:19 Now I’m going to give you two different scenarios. The first one is kind of what we’ve been seeing, especially in the media and journalism space. And we’ve been seeing this trend of these large DDoS attacks against journalism and civil society.
47:34 So I imagine that many of you have, unfortunately, probably been hit by some type of DDoS attack, which is really an attempt to overwhelm a server, making it inaccessible for legitimate users.
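DDoS mitigation combines many techniques, but one common building block is per-client rate limiting. As a simplified sketch (not Cloudflare’s actual implementation), a token bucket lets a client burst briefly and then throttles further requests:

```python
class TokenBucket:
    """Allow at most `rate` requests/second per client, with bursts up to `capacity`."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.last = 0.0

    def allow(self, now):
        # Refill tokens in proportion to elapsed time, then spend one if available.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=1, capacity=5)
# A flood of 10 requests at t=0: the first 5 pass, the rest are dropped.
results = [bucket.allow(0.0) for _ in range(10)]
print(results.count(True))  # 5
```

In a real mitigation service the same idea runs at the network edge, keyed per source IP or fingerprint, so floods are absorbed before they reach the origin server.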
47:45 And this was actually a case that we saw against one of our civil society partners, the International Press Institute. They were helping about 40 Hungarian media organisations that were reporting about government corruption in the country, and there was a coordinated DDoS attack against these 40 media organisations at about the same time.
48:08 And it’s actually interesting, because the attacks weren’t random. They were often happening right after the outlets posted critical articles about the government.
48:18 Meanwhile, we saw that pro-government media outlets remained untouched, which suggests a politically motivated campaign to suppress independent reporting in Hungary during this time.
48:33 And then scenario two: I talked about DDoS, but this is a scenario where we’re seeing attackers looking to deface websites to spread misinformation, fear or propaganda.
48:46 And this really looks like replacing the homepage or other pages with a message or image, usually to promote some type of ideological or personal message.
48:58 And the way that hackers will do this is typically by exploiting some type of vulnerability in the website’s code, its servers or third-party software, to be able to pull off this type of attack.
49:11 And for this one, there’s a couple of different ways that attackers could have gained access to this specific organisation’s website.
49:20 They could have had weak passwords. It could have been a phishing attack, a website vulnerability, or a lack of multi-factor authentication.
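Multi-factor authentication closes off the weak-password route. As an illustration of what an authenticator app actually computes, here is a minimal sketch of the TOTP algorithm from RFC 6238 (the secret below is the RFC’s published test value, not something to use in production):

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, unix_time: int, step: int = 30, digits: int = 6) -> str:
    # RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter,
    # then dynamic truncation down to a short numeric code.
    counter = struct.pack(">Q", unix_time // step)
    mac = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test secret; at t=59 seconds this yields the documented code 287082.
print(totp(b"12345678901234567890", 59))
```

Because the code depends on a shared secret and the current time, a phished password alone is no longer enough to log in.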
49:30 And one of the things that’s interesting is that a lot of times, unfortunately, we will see attackers launch a DDoS attack while also exploiting web vulnerabilities at the same time, which can make it really hard for a civil society organisation to figure out what they should be doing to be protected against these types of attacks.
49:52 And that’s really where Cloudflare and Project Galileo come in. We have tons of products here at Cloudflare that fit many different audiences, but when it comes to Project Galileo, what we’re really focused on is the application services side of the world.
50:10 So if you take a look at the right side of the screen, what we provide under Galileo is tools to make sure that your external-facing website is secure against these influxes of malicious DDoS traffic, but also against influxes of legitimate traffic.
50:26 And I’ll give you an example. Typically during election times, we will have trusted media organisations that are reporting on election results, where people are looking for authoritative information about where polling places are, or what the results of the elections are, and they’ll get these influxes of legitimate traffic from people trying to figure out who won the elections.
50:50 So we’re able to help against the malicious influxes, but also the legitimate ones, from users trying to find authoritative information about a specific topic.
51:00 So when it comes to Project Galileo, we provide all of that: external-facing web application firewall, DDoS mitigation, content delivery network services, and protection from bots, which brings me to some of the AI crawler work, because I saw a lot of that in the comments, which I’m really happy to see.
51:16 So we provide that under our standard Project Galileo services. And then also, a couple years ago, we started offering our zero trust services, and that’s really thinking about internal team security.
51:29 And a lot of this was because, during Covid-19, we were seeing many nonprofit organisations having to work remotely and having to figure out: OK, how do we make sure that our distributed volunteer force or civil society organisation is able to access really sensitive internal data without exposing it to the whole internet?
51:50 So we have a lot of tools when it comes to internal security, to make sure that everybody from your organisation can get into sensitive internal data applications, whether they’re self-hosted or a SaaS application, and to make sure that you can automatically implement things like two-factor authentication, as well as other types of tools, so that your organisation is secure from things like a phishing attack, and so that you have visibility into who is on your network.
52:25 Now, this is a lot of information to throw at a very small civil society organisation that doesn’t necessarily have the IT team of a large company. But that’s one of the great things about Cloudflare: it’s really easy to get started. Since we provide services to more than 3,000 internet domains under Project Galileo, we’ve really learned the best way to talk about our products, and to make sure that you get the support and resources you need when it comes to using these products for your organisation.
52:58 Now I’m going to quickly get into a new tool that we started offering under Project Galileo, because in the chat I know many organisations mentioned that AI crawlers have been causing a little bit of trouble for their organisation. One of the things that we started to offer under the project is AI crawler control services for all organisations under Project Galileo.
53:23 And a lot of this is because we’ve seen that, for organisations whose missions rely on creating or distributing content, AI crawlers and AI scraping are a really big problem.
53:39 So this new tool really provides you visibility into knowing exactly what AI crawlers are accessing your website, while also giving you the ability to either block or allow them to crawl your site as well.
53:56 So we’re trying to give you more tools to understand how you can better protect your content, while also giving you more control over whether you want to block specific crawlers or only allow certain types to access your website.
54:13 So this is actually a really great tool that we announced last month to make sure that you have control over how AI is engaging with your civil society organisation.
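Conceptually, crawler control is a per-request decision based on the User-Agent header. This sketch is purely illustrative (the real controls live in the Cloudflare dashboard, and which bots to allow is your choice; the bot names below are common AI crawler user-agents):

```python
# Illustrative AI crawler user-agent tokens and a hypothetical allow-list.
AI_CRAWLERS = {"GPTBot", "CCBot", "ClaudeBot", "Google-Extended"}
ALLOWED = {"Google-Extended"}  # e.g. allow one crawler, block the rest

def crawler_decision(user_agent: str) -> str:
    # Classify the request: known AI crawlers are allowed or blocked
    # per policy; ordinary visitors pass straight through.
    for bot in AI_CRAWLERS:
        if bot.lower() in user_agent.lower():
            return "allow" if bot in ALLOWED else "block"
    return "allow"

print(crawler_decision("Mozilla/5.0 (compatible; GPTBot/1.0)"))  # block
print(crawler_decision("Mozilla/5.0 Firefox/140.0"))             # allow
```

A managed service adds the part this sketch cannot do: keeping the list of crawlers current and catching bots that lie about their User-Agent.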
54:23 So I threw a ton of information at all of you, but I really just want to let you all know that when it comes to getting the cybersecurity resources that you need under Project Galileo, we have a ton of information: we have an impact portal that goes through exactly how to onboard an organisation to Cloudflare, how to use all of our different tools, how to use our application services or zero trust tools, and what that looks like for a small civil society organisation, as well as a ton of other resources that are available to you under Project Galileo.
54:59 So really, the next step, if you’re interested in learning more about the project and what services we can provide: I definitely recommend going to cloudflare.com/galileo, and I’m happy to put that in the chat as well.
55:12 Or you can reach out to me directly. I’m happy to chat with you about Project Galileo and the services that we provide, and answer any questions you might have along the way. So thank you so much. I’m happy to hand it off to Gemma.
55:27 Gemma: Thank you so much. I think we’ve got about one minute for some questions. So we’ve got one from Martine, for Jocelyn: what’s the benefit of being part of Project Galileo when there are outages? Obviously, there was big news yesterday about Cloudflare outages. Jocelyn, are there any benefits of being part of the project?
55:49 Jocelyn: Yeah, no, that’s a really great question. And I think when it comes to Project Galileo, what we provide under the project is free business level services. So at CloudFlare, we have a free plan, which you can go to right now and get a great level of service.
56:03 When it comes to Galileo, you have a dedicated support line. So if you’re running into any technical issues, let’s say with a web application firewall rule that you enable that doesn’t produce the exact result you want, you have access to that team, as well as dedicated security guides, access to our impact portal, so a ton of resources that are available for you, and also new products that we come out with.
56:29 So for example, with the AI crawler control, we are offering that at an elevated level to organisations under Project Galileo, because we’ve been having chats with many of our civil society partners about what they should be doing and how organisations should interact with AI crawlers.
56:49 So a lot of Galileo is that we’ll offer new products, which are typically paid, to organisations under Galileo. But when it comes to outages, one of the things with Cloudflare is that transparency is really important to us when it comes to an outage.
57:08 And we published a blog post last night about an outage that we had yesterday, which was caused by an internal configuration issue. That’s really important to us, so if you’re interested in learning more about the outage, you can definitely check out our blog. But for Galileo, it’s very much that we want to give you the tools that you need to keep your civil society organisation secure.
57:31 Gemma: Brilliant. OK, well, we’ve run out of time. Thank you again to our speakers, and thank you, everybody, for coming.
57:39 I’m sure we could do a whole other TICTeC session about the pay-per-crawl thing, and thank you, Marion from 360Giving, for sharing your experiences there.
57:49 But yeah, there’s more to talk about. But thank you everybody, so much. And have a good morning, afternoon, evening, wherever you are.
Credits
- Image: Buffik
- Music: Cyber Dreams by Ilex