Predictive Policing and Community Organising

Session Report Back Privacy Camp 

Last week, we attended the Privacy Camp in Brussels, where we co-facilitated a short session on so-called ‘predictive policing’ alongside Statewatch and Technopolice Belgium.  

The aim of the session was to introduce recent research and gather the collective knowledge of attendees to explore potential strategies for challenging police tech.

The session resulted in a lively exchange of inspiring examples and strategies of community organising in the context of (data-driven) policing of marginalised communities. We thought these were worth sharing in this report back, along with reflections on some of the conversations during the session.

Current state of data-driven policing

Before giving the space to participants to exchange, Statewatch kicked off the session by introducing its recent publication, New Technology, Old Injustice: Data-driven discrimination and profiling in police and prisons in Europe. Based on in-depth research by organisations in Belgium, France, Germany, and Spain, this report discusses the implementation of data-based crime ‘prediction’ systems in these contexts. The publication is a highly recommended read, including its analysis of the vague, unclear, and ultimately unhelpful limitations that the AI Act places on ‘prediction’ systems. This research is yet another testimony to the patterns we see again and again and again: the same racial and socio-economic profiling, the same criminalisation that is fundamental to policing and its systems. The same tiring tech sales pitch of innovation – whatever that may mean – by the same horrible companies, such as Clearview or Eurocop.

An extensive summary as well as full reports in English, French, German, Spanish, and Flemish can be found here.  

Counter Strategies

We organised the participants into five groups to encourage lively exchange and sharing. In our breakout group, everyone had the chance to share real examples of organising they knew well. Many of these examples focused on meeting practical, everyday needs and included clear, hands-on ways to support.

Community-organised digital security support 

The first project shared came from the organisers of Resist Berlin, who support people targeted by immigration control by cleaning smartphones, preventing the inadvertent exposure of data during phone extraction.

[more on phone extraction by immigration control]

In Germany, as in many other countries, authorities extract data from the mobile phones of asylum seekers and migrants to verify identity, country of origin, and asylum claims. This includes call logs, messages, and location history, with no consideration for privacy whatsoever. Moreover, the extracted data is interpreted through a frame of suspicion, with authorities looking for anything that could undermine the asylum claim.

Participants agreed that this type of low-threshold digital security support, provided by experts, is essential for targeted communities. However, it remains relatively scarce and inaccessible to those who need it most. Often, digital security projects are not embedded in community organising, but are instead delivered as one-off trainings by experts who parachute in, which is not the most effective form of protection.

Digital alert and warning tools  

A second set of examples participants brought forward concerned monitoring and warning systems for stop-and-search actions, police controls, and immigration raids. These enable people to avoid areas of risk, or to mobilise solidarity responses and emergency support where needed.

Participants from the US context shared initiatives of anonymous, community-driven tools for reporting and receiving alerts about ICE (US Immigration and Customs Enforcement) activity, such as Red Dot. These apps are similar to crowd-sourced speed trap warnings, like those in Google Maps and Waze, that notify users of nearby police presence or surveillance cameras. Unfortunately, some days after our session, Google’s and Apple’s app stores removed the apps following pressure from the US Department of Justice (which used the laughable argument that the apps share the location of ‘a vulnerable group’, namely ICE officers).

While apps such as Red Dot are more advanced collective warning tools, we were all familiar with local examples of Signal or Telegram groups that share alerts about various police controls. The upside is that these do not depend on app stores and are less vulnerable to censorship by Big Tech. The downside is that such large chat channels are less anonymous.

“We saw a need and we just felt a sense of urgency to fill it”

Other organising work in the brutal context of repression against undocumented communities in the US was shared as well, such as support networks providing grocery delivery services to families who are afraid to leave their homes due to fear of immigration raids. While there is beauty in the solidarity and care of such support networks, it is heartbreaking and enraging that this work is needed because people are not safe to go outside.   

Kids of Colour: community support, case work, and know your rights  

The Manchester-based Kids of Colour was mentioned as an anti-racist youth work organisation, whose work includes ‘Know Your Rights’ sessions, case work, and other support to challenge racist policing. Recently, they published a new guide in collaboration with Liberty (pdf), setting out the rights of youth service providers when it comes to sharing information with the police. Such guidance is especially relevant as many ‘predictive policing’ schemes demand data and collaboration from social care providers, schools, and so on.

Another strategy mentioned in the context of care providers was finding ways of ‘soft sabotage’ to disrupt collaboration with policing and surveillance, practised by doctors, schools, and others. Although such attempts are mostly clandestine and not public, there are some examples of more public campaigns, such as Docs not Cops, or university staff opposing invasive apps that monitor student attendance, like the iSheffield app. Since most of these examples are from the UK, examples from other contexts would help us better understand the kinds of disruption that are possible.

Monitoring, advocacy and campaigning 

Many organisations work on monitoring racist police violence and surveillance, and on exposing police brutality through public campaigns. SOS Racisme Catalunya was mentioned specifically, but of course there are many others doing great work on these fronts.

Community action research  

The Catalan context brought the conversation to our survey of police technologies in the Catalan region, which was the result of a collaborative effort with organisers in Barcelona and an example of research in service of communities.  

Responding to needs 

Someone in the group brought forward a dilemma familiar to many working in more institutionalised settings, or those further removed from on-the-ground realities: a sincere interest in collaborating more with those most targeted, but a lack of connections and uncertainty about where to start. And although it might be somewhat of a cliché, in the conversation that followed, we concluded that it is most important to start from a concrete need that people experience, rather than from a general problem as perceived through a distant analytical framework. “We saw a need and we just felt a sense of urgency to fill it”, explained one of the participants. This sounds far simpler than it is in messy and challenging on-the-ground realities, but it is a point of departure that is far too often forgotten.

Opportunities for collaboration

An urgent issue for communities, discussed in one of the other breakout groups, is the rise of predictive policing and profiling by banks under the pretext of combating money laundering, with little accountability. This can result in bank accounts being suspended, significantly impacting people’s daily lives. Developing strategies to support individuals in these cases would be a valuable starting point for future discussions in the digital rights arena.

All breakout groups showed a genuine interest in fostering more exchange and collaboration across sectors and borders. Someone observed that, with the rapid pace of implementation, especially given the increasing push for AI investments, it can be tough to keep up with all the data-driven policing projects happening around the world. Greater cross-border coordination would be beneficial, as many of these technologies are deployed in different countries.

The ongoing collaboration and open exchange of ideas between frontline organisers, researchers, journalists, and digital rights activists remain vital in helping us effectively tackle these harms. Creating lasting spaces and hosting annual events like the Privacy Camp are essential parts of this exchange infrastructure, helping to sow the seeds for future work and cooperation.

We would like to thank Statewatch and Technopolice Belgium for organising the session with us, and EDRi for another excellent Privacy Camp and their support for travel and accommodation.

Sanne Stevens

Sanne Stevens is Co-Director of the Table of Justice, Equity and Technology. She has many years of experience working with civil society organisations in the field of technology and digital safety. Her interests are critical analysis and organising that address the underlying power structures of data-driven technology and depart from a deliberate social justice framework.
