We create spaces of convening across Europe to confront discriminatory policing and the use of data-driven technologies. Our aim is to expose and confront the harms inflicted by new forms of surveillance and control in the contexts where they are deployed, working from the ground up and building on the needs and experiences of racialised and marginalised communities.
Data-driven policing and discrimination as an emerging issue
There is widespread evidence of racial profiling by police forces all over the continent, while issues of institutional racism are hardly addressed. Anti-immigrant policies that lead to increased control and checks on the streets and at borders further exacerbate the criminalization and disproportionate targeting of racialized communities. Meanwhile, counterterrorism programs legitimize far-reaching and intrusive surveillance projects that result in the over-policing and stigmatization of Muslims.
While data-driven policing (or, more simply, police tech) is still at an early phase of adoption across Europe, its deployments so far have had adverse effects on minority ethnic and marginalised communities. Members of these communities are treated as fodder for police tech experiments, left to shoulder the emotional, psychological, and material consequences of police tech mistakes, distortions, and mistreatment. Meanwhile, the state and private sectors make spurious claims of public benefit and public safety, driving the tech boosterism of the digital surveillance industry.
Groups on the ground bear witness to the harms of police tech, but the experiences of racialized communities appear as afterthoughts in policy and public debates. Most spheres of policy and oversight addressing data-driven automation are limited by their exclusivity, featuring mostly expert groups who tend to overlook the specific harms and everyday realities of the communities most affected by the onset of data-driven policing. In addition, organizers lack the capacity to learn from other contexts and communities that are questioning the baseline assumptions, deployments, and outcomes of police tech.
In this context, the Justice, Equity and Technology Project seeks to build a broader set of community-centred visions and strategies for confronting discriminatory technologies. We aim to abolish the systemic injustice perpetuated by discriminatory technologies and envision a world where all people live the lives they value, free from exploitation, dispossession, and violence.
Combining a collective understanding of racialized criminalisation with insights about the incursion of new technologies into contemporary policing, we aim to build strategies and shift the narrative.
Aims
1
Unpacking and understanding violent, discriminatory technologies.
We seek to make sense of the motivations, drivers, institutions, and histories behind police tech and other discriminatory technologies, and of their impacts on people and their communities. We wish to surface what communities are already doing in the face of discriminatory technologies.
2
Strengthening the collaborative capacity of justice and equity organizers engaging with technology governance.
We bring together groups and individuals to engage in coordinated action. We aim to discover parallels between different experiences of affected communities and to share strategies of community organizing and advocacy.
3
Building a better world with alternative visions of collective wellbeing.
We provide space to discover and develop new models of safety and belonging that counter racialized criminalization. We uplift policies and practices of community care and resilience.
Principles
These principles form the basis of our collaboration. We consider this a living document that forms part of an ever-evolving collective learning process.
We are guided by the principles of justice, equity, co-liberation, and care. We understand that technology both holds and reflects power in society.
We recognize that the deployment of technological systems exacerbates existing injustices. We see the structural and institutional harms perpetuated by the use of automated systems as a continuation of histories of oppression and colonization.
We thank all the organizers who inspire our work and principles, such as the Movement for Black Lives, Our Data Bodies, Abolitionist Futures, the Center for Intersectional Justice, the Stop LAPD Spying Coalition, the Coalition for Critical Technology, and the Detroit Digital Justice Coalition.
- Technology is and reflects power
The deployment of technological systems exacerbates existing injustices. The structural and institutional harm perpetuated by the use of automated systems can be understood as a continuation of histories of oppression and colonization.
- Intersectionality
We believe that all forms of oppression are interlinked and that they need to be addressed simultaneously. We see intersectionality as a guide to connect and support each other’s struggles and to foster systemic change.
- Transformation
We support each other’s learning, recognizing that we will make mistakes, and we are committed to learning from them to accomplish our collective aims. We see individual and collective reflexivity as an inseparable part of our work and value human relationships as fuel for social change.
- Solidarity
We are committed to acting in solidarity for the collective liberation of all, and in particular of people who have traditionally been targeted and marginalized.
- Critical Engagement
We investigate questions of power around technology. We center lived experiences and prioritize the participation of people who have been historically and structurally excluded from and attacked by technological systems.
We value diverse tactics and strategies as part of the ecology of our movements. We recognize and are inspired by the work done by other organizers.
Members
Border Violence Monitoring Network ↗
Documenting illegal pushbacks and police violence at the European borders
Stop The Scan ↗
Campaign against the police rollout of mobile fingerprint scanners in the UK
The Northern Police Monitoring Project (NPMP) ↗
Grassroots organisation challenging police harassment, violence, and racism in Greater Manchester
Fundación Secretariado Gitano (FSG) ↗
Defending the rights of the Roma community in Spain and Europe
Homo Digitalis ↗
The only digital rights civil society organisation in Greece
Patrick Williams ↗
Lecturer and social researcher
AlgoRace ↗
Bringing an anti-racist perspective to the public debate on AI and bringing it closer to migrant and racialized communities in Spain
GHETT’UP ↗
Organising youth empowerment in the banlieues of Paris
No Tech For Tyrants (NT4T) ↗
Student-led organisation against violent technology and hostile immigration environments
StopWatch ↗
Turning a spotlight on stop and search, campaigning against the over-policing of marginalised communities
Transbalkan Solidarity ↗
Solidarity network of more than 500 individuals and civil society groups from across the Balkans and beyond, for and about people on the move
Technopolice ↗
Campaign coordinating the fight against new policing tech in France
People
Esra Özkan ↗︎
Co-Director Justice, Equity and Technology Project
◼
Esra is based in Izmir, Turkey. She has a background in social justice organising and facilitation. Her work focuses on movement building and on individual and community transformation as pathways to experiencing freedom and justice in our lifetimes. Previously, she was part of LABO vzw, a movement of critical citizenship in Belgium. She worked at the European Network Against Racism (ENAR) as Network Development Officer and at Merhaba vzw as Movement Project Coordinator.
Sanne Stevens ↗︎
Co-Director Justice, Equity and Technology Project
◼
Sanne Stevens is a researcher, trained facilitator, and advisor. Her work focuses on how data-driven technologies reinforce and transform power structures, on surveillance technologies, and on strategies of subversion and resistance. Besides conducting critical research, she facilitates workshops and meetings for activists, journalists, and civil society worldwide on basic digital security practices, surveillance, data tracking, and strategies against online harassment.
Seeta Peña Gangadharan ↗︎
Associate Professor, Principal Investigator and Founder
◼
Dr Seeta Peña Gangadharan is Associate Professor in the Department of Media and Communications at the London School of Economics and Political Science. Her work focuses on inclusion, exclusion, and marginalization, as well as questions around democracy, social justice, and technological governance. She currently co-leads two projects: Our Data Bodies, which examines the impact of data collection and data-driven technologies on members of marginalized communities in the United States, and Justice, Equity, and Technology, which explores the impacts of data-driven technologies and infrastructures on European civil society. She is also a visiting scholar in the School of Media Studies at The New School, Affiliated Fellow of Yale Law School’s Information Society Project, and Affiliate Fellow of Data & Society Research Institute.
Advisory Board
The Advisory Board of the Justice, Equity and Technology Table serves as an essential sounding board and critical friend in relation to the overall direction and activities of the Table.
Bogdan Kulynych ↗︎
PhD researcher at the EPFL Security and Privacy Engineering Lab (SPRING)
◼
Bogdan Kulynych works on privacy and security as they relate to socio-technical systems. He is interested in studying the harmful effects of machine learning, algorithmic, and optimization systems, and in developing defences against these effects by leveraging security and privacy techniques and principles. Bogdan is also a co-organizer of the Participatory Approaches to Machine Learning workshop.
Eric Kind ↗︎
Managing Director at AWO
◼
AWO is a new agency of lawyers, policy experts, technology analysts, and applied ethicists working to shape, apply, and enforce data rights. Eric Kind works as a legal and public policy expert in technology, society, and human rights, with particularly deep expertise in surveillance technology law and practice. He previously led public policy development efforts in complex technology policy areas such as security and intelligence, dual-use export controls, artificial intelligence and algorithmic decision making, the gig economy and the future of work, cyber security, competition, data protection, and platform accountability. He has also led coalitions of NGOs reforming surveillance laws and was the Deputy Director at Privacy International.
Sarah Chander ↗︎
Senior Policy Advisor at European Digital Rights (EDRi)
◼
EDRi is the biggest European network defending rights and freedoms online. Sarah Chander leads EDRi’s policy work on AI and non-discrimination with respect to digital rights. She is interested in building thoughtful, resilient movements and looks to make links between the digital and other social justice movements. Sarah has experience in racial and social justice; previously, she worked in advocacy at the European Network Against Racism (ENAR) on a wide range of topics including anti-discrimination law and policy, intersectional justice, state racism, racial profiling, and police brutality. Before that, she worked on youth employment policy for the UK civil service. She was actively involved in movements against immigration detention. She holds a master’s degree in Migration, Mobility and Development from SOAS, University of London, and a law degree from the University of Warwick.
Grants & Awards
LSE Knowledge Exchange and Impact (KEI) Fund
To: Justice, Equity and Technology Table (Seeta Peña Gangadharan)
Total: £97,131
Period: Jan.2020–Jul.2021
Luminate
To: Justice, Equity and Technology Table (Seeta Peña Gangadharan)
Total: $75,000
Period: Nov.2021–Jun.2022
◼
To: Justice, Equity and Technology Table (Seeta Peña Gangadharan)
Total: $50,000
Period: Jul.2022–Jul.2024
Open Society Foundations
To: Justice, Equity and Technology Table (Seeta Peña Gangadharan)
Total: $329,000
Period: Jan.2022–Jul.2023
MacArthur Foundation
To: Justice, Equity and Technology Project (Seeta Peña Gangadharan)
Total: $300,000
Period: Jul.2022–Nov.2024
Please get in touch if your work touches upon these issues!
Whether you are an organiser or researcher working on data-driven tech, policing, or discrimination, or you are documenting harms or collecting organising stories, we would be happy to connect.