EU Artificial Intelligence Act: civil society calls for regulating surveillance technology
Increasingly, in Europe as in the rest of the world, Artificial Intelligence (AI) systems are built and deployed for harmful purposes linked to state surveillance. As negotiations between EU legislators on the EU AI Act are ongoing, civil society organisations, including SOLIDAR & SOLIDAR Foundation, are calling on policy-makers to set strict limits and scrutiny mechanisms on the use of AI systems by law enforcement, migration control and national security authorities. Read and share the statement below, coordinated by European Digital Rights (EDRi) and Access Now.
SOLIDAR is also endorsing the #ProtectNotSurveil campaign, which calls for the protection of migrants and other people on the move in the EU AI Act.
EU policymakers: regulate police technology! Civil society calls on the EU to draw limits on surveillance technology in the Artificial Intelligence Act
As AI systems are increasingly used by law enforcement, migration control and national security authorities, the EU Artificial Intelligence Act (AI Act) is an urgent opportunity to prevent harm, protect people from rights violations and provide legal boundaries for authorities to use AI within the confines of the rule of law.
Increasingly, in Europe and around the world, AI systems are developed and deployed for harmful and discriminatory forms of state surveillance. From the use of biometrics for identification, recognition and categorisation, to predictive systems in various decision-making and resource allocation capacities, AI in law enforcement disproportionately targets already marginalised communities, undermines legal and procedural rights, and enables mass surveillance.
When AI systems are deployed in contexts of law enforcement, security and migration control (including the policing of social security), the power imbalance between the authorities and the surveilled is even more profound. This means there is an even greater risk of harm and of violations of fundamental rights and the rule of law.
This statement outlines the urgent need to regulate the use of AI systems by law enforcement, migration control and national security authorities throughout Europe.
We point to the specific dangers to freedom of assembly, liberty, the right to asylum, privacy and data protection, the right to social protection, and non-discrimination when such technology is deployed by those authorities.
Civil society organisations are calling for an AI Act that prevents unchecked forms of discriminatory and mass surveillance. In order to uphold human rights and prevent harm from the use of AI in policing, migration control and national security, the EU AI Act must:
- Include legal limits prohibiting AI uses that pose an unacceptable risk to fundamental rights. This includes a legal prohibition on different forms of biometric surveillance, predictive policing, and harmful uses of AI in the migration context.
- Provide public transparency and oversight when police, migration and national security agencies use ‘high-risk’ AI, by upholding an equal duty on these authorities to register high-risk uses in the EU AI database.
- Ensure that the AI Act properly regulates the uses of AI in policing, migration and national security that pose a risk to human rights, specifically by covering the full list of AI uses in migration control and by ensuring that national security is not excluded from its scope.
Why the EU AI Act needs to regulate the use of AI in law enforcement, migration and national security:
- Checks on state and police power are essential to the functioning of a democratic, rights-based society. The AI Act is intended to recognise and regulate high-risk uses of AI and, where necessary, prohibit them where the threat to fundamental rights is too great. Uses of AI by state authorities in the fields of policing, migration and national security are amongst the highest-risk use cases, because they most acutely impact fundamental rights including freedom of assembly and expression, the right to a fair trial, the presumption of innocence, non-discrimination, and the right to claim asylum. The work of police, migration and security authorities governs access to public space, outcomes in the criminal justice and migration sectors, and various other areas of life with the highest impact on fundamental rights. As such, the use of AI by these authorities calls for the greatest scrutiny and transparency, and requires the clearest boundaries to uphold basic democratic principles.
- The use of AI in the fields of policing, security and migration amplifies structural discrimination against already marginalised and over-surveilled communities, such as racialised people, migrants and other groups facing discrimination. Mounting evidence demonstrates that such AI systems reinforce the over-policing, disproportionate surveillance, detention and imprisonment of groups that are structurally discriminated against. The data used to create and operate such systems reflects historical, systemic, institutional and societal discrimination; this discrimination is so fundamental and ingrained that all such systems will reinforce these outcomes. Prohibitions, public transparency and accountability frameworks are necessary to prevent harms and to empower people to challenge them.
- The use of AI in the fields of policing, security and migration invites private sector influence into core aspects of public governance, requiring even stronger oversight and legal limits to ensure that people’s rights are upheld. As these fields are government functions, it is crucial that the AI Act ensures that the private sector’s development of AI in these fields is publicly transparent. AI systems deployed in areas of policing, migration and national security must be accountable first and foremost to fundamental rights standards and the rule of law, rather than to profit motives. As such, safeguards, oversight and legal limits must apply.
Read civil society’s detailed recommendations on how the EU AI Act must be amended in these areas.
Signed (as of 20th September 2023):
- European Digital Rights (EDRi)
- Access Now
- AlgoRace
- Algorights
- AlgorithmWatch
- All Out
- Àltera
- AMERA International
- Amnesty International
- Angela Daly – Professor of Law, University of Dundee, Scotland, UK
- Anita Okoro
- ApTI – Asociația pentru Tehnologie și Internet
- Asia Indigenous Peoples Pact
- Aspiration
- Association for Legal Studies on Immigration (ASGI)
- Association Konekt
- Association of citizens for promotion and protection of cultural and spiritual values Legis Skopje
- ASTI asbl – Association de soutien aux travailleurs immigrés
- AsyLex
- Bits of Freedom
- Bridget Anderson – University of Bristol
- Bulgarian Center for Not-for-Profit Law (BCNL)
- Centre for Information Technology and Development (CITAD)
- Centre for Peace Studies
- Chaos Computer Club e.V.
- Chiara De Capitani (PhD, Università degli Studi di Napoli “L’Orientale”)
- Civil Liberties Union for Europe
- Comisión General de Justicia y Paz de España
- Controle Alt Delete
- Corporate Europe Observatory (CEO)
- D64 – Zentrum für Digitalen Fortschritt e. V.
- Danes je nov dan, Inštitut za druga vprašanja
- Democracy Development Foundation
- Digital Ethics Center / Skaitmenines etikos centras
- Digitalcourage
- Digitale Gesellschaft
- Digitale Gesellschaft
- Dr Derya Ozkul
- Ekō
- Electronic Frontier Finland
- Elektronisk Forpost Norge (EFN)
- Elisa Elhadj
- epicenter.works
- Equipo Decenio Afrodescendiente
- Ermioni Xanthopoulou
- Eticas
- EuroMed Rights
- European Anti-Poverty Network (EAPN)
- European Center for Not-for-Profit Law
- European Civic Forum
- European Movement Italy
- European Sex Workers’ Rights Alliance (ESWA)
- Exploring Womanhood Foundation
- Fair Trials
- Fair Vote UK
- Francesca Palmiotto, Hertie School
- Fundación Cepaim
- German NGO Network against Trafficking in Human Beings – KOK
- Gernot Klantschnig, University of Bristol
- Glitch
- Greek Forum of Migrants
- Homo Digitalis
- Human Rights Association (İHD)
- I Have Rights
- IDAY Liberia Coalition Inc
- Instituto de Asuntos Culturales
- International Commission of Jurists
- International Women* Space e.V.
- Irish Council for Civil Liberties (ICCL)
- King’s College London
- KISA – Equality, Support, Antiracism
- La Quadrature du Net
- Legal Center for the Protection of Human Rights and the Environment (PIC)
- Legal Centre Lesvos
- Liberty
- Ligue algérienne pour la défense des droits de l’homme
- Ligue des droits de l’Homme (France)
- Ligue des droits humains (Belgium)
- LOAD e.V.
- Lorenzo Vianelli (University of Bologna)
- Mallika Balakrishnan, Migrants Organise
- Migrant Tales
- Mirjam Twigt
- Moje Państwo Foundation
- Mujeres Supervivientes
- Novact
- Open Knowledge Foundation Germany
- Organisation International Federation of ACAT (FIACAT)
- Panoptykon Foundation
- Partners Albania for Change and Development
- Platform for International Cooperation on Undocumented Migrants (PICUM)
- Politiscope
- Privacy First
- Privacy International
- Privacy Network
- Prof. Dr. Lorenz Boellinger, University of Bremen
- Prof. Jan Tobias Muehlberg (Universite Libre de Bruxelles)
- Promo-LEX Association
- Prostitution Information Center
- Refugee Legal Support
- REPONGAC Réseau des Plateformes Nationales d’ONG d’Afrique Centrale
- Ryan Lutz, University of Bristol
- Sea-Watch
- SOLIDAR & SOLIDAR Foundation
- Statewatch
- Stichting Landelijk Ongedocumenteerden Steunpunt
- SUDS – Associació Internacional de Solidaritat i Cooperació
- Superbloom (previously known as Simply Secure)
- SUPERRR Lab
- Symbiosis – Council of Europe School for Political Studies in Greece
- Taraaz
- Michael Ellison, University of Bristol
- Vicki Squire, University of Warwick
- Victoria Canning – University of Bristol
- Volonteurope
Picture credits: Trismegist on Shutterstock