AlgoRace is leading an action to participate in a public consultation to draft an organic law against racism

Illustration of an automation system for collecting biometric data. Source: Pixabay.

Around twenty organisations submit proposals to the Spanish Ministry of Equality to tackle discrimination relating to artificial intelligence.

Some twenty organisations led by AlgoRace have submitted a series of proposals to the Spanish Ministry of Equality against discrimination caused by artificial intelligence within the framework of a public consultation to draft an organic law against racism, racial discrimination and related forms of intolerance.

The organisation considers it “essential” to tackle the impact that algorithms have on people at a time when growing automation is beginning to be regulated in Spain and the European Union.

In a statement, these organisations request that AI be applied only when it does not infringe on the rights of oppressed social groups, and that it not be used to reproduce structural forms of violence.

They are proposing measures to ensure by law that these technologies are used transparently, so that they are “traceable, intelligible and checkable”, as is planned for the Spanish Agency for the Supervision of Algorithms, which is currently being developed, and that civil society is able to participate in the process.

Prohibitions for border control and the police

They are also calling for certain artificial intelligence systems to be prohibited by law, particularly the predictive and profiling systems used by the police that “strengthen structural discrimination of marginalised groups”. They argue that this practice “goes against people’s right to the presumption of innocence”.

Similarly, they demand that no biometric data be collected at borders, as this violates freedom of movement, the right not to be identified, and the right to privacy and intimacy, and breaches the principle of proportionality.

To prevent these violations, they are also calling for the repeal of the European Parliament’s regulation that allows facial recognition, and of an article in the European Commission’s draft Artificial Intelligence Act (AIA) that exempts European databases from this restriction. They also call for human rights defenders to be present at borders, along with independent audits and external checks.

They also consider the informed consent of the people affected to be insufficient, given the high levels of vulnerability these people face.
