Statement published by La Quadrature du Net on 03/05/2022 (unofficial translation in English by the European Civic Forum)
On 26 April 2022, the Council of State rejected our criticisms of the massive use of facial recognition by the police in the TAJ (“Traitement des Antécédents Judiciaires”, the criminal records processing file). This is a stinging defeat, which further confirms the Council of State in its role as a defender of mass surveillance, without any consideration for the respect of people’s rights. We are used to losing, but not to resigning ourselves: let us find in this defeat the future paths of our fight.
Massive and illegal surveillance
Two years ago, we challenged the 2012 decree that created the TAJ file by merging the STIC, a police file, and the JUDEX, a gendarmerie file, for judicial and administrative investigations. It contains information on suspects (whether or not they have been convicted), witnesses and victims involved in investigations. The TAJ is now sprawling: 19 million records are present in this mega-file (2018 figures, which we fear may have only increased since then).
Above all, and this was the subject of our appeal to the Council of State, the TAJ decree authorises the police to use facial recognition software to consult its database. Police officers can automatically compare an image captured by a surveillance camera, a telephone or on the Internet with the 8 million photographs in the files of suspects (2018 figures). This comparison takes place in the context of investigations as well as simple identity checks, as explained by the Minister of the Interior in 2021.
Introduced into the law discreetly almost 10 years ago, at a time when facial recognition tools were still in gestation, the use of this technology is now widespread. The police used the TAJ to perform facial recognition 375,000 times in 2019, meaning more than 1,000 queries per day throughout France (we talked about this in our summary article on the state of facial recognition in France, here). By 2020, this figure had risen to 1,200 daily TAJ queries using facial recognition.
However, the massive use of this technology is prohibited under the rules of personal data law. Only certain exceptional situations could authorise such processing and, even in these exceptional situations, the police could only use it in cases of “absolute necessity” – when there is absolutely no other way to pursue the investigation. We explained to the Council of State that none of these criteria were ever met in practice. There is no justification for such intrusive and dangerous means.
The Council of State’s headlong rush
And yet, the Council of State rejected our arguments. It did not deny the innumerable abuses that we pointed out, but invited us to submit them on a case-by-case basis to the authorities (judges and the CNIL) responsible for verifying their legality, rather than to it. As if the Council of State could be satisfied with examining the legality of the TAJ in the abstract, without worrying about its practical implementation. Yet, in practice, the Council of State knows very well that the abuses of the TAJ are so numerous that the CNIL will never have the means to detect and stop them one by one. It is materially impossible for the CNIL to monitor 1,000 police operations per day after the fact. Presenting the control of the CNIL and the judges as a sufficient guarantee to remedy these abuses is a dishonest way of allowing these practices to continue. It is the nature of mass surveillance to escape any credible supervision, and it is this obvious fact that the Council of State has denied.
Although the Council of State refused to take into account in its decision the concrete abuses of the TAJ, it nevertheless sought to justify the “absolute necessity” of facial recognition. Its demonstration is so terrible that we reproduce it as it stands: “in view of the number of suspects registered in [the TAJ], which amounts to several million, it is materially impossible for the competent officers to carry out such a comparison manually” of images, the automation of which can therefore only “prove absolutely necessary for the search for the perpetrators of offences and the prevention of breaches of public order”. In other words, the use of automated image analysis software would be necessary because the TAJ, which has been left to the police for 10 years and without any external control, has become so sprawling and absurd that it can no longer be used to its full potential by humans. One mass surveillance (generalised data filing) makes another mass surveillance (generalised facial recognition) necessary.
Such circular reasoning allows the Council of State to detach itself from any consideration of the respect of fundamental freedoms. At no time does it seize the opportunity to seriously evaluate the only known use of facial recognition in France, despite the fact that it has been denounced for several years throughout Europe for the serious dangers it poses to our freedoms. On the contrary, the Council of State is stepping out of its role to analyse the file only from the point of view of its potential usefulness for the police and not to correct the damage caused over the past 10 years. By abandoning its role as guardian of freedoms, the Council of State validates and sets in stone the belief that we must always know more about the population, which is considered suspect by default.
The next step in our struggle
Let us not be discouraged and, in preparation for the next stage of our fight, let us look for lessons to be learned from this defeat. Firstly, it seems risky to attack facial recognition as a theoretical principle without also attacking its concrete implementation, otherwise our opponents risk ducking the debate as the Council of State has done here.
Secondly, it seems risky to attack facial recognition without at the same time attacking the whole system of which it is a part and which justifies it: the generalised filing of data, whose excessiveness was used as a pretext by the Council of State, and the video surveillance inundating our cities, whose equally scandalous excess likewise serves as a pretext for deploying automated detection software on the cameras already installed (see our political analysis of VSA, algorithmic video surveillance).
Our fight will therefore continue, refined and adjusted by these two lessons. This fight is all the more urgent as the European Union is in the process of adopting a regulation on AI that would legitimise biometric surveillance technologies currently prohibited by the GDPR (see our analysis) and as France, currently president of the EU Council, is doing everything to defend its techno-police industry and ideology.
On 25 May, we will mark 4 years since the GDPR and the European rules on personal data protection came into force. While these rules have been almost useless in protecting us from GAMAM surveillance, they have entirely failed to protect us from state surveillance. Perhaps we should use this anniversary to try to turn things around.