Joint letter – the AI Act must protect the Rule of Law

Joint letter published on the European Civic Forum on 28/06/2023.

More than 60 civil society organisations have called on European lawmakers to ensure that the AI Act protects the Rule of Law, including transparency, accountability, and access to justice. The human rights coalition also pushes for the rejection of recent amendments to the AI Act that introduce a blanket national security exemption and dangerous loopholes in the classification of AI systems as high-risk.

The open letter was drafted and coordinated by the Civil Liberties Union for Europe (Liberties), the European Civic Forum (ECF), and the European Center for Not-for-Profit Law (ECNL) and was sent to EU legislators.

In the letter, the more than 60 signatories argue that as artificial intelligence is increasingly deployed by both the private and public sectors, protecting the rule of law requires the EU to adopt robust safeguards within the AI Act to defend the very foundation our Union stands on. The misuse of AI systems, including the opaque and unaccountable deployment of AI systems by public authorities, poses a serious threat to the rule of law, fundamental rights, and democracy.

Fundamental rights impact assessments are a must

Specifically, CSOs demand that fundamental rights impact assessments (FRIAs) be “an obligation for all deployers of high-risk AI technologies” to ensure that their use upholds the principles of justice, accountability, and fairness. They call for rule of law standards to be added to the impact assessments, with a structured framework to evaluate the potential impacts, biases, and unintended consequences of AI deployment. As states are responsible for the proper implementation of the rule of law framework, requiring public authorities, including law enforcement, to conduct FRIAs is not just a recommendation but a necessary safeguard to ensure that AI systems are designed and deployed in full accordance with the values of the EU and the EU Charter of Fundamental Rights.

No general exemption for national security or arbitrary loopholes for big tech

Signatories of the open letter also call on EU legislators to reject the Council’s proposed amendment to Article 2, which aims to exclude AI systems developed or used for national security purposes from the scope of the Act. Furthermore, the campaigners urge lawmakers to return to the Commission’s original proposal, thereby removing newly added loopholes that would give AI developers the power to unilaterally exempt themselves from the safeguards set out in the AI Act (Article 6(2)).
