On 27 March 2020, European Digital Rights (EDRi) and 12 of its member organisations sent an open letter to representatives of Member States in the Council of the EU. In the letter, we voice our deep concern over the proposed Regulation on terrorist content online and the serious threats we believe it poses to fundamental rights, including privacy and freedom of expression.
You can read the letter here (pdf) and below.
Brussels, 27 March 2020
Dear representatives of Member States in the Council of the EU,
We hope that you are keeping well in this difficult time.
We are writing to you to voice our serious concerns with the proposed Regulation on preventing the dissemination of terrorist content online (COM/2018/640 final). We have raised these concerns before, and many similar critiques have been expressed in letters opposing the Regulation from human rights officials, civil society groups, and human rights advocates.[i]
We firmly believe that any common position on this crucial file must respect fundamental rights and freedoms, the constitutional traditions of the Member States and existing Union law in this area. For this to happen, we urge you to ensure that the rule of law is respected in cross-border cases, to ensure that the competent authorities tasked with ordering the removal of illegal terrorist content are independent, to refrain from adopting mandatory (re)upload filters, and to guarantee that the exceptions for certain protected forms of expression, such as educational, journalistic and research materials, are maintained in the proposal. We explain why in more detail below.
First, we ask you to respect the principle of territoriality and ensure access to justice in cases of cross-border takedowns by ensuring that only the Member State in which the hosting service provider has its legal establishment can issue removal orders. The Regulation should also allow removal orders to be contested in the Member State of establishment, to ensure meaningful access to an effective remedy. As recent CJEU case law has established, “efficiency” or “national security” reasons cannot justify short-cuts around rule of law mechanisms and safeguards.[ii]
Secondly, the principle of due process demands that the legality of content be determined by a court or independent administrative authority. This important principle should be reflected in the definition of ‘competent authorities’. For instance, we note that in the Digital Rights Ireland case, the Court of Justice of the European Union found the Data Retention Directive invalid, inter alia, because access to personal data by law enforcement authorities was not made dependent on a prior review carried out by a court or independent administrative authority.[iii] In our view, the removal of alleged terrorist content entails a very significant interference with freedom of expression and, as such, calls for the application of the same safeguards.
Thirdly, the Regulation should not impose the use of upload or re-upload filters (automated content recognition technologies) on the services under the scope of the Regulation. As the coronavirus crisis makes abundantly clear, filters are far from accurate. In recent days alone, Twitter, Facebook and YouTube have moved to fully automated removal of content, leading to scores of legitimate articles about the coronavirus being removed.[iv] The same will happen if filters are applied to alleged terrorist content. There is also mounting evidence that algorithms are biased and have a discriminatory impact, which is a particular concern for communities affected by terrorism, whose counter-speech has proven vital against radicalisation and terrorist propaganda. Furthermore, any provision imposing specific measures on platforms should favour a model that gives service providers room for manoeuvre in deciding which actions to take to prevent the dissemination of illegal terrorist content, taking into account their capacities and resources, size and nature (whether not-for-profit, for-profit or community-led).
Finally, it is crucial that certain protected forms of expression, such as educational, artistic, journalistic and research materials, are exempted from the proposal, and that it includes feasible measures to ensure this exemption can be successfully implemented. The determination of whether content amounts to incitement to terrorism, or even glorification of terrorism, is highly context-specific. Research materials should be defined to include content that serves as evidence of human rights abuses. The jurisprudence of the European Court of Human Rights (ECtHR)[v] specifically requires particular caution with regard to such protected forms of speech and expression. It is vital that these principles are reflected in the Terrorist Content Regulation, including through the adoption of specific provisions protecting freedom of expression as outlined above.
We remain at your disposal for any support you may need from us in the future.
Sincerely,
Access Now – https://www.accessnow.org/
Bits of Freedom – https://www.bitsoffreedom.nl/
Centrum Cyfrowe – https://centrumcyfrowe.pl
CDT – https://cdt.org
Committee to Protect Journalists (CPJ) – https://cpj.org/
Daphne Keller – Director, Program on Platform Regulation, Stanford University
Digitale Gesellschaft – https://digitalegesellschaft.de/
Digitalcourage – https://digitalcourage.de/
D3 – Defesa dos Direitos Digitais – https://www.direitosdigitais.pt/
Državljan D – https://www.drzavljand.si/
EDRi – https://edri.org/
Electronic Frontier Foundation (EFF) – https://www.eff.org/
Epicenter.Works – https://epicenter.works
Free Knowledge Advocacy Group EU – https://wikimediafoundation.org/
Hermes Center – https://www.hermescenter.org/
Homo Digitalis – https://www.homodigitalis.gr/en/
IT-Political Association of Denmark – https://itpol.dk/
Panoptykon Foundation – https://en.panoptykon.org
Vrijschrift – https://www.vrijschrift.org
Wikimedia Spain – https://wikimedia.es
Footnotes
i.
- Letter to Member States calls for safeguards in Terrorist Content Regulation, 17 December 2019. Available at: https://edri.org/letter-to-member-states-calls-for-safeguards-in-terrorist-content-regulation/
- Open letter: Regulation on terrorist content online endangers freedom of expression, 18 March 2019. Available at: https://edri.org/open-letter-regulation-on-terrorist-content-online-endangers-freedom-of-expression
- David Kaye, Joseph Cannataci and Fionnuala Ní Aoláin, Mandates of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression; the Special Rapporteur on the right to privacy and the Special Rapporteur on the promotion and protection of human rights and fundamental freedoms while countering terrorism, 2018. Available at: https://spcommreports.ohchr.org/TMResultsBase/DownLoadPublicCommunicationFile?gId=24234
- Access Now et al., Letter to Members of European Parliament, 2019. Available at: https://cdt.org/files/2019/02/Civil-Society-Letter-to-European-Parliament-on-Terrorism-Database.pdf
- Article 19 et al., letter on European Commission regulation on online terrorist content, 2019. Available at: https://www.article19.org/resources/joint-letter-on-european-commission-regulation-on-online-terrorist-content/
- WITNESS et al., Letter to the Committee on Civil Liberties, Justice and Home Affairs, 2019. Available at: https://drive.google.com/file/d/1WTgl5hjJ_cAE1U0OjqaQ9AucU6HNlhoi/view
ii.
- EDRi, Data retention: “National security” is not a blank cheque, 29 January 2020. Available at: https://edri.org/data-retention-national-security-is-not-a-blank-cheque/
iii.
- See Digital Rights Ireland v. Minister for Communications, Marine and Natural Resources, Joined Cases C‑293/12 and C‑594/12, 08 April 2014 at para. 62.
iv.
- Ars Technica, Reputable sites swept up in FB’s latest coronavirus-minded spam cleanse, 3 March 2020. Available at: https://arstechnica.com/information-technology/2020/03/reputable-sites-swept-up-in-fbs-latest-coronavirus-minded-spam-cleanse/
- The Guardian, Facebook says spam filter mayhem not related to coronavirus, 18 March 2020. Available at: https://www.theguardian.com/technology/2020/mar/18/facebook-says-spam-filter-mayhem-not-related-to-coronavirus
v.
- In cases involving the dissemination of “incitement to violence” or terrorism by the press, the ECtHR’s starting point is that it is “incumbent [upon the press] to impart information and ideas on political issues just as on those in other areas of public interest. Not only does the press have the task of imparting such information and ideas: the public also has a right to receive them.” See Lingens v. Austria, App. No. 9815/82, 8 July 1986, para. 41.
- The ECtHR has also repeatedly held that the public enjoys the right to be informed of different perspectives, e.g. on the situation in South East Turkey, however unpalatable they might be to the authorities. See also Özgür Gündem v. Turkey, no. 23144/93, 16 March 2000, paras 60 and 63, and the Council of Europe handbook on protecting the right to freedom of expression under the European Convention on Human Rights, summarising the Court’s case law on the positive obligations of States with regard to the protection of journalists (pp. 90-93), available at: https://rm.coe.int/handbook-freedom-of-expression-eng/1680732814