25 Jan 2019

Terrorist Content: LIBE Rapporteur’s Draft Report lacks ambition

By Yannic Blaschke

On 23 January 2019, the Rapporteur for the European Parliament Committee on Civil Liberties (LIBE), Daniel Dalton (ECR), published his Draft Report on the proposal for a Regulation on preventing the dissemination of terrorist content online. This Report by the lead Committee on the dossier follows the publication of the Draft Opinions by the two other European Parliament Committees involved in the debate: the Committee on Internal Market and Consumer Protection (IMCO) and the Committee on Culture and Education (CULT).

Overall, LIBE’s Draft Report addresses only some of the many pressing issues of the Regulation which present serious risks for fundamental rights. The Report therefore falls short of the ambitions to which a Committee dealing with civil liberties should aspire. This is all the more disappointing after the comprehensive stance taken in the IMCO Draft Opinion, which includes more than twice as many amendments as the LIBE Draft Report.

LIBE’s Draft Report contains, in summary, the following positive points:
– it limits the scope of the Regulation to services that are available to the public (excluding, for example, file lockers from the scope)
– it addresses the need for reporting obligations from competent authorities

However, the Draft Report:
– does not tackle the manifest flaws of the referrals of content from governments to companies for “voluntary consideration”, which would turn Big Tech companies into the Internet Police
– does not drastically modify or delete the problematic “proactive measures”, which can only lead to upload filters and other very strict content moderation measures, even though the Report reminds the legislator of the existing prohibition of general monitoring obligations in EU law
– does not address the problems caused by the lack of alignment of the definition of terrorist content with the Terrorism Directive

On a positive note, the scope of the Terrorist Content Regulation is more narrowly defined in the LIBE Draft Report, being limited now to services which are available to the public. On reporting obligations, it is a welcome addition that the Report foresees an evaluation of the Regulation’s impact on the freedom of expression and information in the Union at the latest three years after the implementation of the legislation. Regarding the possibility for national authorities to impose proactive measures on online companies, the Draft Report furthermore contains some mitigating clauses, such as a consideration of a platform’s “non-incidental” exposure to terrorist content, or the reminder of the prohibition in EU law of general monitoring obligations for hosting providers. Finally, the Draft Report proposes some adjustments regarding remedies and safeguards. It sets a two-week deadline for answering complaints by citizens whose content was removed or to which access was denied. The Draft Report also insists that the private complaint mechanisms of internet platforms do not preclude citizens from seeking legal redress before Member States’ courts.

However, Dalton MEP has disappointingly chosen not to address the referrals of content to platforms for their “voluntary consideration”. These referrals could give national authorities an “escape route” from their human rights obligations, by merely suggesting the blocking of content which a given government might find unpleasant, but which is not illegal and thus could not be subject to a removal order. Furthermore, the Rapporteur did not tackle the urgent need to reform the definition of “terrorist content”, which three United Nations (UN) Special Rapporteurs had previously flagged as a key concern. The vagueness of the definition in the Commission proposal thus persists, and could threaten the work of journalists and NGOs documenting terrorist crimes. Finally, the “proactive measures” have not received the attention and intensive modification they need, and could still lead to de facto general monitoring obligations.

To summarise, the LIBE Draft Report lacks the ambition that would be expected from the Civil Liberties Committee and falls short of the much more comprehensive reworks delivered by the IMCO and CULT Committees. All involved Members of the European Parliament should cooperate to significantly improve the negligent and rushed Commission proposal, in particular with regard to the highly dangerous measures of referrals and proactive measures. Serious problems require serious legislation.

Terrorist Content Regulation: Document pool
https://edri.org/terrorist-content-regulation-document-pool/

CULT: Fundamental rights missing in the Terrorist Content Regulation (21.01.2019)
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

Joint Press Release: EU Terrorism Regulation – an EU election tactic (12.9.2018)
https://edri.org/press-release-eu-terrorism-regulation-an-eu-election-tactic/

(Contribution by Yannic Blaschke and Diego Naranjo)

23 Jan 2019

EDRi’s Kirsten Fiedler wins Privacy Award

By EDRi

On 22 January, Kirsten Fiedler, current Senior Policy and Campaigns Manager and former Managing Director of European Digital Rights, received the distinguished Felipe Rodriguez Award in celebration of her remarkable contribution to our right to privacy in the digital age.

Why should we defend digital rights and freedoms when there are really pressing and often life-threatening issues out there to fight for? The reason is that the internet and digital communications are seeping into every part of our lives, so our rights online are the basis for everything else we do.

said Fiedler.

I’d like to accept this award on behalf of the entire EDRi team and network. Our strength is in collective, collaborative actions.

Fiedler’s relentless efforts have been crucial in transforming the EDRi Brussels office from a one-person entity into the current professional organisation with eight staff members, and she played an instrumental role in EDRi’s campaigns against ACTA and against privatised law enforcement.

The Felipe Rodriguez Award is part of the Dutch Big Brother Awards, organised by the EDRi member Bits of Freedom. Previous winners include Kashmir Hill, Open Whisper Systems, Max Schrems, and Edward Snowden. The award ceremony took place on 22 January 2019 in Amsterdam.

Photo: Jason Krüger

Bits of Freedom announces winner of privacy award (09.01.2019)
https://edri.org/bits-of-freedom-announces-winner-of-privacy-award/

21 Jan 2019

Copyright negotiations begin to derail

By EDRi

The negotiations on the EU’s highly controversial Copyright Directive proposal continue. The latest trilogue meeting between the Commission, the Council and the Parliament was originally scheduled for today, 21 January 2019. It was, however, called off late on Friday evening, 18 January, by the Romanian Presidency of the EU Council.

It has become increasingly clear that the manifest problems with the text make it hard to find an acceptable compromise on the future of platforms’ and search engines’ liability regimes. A blocking minority formed by Germany, Poland, Belgium, Italy, Sweden, Finland, Slovenia, Hungary and the Netherlands did not approve the Presidency’s revised Council mandate.

This makes it less likely that the EU institutions will find a common position on the deeply flawed Article 13 of the proposal, which would either directly or indirectly require online companies to implement highly error-prone upload filters to search user uploads for copyrighted material. The divisions in the Council are yet another sign of the high degree of polarisation and the increasing lack of support for the proposal, also highlighted by the fact that even the creative industries called for a halt to negotiations on Article 13 in a joint letter. More than 70 internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers, and a plethora of academics have been highly critical of the proposal from the start.

The suspension of trilogue negotiations does not, however, mean that the fight against upload filters and for the freedom of expression has been decided: in fact, it is now more crucial than ever to get in touch with your local Members of the European Parliament (MEPs) and national ministries, and ask them to oppose Article 13.

EDRi continues to follow the negotiations closely and calls on all citizens and civil society to act and defend their digital rights through the #SaveYourInternet campaign.

Copyright: Compulsory filtering instead of obligatory filtering – a compromise? (04.09.2018)
https://edri.org/copyright-compulsory-filtering-instead-of-obligatory-filtering-a-compromise/

How the EU copyright proposal will hurt the web and Wikipedia (02.07.2018)
https://edri.org/how-the-eu-copyright-proposal-will-hurt-the-web-and-wikipedia/

EU Censorship Machine: Legislation as propaganda? (11.06.2018)
https://edri.org/eu-censorship-machine-legislation-as-propaganda/

21 Jan 2019

CULT: Fundamental rights missing in the Terrorist Content Regulation

By Diego Naranjo

The European Parliament (EP) Committee on Culture and Education (CULT) published on 16 January its Draft Opinion on the proposal for a Regulation on preventing the dissemination of terrorist content online. Member of the European Parliament (MEP) Julie Ward, the Rapporteur for the Opinion, has joined the Rapporteur for the IMCO Committee, Julia Reda MEP, and civil rights groups in criticising many aspects of the Commission’s original proposal. The Rapporteur expresses her concerns regarding threats for “fundamental rights, such as freedom of expression and access to information, as well as media pluralism.”

In the Draft Opinion, CULT proposes a number of changes:

  • Definition of terrorist content: The Opinion suggests aligning the definition of terrorist content with the Terrorism Directive 2017/541/EU and carving out educational, journalistic or research material.
  • Definition of hosting service providers: The CULT Committee acknowledges that the definition of these services is “too broad and legally unclear”, and that many services which are not the target of this Regulation would be unnecessarily covered. The Rapporteur suggests covering only those hosting service providers that make the content available to the general public.
  • Removal orders: According to the Opinion, the only authorities competent to issue removal orders should be judicial authorities, since they are the ones with the “sufficient expertise”. Furthermore, the “one hour” time frame to respond to the removal orders is replaced by “without undue delay”. This would allow for more flexibility for smaller service providers.
  • Pro-active measures: The obligation of pro-activity (in practice, to implement upload filters in hosting services) is deleted from the proposal.
  • Finally, the Rapporteur suggests removing the financial penalties in order to avoid smaller providers being overburdened, as well as to prevent the likely scenario “where companies may overly block and remove content in order to protect themselves against possible financial penalties.”

On a general level, this constitutes a very welcome improvement on the dangerous pitfalls of the Commission’s original proposal. Of particular relevance are the Rapporteur’s assessment that an imposition of proactive measures would amount to a breach of Article 15 of the e-Commerce Directive (which contains the prohibition of general monitoring obligations), as well as the proposed deletion of pro-active measures (upload filters). However, it is unclear how the Rapporteur’s addition in Article 3(2), saying that hosting service providers “shall not store terrorist content”, could be put in place without upload filters, even if, as a safeguard, the Rapporteur asks for those measures to be “appropriate”.

Another shortcoming of the Draft Opinion is the lack of concern about the highly unaccountable instrument of providing referral capacities to national authorities. For some reason, the Rapporteur has decided not to address this trojan horse, which would directly implement privatised law enforcement in the European Union. Referrals from national authorities, even though intended to be just for “voluntary consideration” by private companies, are likely to become the way in which intrusive governments outsource the protection of freedom of expression to unaccountable private companies, which are outside the scope of the Charter of Fundamental Rights.

Even though the Rapporteur has not addressed all of the key issues, there are many positive suggestions in the Draft Opinion. Some of them are in line with the IMCO Committee Draft Opinion, which provided an even more comprehensive proposal for improvement. Given the criticism from both Committees, three UN Special Rapporteurs and a large number of civil society groups, the lead committee, the Civil Liberties (LIBE) Committee, is expected to take all of this criticism on board and comprehensively amend the Regulation.

Draft Opinion of the Committee on Culture and Education on the proposal for a regulation on preventing the dissemination of terrorist content online (16.01.2019)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-632.087&format=PDF&language=EN&secondRef=01

Terrorist Content Regulation: document pool
https://edri.org/terrorist-content-regulation-document-pool

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

Terrorist Content Regulation: Warnings from the UN and the CoE (19.12.2018)
https://edri.org/terrorist-content-regulation-warnings-from-the-un-and-the-coe/

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship (11.12.2018)
https://edri.org/the-eu-councils-general-approach-on-terrorist-content-online-proposal-a-step-towards-pre-emptive-censorship/

Terrorist Content Regulation: Civil rights groups raise major concerns (05.12.2018)
https://edri.org/terrorist-content-regulation-civil-rights-groups-raise-major-concerns/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

(Contribution by Diego Naranjo, EDRi)

21 Jan 2019

Terrorist Content Regulation: Document Pool

By EDRi

Terrorist networks have increasingly turned to the internet to spread their propaganda and recruit followers in recent years. Although the general public’s fear of terrorist attacks certainly puts considerable pressure on policy makers, politicians also strategically use this climate of diffuse anxieties to increase the securitisation of the internet and to present themselves as capable, tough leaders. The latest example of such election-motivated policy making is the proposal for a Regulation on preventing the dissemination of terrorist content online, with which the European Commission continues its trend of producing a steady stream of “solutions” to terrorist propaganda on the internet.

The proposal contains three main measures to address alleged “terrorist” content:

  1. It creates orders, issued by (undefined) national authorities, to remove or disable access to illegal terrorist content within one hour.
  2. It allows competent authorities to refer terrorist-related potential breaches of companies’ terms of service for the voluntary consideration of the companies themselves.
  3. It legislates on (undefined) proactive measures that can lead to an authority requesting a general monitoring obligation.

A major concern for the functioning and freedom of the internet is the extension to terrorist content of the upload filter regime that the EU is currently about to introduce for copyright. Requiring internet companies to monitor everything we say on the web not only has grave implications for the freedom of speech, but also follows a dangerous path of outsourcing and privatising law enforcement.
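
To make the concern concrete, here is a minimal, purely illustrative sketch (ours, not taken from the proposal) of the fingerprint-matching approach on which upload filters are typically built. The blocklist and file contents are hypothetical; the point is that such a filter can only detect that an upload matches flagged material, never the context in which it is shared – and it has to inspect every single upload to do so:

    import hashlib

    # Hypothetical blocklist of fingerprints of previously flagged material.
    # Real deployments use perceptual hashes for images and video, but the
    # structural limitation is the same: a match says nothing about context
    # or intent.
    BLOCKLIST = {hashlib.sha256(b"flagged propaganda clip").hexdigest()}

    def allow_upload(payload: bytes) -> bool:
        """Return True if the upload may be published, False if blocked."""
        return hashlib.sha256(payload).hexdigest() not in BLOCKLIST

    # A journalist re-posting the flagged clip to document a war crime
    # produces the same fingerprint as the propaganda itself:
    print(allow_upload(b"flagged propaganda clip"))  # False - blocked
    print(allow_upload(b"unrelated holiday video"))  # True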

EDRi will follow the developments of the Terrorist Content Regulation closely and critically in the coming months and provide crucial input to policy makers to ensure that human rights are fully respected in the proposal.

Other:

  • Press Release: Commission announces the new Terrorist Content Regulation (12.09.2018)

Key Dates*:

*(note that these dates are TBC and subject to change):

      • LIBE Committee (Lead Committee)
        • Deadline for amendments: 15 February
        • Consideration of amendments: 7 March
        • Shadow meetings: 5-6 March or 11-12 March
        • Vote in LIBE Committee on the Report: 21 March
        • Vote in Plenary (1st reading): 25-28 March or 15-18 April
      • CULT Committee
        • Consideration of Draft Opinion: 4 February
        • Deadline for amendments: 6 February
        • Vote on the Opinion: 18 February or 4 March
      • IMCO Committee
        • Deadline for amendments: 23 January
        • Vote on the Opinion: 18 February or 4 March

       

18 Jan 2019

Terrorist Content: IMCO draft Opinion sets the stage right for EP

By Yannic Blaschke

On 16 January 2019, the European Parliament Committee on Internal Market and Consumer Protection (IMCO) published its draft Opinion on the proposal for a Regulation on preventing the dissemination of terrorist content online. The Opinion challenges many of the issues in the original Commission proposal, and should “inform” the main Report prepared by the Civil Liberties Committee (LIBE).

IMCO’s draft Opinion addresses many of the proposal’s serious risks of a detrimental impact on the freedom of expression. In a nutshell, it:

  • deletes referrals and “proactive” measures
  • points out the need to refer to “illegal” terrorist content
  • re-defines the services covered and excludes some
  • clarifies that the competent authorities deciding on the measures implemented by the Regulation need to be judicial authorities
  • implements new wording on transparency and more reporting obligations for Law Enforcement Agencies

The original Commission proposal had previously been criticised by three United Nations Special Rapporteurs and a great number of civil society and human rights organisations.

The draft Opinion states that “terrorist content” must relate to “offences committed intentionally” and be “illegal”. While this seems obvious at first, such wording is crucial to exclude from the scope of the Regulation works that merely document or report on terrorist crimes, such as publications by journalists or human rights defenders. The Opinion further clarifies that only publicly available information should be covered by the legislation, and that electronic communication services, blogs and data stored in cloud systems must be excluded.

As regards the new competencies the legislation is supposed to give to national authorities, the draft Opinion makes clear that only judicial authorities should be able to issue removal orders. This is a significant improvement to ensure due process, and much preferable to the vague reference to “competent authorities” in the original Commission text.

The Rapporteur, Julia Reda MEP, has also taken a strong stance on the highly sensitive measures of referrals and proactive measures. Referrals are the practice of forwarding a piece of content (which may or may not be illegal) to a hosting service provider for its “voluntary consideration”; proactive measures are obligations for companies to have measures in place to find and disable access to “terrorist content”. Both of these instruments have deeply problematic implications: there is, for instance, a substantial lack of accountability for public authorities when content referred by them is unlawfully deleted; in addition, the possibility to impose proactive measures (upload filters) on companies would amount to a general monitoring obligation, something prohibited in EU law. In the IMCO draft Opinion, both the referrals and the “proactive measures” are deleted from the text.

Finally, the draft Opinion highlights the need for extensive documentation: the IMCO Rapporteur proposes to collect information on the number of removals that led to the successful detection, investigation and prosecution of terrorist offences. Currently, the Commission states that it has no information about the number of investigations that were initiated following referrals made by Europol under its mandate. It is therefore reasonable that, when introducing similar capacities for national law enforcement, the effectiveness and proportionality of measures against “terrorist content” in supporting the investigation of terrorist acts be critically evaluated.

The IMCO Opinion, as proposed by the Rapporteur, brings many positive changes that should be taken into consideration by the LIBE Committee in its Report, for which Daniel Dalton MEP (ECR) is the Rapporteur. The Parliament is well advised to take the proposals in this draft Opinion into consideration because of their improvements on aspects such as rule of law principles, predictability, legality and fundamental rights safeguards.

Draft Opinion of the Committee on the Internal Market and Consumer Protection on the proposal for a regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online (COM(2018)0640 – C8-0405/2018 – 2018/0331(COD)) (13.12.2018)
http://www.europarl.europa.eu/sides/getDoc.do?type=COMPARL&reference=PE-632.028&format=PDF&language=EN&secondRef=01

Terrorist Content Regulation: Warnings from the UN and the CoE (19.12.2018)
https://edri.org/terrorist-content-regulation-warnings-from-the-un-and-the-coe/

The EU Council’s general approach on Terrorist Content Online proposal: A step towards pre-emptive censorship (11.12.2018)
https://edri.org/the-eu-councils-general-approach-on-terrorist-content-online-proposal-a-step-towards-pre-emptive-censorship/

Terrorist Content Regulation: Civil rights groups raise major concerns (05.12.2018)
https://edri.org/terrorist-content-regulation-civil-rights-groups-raise-major-concerns/

Terrorist content regulation – prior authorisation for all uploads? (21.11.2018)
https://edri.org/terrorist-content-regulation-prior-authorisation-for-all-uploads/

EU’s flawed arguments on terrorist content give big tech more power (24.10.2018)
https://edri.org/eus-flawed-arguments-on-terrorist-content-give-big-tech-more-power/

16 Jan 2019

Digital rights as a security objective: Abuses and loss of trust

By Yannic Blaschke

Violations of human rights online can pose a real threat to our societies, from election security to societal polarisation. In this series of blogposts, we explain how and why digital rights must be treated as a security objective. In this third and final blogpost, we discuss how digital rights violations can exacerbate breaches of the rule of law in EU Member States and risk undermining the already fragile project of the EU, including its security aspects.


In our previous blogpost, we outlined how an unjustified reliance on algorithms can lead to unintentional censorship and new attack vectors for malicious actors. However, the upload filters that feature in the ongoing discussions for the Copyright Directive and the Terrorist Content Regulation also have a big potential for abuses by public authorities.

There is no shortage of examples of state authorities misusing copyright to attack the freedom of expression and the right to information: chilling cases include Ecuador, where critics of President Correa were flooded with copyright notices, and the recent attempt of the German government to curb quotes from an internal military report by claiming it as copyrighted material. In the context of counter-terrorism legislation, the situation looks even more severe, with the Commissioner for Human Rights of the Council of Europe recently decrying that “the misuse of anti-terrorism legislation has become one of the most widespread threats to freedom of expression, including media freedom, in Europe”. Laws that have given rise to this alarming assessment include, for instance, the Spanish “gag law”, which has been severely criticised by international human rights organisations, and the French counter-terrorism laws. The dangerous logic of prosecuting offences vaguely framed as “glorification of terrorism” has led to numerous convictions of citizens on arbitrary grounds or because of controversial, yet undoubtedly not terrorist, opinions and ideas.

What will happen once such disputes about legitimate forms of expression do not go in front of courts any longer, but filter technologies prevent them from ever appearing in public debate in the first place? What if EU governments start to abuse the competences given to them to censor political journalists, human rights defenders, opponents or ideas they do not like, for instance by calling opposition parties or activists terrorists? What would such abuse mean for the Member States’ eroding trust into each other’s capacity to uphold the rule of law?

In the context of terrorism, the European Commission has proposed that law enforcement should have the competence to demand that platforms introduce automated filtering technologies if they regard the companies’ own “proactive” measures of content moderation as not extensive enough. Furthermore, authorities would be able to refer specific pieces of content to internet companies for their “voluntary consideration”, with a high chance of such content being taken down due to the hosting providers’ fear of being held liable for content stored on their servers. Such vague and imprecise measures do not only undermine the rule of law and freedom of expression – they are also bound to be misused.

If national authorities with an established record of public interference with citizens’ digital rights start using the additional suppressive instruments provided by the EU in the same disproportionate way they use their national measures, it will not take long until the courts in other Member States begin to question the extent to which the authorities in their jurisdiction can still cooperate with their abusive counterparts. This has already happened in other contexts: see, for instance, the CJEU’s decision that extraditions to Poland may be halted. In combination with very different interpretations of what constitutes an offence against the public (for instance, the case of the Spanish rapper Valtonyc), cases in which the newly created tools are deployed to censor voices that are seen as illegal in one Member State but as perfectly legal in others can and will further erode the cohesion and integrity of the common area of freedom, security and justice. EU-wide public security can only be achieved through trust among European judiciaries and law enforcement that, throughout cooperative cross-border actions, fundamental rights are respected in all Member States. Giving all EU Member State authorities new censorship powers will achieve nothing but the contrary of such trust – and will thus damage, not improve, our security.

Despite some major steps forward for digital freedoms, such as the adoption of the General Data Protection Regulation (GDPR), we are still far from recognising that digital rights are not just fundamental civil liberties, but also a prerequisite for the security and pluralism of our societies. If we want disinformation to stop ravaging public debate, we should not allow individuals to be forced to accept tracking cookies simply because this gives publishers and the tracking industry some income. If we want to close the vulnerabilities of our public debate forums on the internet, we cannot impose new gateways for disinformation attacks on online platforms. If we want to prevent the new authoritarianism, we cannot give it more tools of censorship through copyright and the silent erosion of civil liberties in counter-terrorism pursuits.

EU citizens’ digital rights are first and foremost, but not only, to the benefit of individuals: they must also be regarded as fundamental to the security of our democratic systems and societal cohesion, both within and across European Union countries. To keep our societies open, free and safe, we must place the rights of the individual at the heart of our internet policies.

Digital rights as a security objective: New gateways for attacks (19.12.2018)
https://edri.org/digital-rights-as-a-security-objective-new-gateways-for-attacks/

Digital rights as a security objective: Fighting disinformation (05.12.2018)
https://edri.org/digital-rights-as-a-security-objective-fighting-disinformation/

(Contribution by Yannic Blaschke, EDRi intern)

16 Jan 2019

Advocate General issues two Opinions on “right to be forgotten”

By Yannic Blaschke

On 10 January 2019, the Advocate General (AG) Maciej Szpunar delivered two Opinions to the Court of Justice of the European Union (CJEU) that could have far-reaching implications for the “right to be forgotten”, which aims at enabling individuals to lead an autonomous life without stigmatisation from their past actions.


A geographical limit to the “right to be forgotten”

In his first Opinion, in the case Google v CNIL (C-507/17), AG Szpunar recommends that the CJEU limit the scope of application of search-engine de-referencing obligations to the territory of the EU. The case at hand was referred to the CJEU after a dispute between the search engine operator Google and the French Data Protection Authority CNIL. The CNIL had imposed a 100 000 euro fine on Google after the company refused to remove web pages relating to a natural person from all domains listed in its search engine (rather than just EU Member State domains).

In his Opinion, AG Szpunar held that the “right to be forgotten” must be balanced against other fundamental rights, such as the right to data protection and the right to privacy, as well as the legitimate public interest in accessing the information sought. The AG noted that, if worldwide de-referencing were permitted, the EU authorities would not be able to define and determine a right to receive information, especially since the public interest in accessing information will necessarily vary from one third State to another, depending on its geographic location. There would thus be a risk that persons in third States would be prevented from accessing information and, in turn, that third States would prevent persons in the EU Member States from accessing information. The AG did not, however, rule out that there may be cases in which worldwide de-referencing is justified. He recommended that the CJEU rule that, upon receiving a request for de-referencing, search engine providers should not be obliged to implement such measures on all of their listed domains. Nevertheless, they should be obliged to implement all possible measures, including geo-blocking, to enforce effective de-referencing for all IP addresses located in the EU, regardless of the domain used.
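
In engineering terms, the AG’s recommendation would look roughly like the sketch below. All names here are hypothetical, and a production system would replace the geolocation stub with a real IP-to-country lookup; the key point is that the suppression decision hinges on the requester’s location, not on the domain that served the query.

    # EU-style country list for de-referencing; the UK is included
    # as this predates Brexit.
    EU_COUNTRIES = {
        "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR", "DE",
        "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL", "PL", "PT",
        "RO", "SK", "SI", "ES", "SE", "GB",
    }

    # URLs for which a de-referencing request has been granted (hypothetical).
    DEREFERENCED = {"https://example.org/old-news-item"}

    def lookup_country(ip_address: str) -> str:
        """Stub standing in for a real IP-geolocation lookup."""
        return "FR" if ip_address.startswith("192.0.2.") else "US"

    def visible_results(results: list, client_ip: str) -> list:
        """Suppress de-referenced links for clients located in the EU,
        regardless of which domain (.fr, .com, ...) served the search."""
        if lookup_country(client_ip) in EU_COUNTRIES:
            return [url for url in results if url not in DEREFERENCED]
        return results  # outside the EU, the link remains listed

    # A query from a French IP no longer shows the link; from the US it does:
    results = ["https://example.org/old-news-item", "https://example.org/other"]
    print(visible_results(results, "192.0.2.10"))   # link suppressed
    print(visible_results(results, "203.0.113.5"))  # link still listed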

Search engine operator’s processing of sensitive data

The second Opinion of the AG, in the case G.C. and Others v CNIL (C-136/17), referred to the de-referencing obligations of search engine providers with regard to sensitive categories of data. Following a dispute between the French Data Protection Authority CNIL and the search engine operator Google, Szpunar argued that the prohibitions and restrictions regarding special categories of data (under the previous Data Protection Directive 95/46/EC) cannot apply to the operator of a search engine as if it had itself placed the sensitive data on the web pages concerned. Since the activity of a search engine logically takes place only after (sensitive) data have been placed online, those prohibitions and restrictions can, in his opinion, apply to a search engine only by reason of that referencing and, thus, through subsequent verification, when a request for de-referencing is made by the person concerned. Szpunar held, however, that where referencing of sources that store sensitive data occurs, search engine providers have an obligation to react to de-referencing requests after carefully balancing the right to respect for private life and the right to protection of data with the right of the public to access the information concerned and the right to freedom of expression of the person who provided the information.

Opinions of the Advocates General are not legally binding, but they often considerably influence the final verdict of the CJEU. The judgments in both preliminary rulings will be given at a later stage.

Advocate General Szpunar proposes that the Court should limit the scope of the dereferencing that search engine operators are required to carry out to the EU (10.01.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-01/cp190002en.pdf

Advocate General Szpunar proposes that the Court should hold that the operator of a search engine must, as a matter of course, accede to a request for the dereferencing of sensitive data (10.01.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-01/cp190001en.pdf

Google’s forgetful approach to the “right to be forgotten” (14.12.2016)
https://edri.org/googles-forgetful-approach-right-forgotten/

More “right to be forgotten” confusion (15.09.2015)
https://edri.org/more-right-to-be-forgotten-confusion/

Google now supports AND opposes the “right to be forgotten” (27.08.2014)
https://edri.org/google-now-supports-and-opposes-right-forgotten/

Google and the right to be forgotten – the truth is out there (02.07.2014)
https://edri.org/google-right-forgotten-truth/

Google’s right to be forgotten – industrial scale misinformation? (09.06.2014)
https://edri.org/forgotten/

(Contribution by Yannic Blaschke, EDRi intern)

16 Jan 2019

We can no longer talk about sex on Facebook in Europe

By Bits of Freedom

Sometime in late 2018, Facebook quietly added “Sexual Solicitation” to its list of “Objectionable Content”, without notifying its users. This is quite remarkable, to put it mildly, as for many people sex is far from being a negligible part of life.

The company writes that it draws a line “when content facilitates, encourages or coordinates sexual contact between adults”. A selection of what isn’t allowed (translated from the Dutch-language Community Standards):

“Content that includes an implicit invitation for sexual intercourse, which can be described as naming a sexual act and other suggestive elements including (but not limited to):
– vague suggestive statements such as: ‘looking forward to an enjoyable evening’
– sexual use of language […]
– content (self-made, digital or existing) that possibly portrays explicit sexual acts or a suggestively positioned person/suggestively positioned persons.

Content in which other acts committed by adults are requested or offered, such as:
– commercial pornography
– partners that share fetishes or sexual interests”

It is unclear what caused this change. The most obvious explanation is new legislation that entered into force at the beginning of last year in the United States. The “Fight Online Sex Trafficking Act” and the “Stop Enabling Sex Traffickers Act” (FOSTA/SESTA) hold companies accountable for sex work ads on their platforms. Craigslist, among others, took its “Personals” section offline, and Reddit blocked a couple of sex work-related subreddits. Facebook’s new policy can likewise be seen as a response to this legislation. The broad formulation of the criteria for what isn’t allowed is a precaution: Facebook chooses to err on the side of caution and over-censor, rather than risk the consequences of hosting illegal content.

Facebook boasts about connecting people, but in reality, the company increasingly frustrates our communication. There’s no question that such vaguely formulated rules combined with automated content filters will lead to more arbitrary censoring. But what this incident illustrates, more than anything, is that Facebook is thwarted by the scale at which it operates, and chooses to offload the cost of scale, namely arbitrary censorship and diminished freedom of expression, onto European users. It’s inconceivable that new legislation passed in the US means that in many European countries, if not all, one consenting adult can no longer ask another consenting adult if they want to have sex. Or, for that matter, get in touch with other people over shared fetishes or fantasies, or exchange information about safe sex.
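
To see how vague criteria translate into arbitrary automated enforcement, consider this deliberately naive sketch (our illustration, not Facebook’s actual system) of a phrase-based rule built from the wording quoted above:

    # Hypothetical moderation rule modelled on the quoted criteria.
    SUGGESTIVE_PHRASES = ("looking forward to an enjoyable evening",)

    def flags_post(post: str) -> bool:
        """Flag a post if it contains any listed 'suggestive statement'."""
        text = post.lower()
        return any(phrase in text for phrase in SUGGESTIVE_PHRASES)

    # A harmless message about a concert trips the same rule as a
    # solicitation would - the filter cannot tell them apart:
    print(flags_post("Looking forward to an enjoyable evening at the concert!"))  # True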


This impacts all European citizens, and is particularly problematic in the case of people who don’t identify with the traditional, heteronormative perspective of sex and turn to the internet for alternatives. In addition, sex workers are affected disproportionately. Sex workers often use online platforms for contacting clients and in order to exchange tips and information. Proud, an interest group for Dutch sex workers, spoke out against the new legislation in 2018 because it would (further) marginalise sex work. Facebook’s new policy demonstrates that these fears weren’t unfounded.

European countries, like all others, work hard in order to uphold their values. Many of these countries find it important that one can speak openly about sex and sexuality. In the Netherlands, significant efforts are made in order to protect and improve sex workers’ rights. Facebook’s policy thwarts these endeavours. It is unacceptable that we find ourselves in a situation in which legislation from another country has such a big impact on our societies. Is Facebook’s bottom line so important to Europe that we are willing to part with the rights and freedoms we’ve fought so hard to achieve?

In Europe we can no longer talk about sex on Facebook (only in Dutch, 13.12.2018)
https://www.bitsoffreedom.nl/2018/12/13/in-nederland-mag-je-op-facebook-niet-meer-vragen-of-iemand-zin-heeft-in-seks/

(Contribution by Evelyn Austin, EDRi member Bits of Freedom, the Netherlands; translation by Winnie van Nunen)

16 Jan 2019

EU Member States willing to retain illegal data retention

By IT-Pol

With its judgments of April 2014 (Digital Rights Ireland) and December 2016 (Tele2), the Court of Justice of the European Union (CJEU) ruled that blanket data retention is illegal under EU law. Rather than repealing their illegal data retention laws, EU Member States have instead adopted a tactic of ignoring the highest court of the European Union under the pretence of a “common reflection process” within an expert data retention working group under the Working Party on Information Exchange and Data Protection (DAPIX).


At the Justice and Home Affairs (JHA) Council meeting on 6-7 December 2018, the state of play of the expert working group on data retention was discussed. Council document 14319/18 prepared for the meeting reveals that the common reflection process has produced no tangible results towards compliance with the Tele2 judgment: replacing general and indiscriminate (blanket) data retention with targeted data retention. Member States appear to be happy with their current and illegal data retention regimes and do not want to make any changes. A recurring element in the Council document is the unwillingness of Member States to accept the Tele2 judgment, often disguised under a very selective reading of the judgment.

The expert working group has considered the concept of “restricted data retention”, previously analysed in the EDRi-gram. The main novelty is that Member States are supposed to limit the data categories to be retained to what is strictly necessary. No limitation is foreseen with respect to the persons concerned, which means that data about the entire population is retained, as with the current data retention regimes. Therefore, restricted data retention cannot possibly comply with the Tele2 judgment. However, even the token gesture of limiting the data categories has no support among Member States. They claim that the data categories which are not necessary for law enforcement purposes are already excluded. Based on this premise, Member States even contend that “there is no general and indiscriminate retention of data as referred to in the Tele2 judgment”, which is rather remarkable since the CJEU has stated the exact opposite in the Tele2 judgment.
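
The structural flaw in “restricted data retention” is visible even in schematic form. In the hypothetical sketch below (our illustration, not a description of any national system), restricting the retained fields still leaves every subscriber’s records in the database, whereas the targeted retention required by Tele2 limits the persons concerned:

    # Hypothetical traffic records covering a provider's entire customer base.
    all_records = [
        {"subscriber": "alice", "callee": "bob", "time": "2019-01-10T09:00", "cell_id": "A1"},
        {"subscriber": "carol", "callee": "dave", "time": "2019-01-10T09:05", "cell_id": "B7"},
    ]

    def restricted_retention(records):
        """'Restricted' retention: fewer data categories (cell ID dropped),
        but still applied to the whole population - i.e. still blanket."""
        return [{k: r[k] for k in ("subscriber", "callee", "time")} for r in records]

    def targeted_retention(records, linked_persons):
        """Targeted retention: only persons with some link to an
        investigation are covered at all."""
        return [r for r in records if r["subscriber"] in linked_persons]

    print(len(restricted_retention(all_records)))           # 2 - everyone retained
    print(len(targeted_retention(all_records, {"alice"})))  # 1 - only the person concerned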

The renewable retention warrant (RRW) proposal is another attempt by Member States to circumvent the Tele2 judgment. While the warrant only covers a single provider of electronic communications services for a fixed period of validity, all providers are expected to be covered by different warrants that are constantly renewed because the RRW would be rendered ineffective for law enforcement purposes if not all providers are covered. In practice, the RRW will be indistinguishable from the current blanket data retention regimes. With the exception of one Member State, which uses a similar system (undoubtedly the United Kingdom), there is no support for the RRW since the system would be too complex and inefficient and would require changes to national laws on criminal procedure.

After two years of “reflection” on the Tele2 judgment, Member States and their expert working group have not come up with a single realistic alternative to the current blanket data retention regimes that the CJEU has ruled to be illegal under EU law. The Council document does not describe a single suggestion which would actually make the data retention scheme targeted and limit the persons concerned by the measure, even though this is expressly required by the CJEU in paragraph 110 of the Tele2 judgment.

The second part of Council document 14319/18 deals with access to the retained data. According to the Tele2 judgment, access to the retained data must be limited to investigations involving serious crime and must be subject to review by a court or an independent administrative authority. As a general rule, only data of individuals suspected of being involved or implicated in a crime can be accessed.

Once again, Member States are reluctant to accept the restrictions imposed by the CJEU. Since there is no EU law or CJEU guidance defining “serious crime”, this task is left to Member States. Some Member States have a very broad definition, even to the point of including crimes that cannot be regarded as serious because of their low maximum sentence, but that are nonetheless claimed to be perceived as serious by the general public. It is also noted in the Council document that without access to retained data, criminal investigations in cybercrime cases would often “turn out to be futile because digital evidence would be unavailable”. However, since data retention of electronic communications metadata is a particularly serious interference with fundamental rights, as the CJEU has established (Tele2, paragraph 100), access to the retained data must be subject to strict rules and will not always be available to law enforcement authorities. As more and more activities are related to the online environment, a complete carve-out for crimes committed online would deprive the privacy and data protection safeguards at the access level of almost any meaning.

The Council document notes that the judicial review regimes of most Member States are in line with the prerequisites set out by the CJEU, through a prior review by a court/judge, an independent administrative authority or the prosecution office. However, by silently adding the prosecution office, which is not an independent judicial authority, to the list, Member States are rather misleadingly overstating their compliance with the Tele2 judgment regarding the requirement of independent review of access requests.

Finally, Member States are very reluctant to limit access to the retained data to persons who are suspects or accused persons, as required by the CJEU except in special cases involving terrorism (paragraph 119 of the Tele2 judgment). The main reason given for this is that “proceedings are commenced not against certain individuals, but against (at least in the beginning) unknown perpetrators.” This suggests that law enforcement authorities routinely use data retention to find possible suspects of a crime, for example through cell phone tower inquiries where information is obtained about all persons present in a certain area. Data-mining investigations like this affect a large number of persons, some of whom may become suspects simply because of their presence in a certain area (location data). The Tele2 judgment only allows broad access to the retained data as an exception in particular cases involving terrorism, but Member States want to turn the exception into the general rule by only requiring a connection to criminal investigations when retained data is accessed.
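
A hypothetical sketch of such a cell tower inquiry shows how indiscriminate it is: the query is keyed to a place and a time window, not to any suspect, so it returns everyone who happened to be nearby. (The records and names below are invented for illustration.)

    from datetime import datetime

    # Hypothetical retained location records - in reality these would cover
    # a provider's entire customer base.
    retained_location_data = [
        {"subscriber": "alice", "cell_id": "A1", "seen": datetime(2019, 1, 10, 21, 15)},
        {"subscriber": "bob",   "cell_id": "A1", "seen": datetime(2019, 1, 10, 21, 40)},
        {"subscriber": "carol", "cell_id": "C3", "seen": datetime(2019, 1, 10, 21, 20)},
    ]

    def tower_inquiry(records, cell_id, start, end):
        """Return every person whose phone connected to the given tower in
        the time window - no suspicion required, mere presence is enough."""
        return {r["subscriber"] for r in records
                if r["cell_id"] == cell_id and start <= r["seen"] <= end}

    # Everyone near tower A1 that evening is swept into the investigation:
    print(tower_inquiry(retained_location_data, "A1",
                        datetime(2019, 1, 10, 21, 0), datetime(2019, 1, 10, 22, 0)))
    # {'alice', 'bob'}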

At the JHA Council meeting in December, ministers agreed to continue “the work at experts level to explore avenues to develop a concept of data retention within the EU.” However, this is precisely what the expert working group has been doing for the past two years, without delivering a single proposal for data retention that respects the requirements of the Tele2 judgment.

This puts the European data retention situation at a stalemate. Member States refuse to even think of alternatives to their current blanket data retention regimes, but they cannot have blanket data retention, at least not legally, because the CJEU has ruled that it is illegal under EU law. The European Commission is the “guardian of the Treaties”, but appears unwilling to start infringement proceedings against Member States even if it is “monitoring” them. Legal action at the national level against data retention laws is, of course, a potential way out of the stalemate. Litigation is currently being pursued in some Member States, and in the past has been successful in a number of Member States.

However, Member States are fighting for their blanket data retention regimes at other levels besides ignoring the Tele2 judgment. One possibility is that the future ePrivacy Regulation will present a more “favourable” environment for data retention than the current ePrivacy Directive – something that the Council is actively working on. This could give Member States a “fresh start” on data retention, since the CJEU would have to assess national data retention laws against the new ePrivacy Regulation, still interpreted in light of the (unchanged) Charter of Fundamental Rights. There is also the risk that the CJEU could revise its stance on data retention in some of the new cases pending before the Court (C-623/17 from the United Kingdom, C-520/18 from Belgium, and C-511/18 and C-512/18 from France). The first question in C-520/18 is very similar to the first question in the Tele2 case, that is, whether Article 15(1) of the ePrivacy Directive, read in the light of the Charter of Fundamental Rights, precludes a general obligation for providers of electronic communications services to retain traffic data. Member States would undoubtedly see this as an opportunity to “retry” the Digital Rights Ireland and Tele2 cases before the CJEU.

Data retention – state of play. Council document 14319/18 (23.11.2018)
http://data.consilium.europa.eu/doc/document/ST-14319-2018-INIT/en/pdf

EU Member States plan to ignore EU Court data retention rulings (29.11.2017)
https://edri.org/eu-member-states-plan-to-ignore-eu-court-data-retention-rulings

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)
