03 Oct 2019

CJEU ruling on fighting defamation online could open the door for upload filters

By EDRi

Today, on 3 October 2019, the Court of Justice of the European Union (CJEU) gave its ruling in the case C-18/18 Glawischnig-Piesczek v Facebook. The case concerns injunctions obliging a service provider to stop the dissemination of a defamatory comment. Some aspects of the decision could pose a threat to freedom of expression, in particular that of political dissidents who may be accused of defamatory practices.

This ruling could open the door for exploitative upload filters for all online content.

said Diego Naranjo, Head of Policy at EDRi.

Despite the positive intention to protect an individual from defamatory content, this decision could curtail freedom of expression for all internet users, with particular risks for political critics and human rights defenders, by paving the road for automated content recognition technologies.

The ruling confirms that a hosting provider such as Facebook can be ordered, in the context of an injunction, to seek and identify, among all the content shared by its users, content that is identical to content a court has characterised as illegal. Since the obligation to block future content applies to all users of a large platform like Facebook, the Court has in effect considered it compatible with the E-Commerce Directive for courts to demand automated upload filters, and has blurred the distinction its previous case law drew between general and specific monitoring. EDRi is concerned that automated upload filters for identical content will not be able to distinguish between legal and illegal content, in particular when applied to individual words that can have very different meanings depending on the context and the intent of the user.
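To illustrate the concern, here is a minimal, purely hypothetical sketch of an "identical content" filter (the phrase and the posts are invented for illustration, not taken from the case). Because it matches text without any notion of context, it blocks reporting and criticism along with the original post:

```python
# Purely illustrative: a naive "identical content" filter has no notion of
# context, so it treats quotation, criticism, and news reporting exactly
# like the original defamatory post.
ILLEGAL_PHRASE = "x is a corrupt oaf"  # hypothetical court-identified content

def upload_filter(post: str) -> bool:
    """Return True if the post would be blocked."""
    return ILLEGAL_PHRASE in post.lower()

posts = [
    "X is a corrupt oaf",                                                     # the original post
    'The court ruled that calling her "X is a corrupt oaf" is defamatory.',   # news reporting
    "It is outrageous to claim that X is a corrupt oaf.",                     # criticism/defence
]
print([upload_filter(p) for p in posts])  # all three are blocked alike
```

All three posts trigger the filter, even though only the first is the illegal content itself; distinguishing them would require the contextual judgement that automated matching lacks.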

EDRi welcomes the Court’s attempt to balance the rights at stake (namely freedom of expression and the freedom to conduct a business) and to limit the impact on freedom of expression by differentiating between the search for identical and for equivalent content. However, the ruling seems to depart from previous case law on the ban on general monitoring obligations (for example Scarlet v. SABAM). Imposing filtering of all communications in order to look for one specific piece of content, using non-transparent algorithms, is likely to unduly restrict legal speech, regardless of whether the filters look for content that is identical or equivalent to illegal content.

The upcoming review of the E-Commerce Directive should clarify, among other things, how to deal with online content moderation. In the context of this review, it is crucial to address the problem of disinformation without unduly interfering with the fundamental right to freedom of expression for users of the platform. Specifically, the business model based on amplifying certain types of content to the detriment of others in order to attract users’ attention requires urgent scrutiny.

Read more:

Access Now blogpost: No summer break for free expression in Europe: Facebook cases that matter for human rights (23.09.2019)
https://www.accessnow.org/no-summer-break-for-free-expression-in-europe-facebook-cases-that-matter-for-human-rights/

CJEU case C-18/18 – Glawischnig-Piesczek Press Release (03.10.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-10/cp190128en.pdf

CJEU case C-18/18 – Glawischnig-Piesczek ruling (03.10.2019)
http://curia.europa.eu/juris/document/document.jsf?text=&docid=218621&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=192400

Fighting defamation online – AG Opinion forgets that context matters (19.06.2019)
https://edri.org/fighting-defamation-online-ag-opinion-forgets-that-context-matters/

Dolphins in the Net, a New Stanford CIS White Paper
https://cyberlaw.stanford.edu/files/Dolphins-in-the-Net-AG-Analysis.pdf

SABAM vs Netlog – another important ruling for fundamental rights (16.02.2012)
https://edri.org/sabam_netlog_win/

01 Oct 2019

CJEU on cookies: ‘Consent or be tracked’ is not an option

By EDRi

Today, on 1 October 2019, the Court of Justice of the European Union (CJEU) gave its ruling on “cookie consent” requirements. European Digital Rights (EDRi) welcomes the CJEU’s confirmation that under the current data protection framework, cookies can only be set if users have given consent that is valid under the General Data Protection Regulation (GDPR). This means consent needs to be given by a clear affirmative act establishing a freely given, specific, informed and unambiguous indication of a user’s agreement.

‘Consent or be tracked’ is not an option. The CJEU ruling spells it out for the industry and calls for clear rules on the confidentiality of our communications.

said Diego Naranjo, Head of Policy at EDRi.

EU Member States need to finally move forward with regulating this practice, and take the much-needed ePrivacy Regulation out of the EU Council’s closet.

This ruling is a positive step towards protecting people from hidden commercial surveillance techniques deployed by the advertisement industry. It is, however, crucial to also urgently finalise the new ePrivacy Regulation that complements the GDPR in strengthening the privacy and security of electronic communications.

Read more:

CJEU press release: Storing cookies requires internet users’ active consent – A pre-ticked checkbox is therefore insufficient (01.10.2019)
https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-10/cp190125en.pdf

CJEU ruling C-673/17 (01.10.2019)
http://curia.europa.eu/juris/documents.jsf?num=C-673/17

Video: Cookies (05.09.2016)
https://edri.org/privacyvideos-cookies/

EU Council considers undermining ePrivacy (30.06.2018)
https://edri.org/eu-council-considers-undermining-eprivacy/

Civil society calls Council to adopt ePrivacy now (05.12.2018)
https://edri.org/civil-society-calls-council-to-adopt-eprivacy-now/

Freedom to be different: How to defend yourself against tracking (27.09.2016)
https://edri.org/freedom-to-be-different/

e-Privacy revision: Document pool
https://edri.org/eprivacy-directive-document-pool/

26 Sep 2019

Mozilla Fellow Petra Molnar joins us to work on AI & discrimination

By Guest author

Starting on 1 October, Petra Molnar will join our team as a Mozilla Fellow. She is a lawyer specialising in migration, human rights, and technology, and holds a Masters of Social Anthropology from York University, a Juris Doctorate from the University of Toronto, and an LL.M in International Law from the University of Cambridge. Mozilla Fellowships are organised and supported by the Mozilla Foundation, and each Fellow carries out a specific project in collaboration with a host organisation such as EDRi. With our upcoming work on artificial intelligence (AI) and our experience working on surveillance and data protection, we look forward to working with Petra to add our voice to the ongoing discussions on the impact of algorithms on vulnerable populations, such as migrants and refugees.

Artificial intelligence and migration management from a human rights perspective

The systematic detention of migrants at the US-Mexico border. The wrongful deportation of 7 000 foreign students accused of cheating on a language test. Racist or sexist discrimination based on social media profiles. What do these examples have in common? In every case, an algorithm made a decision with serious consequences for people’s lives.

Nearly 70 million people are currently on the move due to conflict, instability, environmental factors, and economic reasons. Many states and international organisations involved in migration management are exploring machine learning to increase efficiency and support border security. These experiments range from big data predictions about population movements in the Mediterranean, to Canada’s use of automated decision-making in immigration applications, to AI lie detectors deployed at European airports. However, most of these experiments fail to account for the far-reaching impacts on human lives and human rights. These unregulated technologies are developed with little oversight, transparency, and accountability.

Expanding on my work on the human rights impacts of automated decision-making in immigration, this ethnographic project and accompanying advocacy campaign aim to create a governance mechanism for AI in migration with human rights at the centre. While embedded at EDRi, I will interview affected populations, experts, technologists, and policy makers to produce a well-researched report on the human rights impacts of migration management technologies, collaborating with academics, tech developers, the UN, governments, and civil society. This project will build on the work already done in the EU and provide feedback to EDRi’s ongoing work on AI. I will engage with NGOs to help build EDRi’s network and broaden the scope of action to non-digital groups beyond the EU, translating these efforts into a global strategy for the governance of migration management technologies.

I am delighted to be working with EDRi on this important project as the 2019-2020 Mozilla Fellow!

Mozilla Fellowships
https://foundation.mozilla.org/en/fellowships/

Big Data and International Migration (16.06.2014)
https://www.unglobalpulse.org/big-data-migration

Bots at the Gate – A Human Rights Analysis of Automated Decision Making in Canada’s Immigration and Refugee System (26.09.2018)
https://citizenlab.ca/2018/09/bots-at-the-gate-human-rights-analysis-automated-decision-making-in-canadas-immigration-refugee-system/

Emerging Voices: Immigration, Iris-Scanning and iBorderCTRL–The Human Rights Impacts of Technological Experiments in Migration (19.08.2019)
https://opiniojuris.org/2019/08/19/emerging-voices-immigration-iris-scanning-and-iborderctrl-the-human-rights-impacts-of-technological-experiments-in-migration/

(Contribution, Petra Molnar, selected Mozilla Fellow, EDRi)

25 Sep 2019

PNR complaint advances to the Austrian Federal Administrative Court

By Epicenter.works

On 19 August 2019, Austrian EDRi member epicenter.works lodged a complaint with the Austrian data protection authority (DPA) against the Passenger Name Records (PNR) Directive. After only three weeks, on 6 September, they received the response from the DPA: The complaint was rejected. That sounds negative at first, but is actually good news. The complaint can and must now be lodged with the Federal Administrative Court.

Why was the complaint rejected?

The DPA has no authority to decide whether or not laws are constitutional. Moreover, it cannot refer the matter to the Court of Justice of the European Union (CJEU), which in this case is necessary because the complaint concerns an EU Directive. It was therefore clear from the outset that the DPA would reject the complaint, but this was a necessary step that could not be skipped, as there is no legal route to the Federal Administrative Court other than via the DPA; only the speed of the decision was somewhat surprising, in a positive way. All seven proceedings of the complainants lodged with the aid of epicenter.works were merged, and the organisation was given the power of representation, meaning that epicenter.works is allowed to represent the complainants.

What are the next steps?

Meanwhile, epicenter.works is still waiting for answers to freedom of information (FOI) requests sent to the Passenger Information Unit (PIU) that processes PNR data in Austria. While one request was answered within a few days, another has been overdue since 23 August. The unanswered request concerns the data protection framework conditions for the PNR implementation.

epicenter.works will file the complaint with the Federal Administrative Court within four weeks. It is to be expected that the court will submit legal questions to the Court of Justice of the European Union (CJEU).

Epicenter.works
https://en.epicenter.works/

Passenger Name Records
https://en.epicenter.works/thema/pnr-0

Passenger surveillance brought before courts in Germany and Austria (22.05.2019)
https://edri.org/passenger-surveillance-brought-before-courts-in-germany-and-austria/

PNR: EU Court rules that draft EU/Canada air passenger data deal is unacceptable (26.07.2017)
https://edri.org/pnr-eu-court-rules-draft-eu-canada-air-passenger-data-deal-is-unacceptable/

(Contribution by Iwona Laub, EDRi member Epicenter.works, Austria)

25 Sep 2019

Why EU passenger surveillance fails its purpose

By Epicenter.works

The EU Directive imposing the collection of flyers’ information (Passenger Name Record, PNR) was adopted in April 2016, the same day as the General Data Protection Regulation (GDPR). The collection of PNR data from all flights entering or leaving the EU has a strong impact on individuals’ right to privacy, and it can only be justified on the basis of necessity and proportionality, and only if it meets objectives of general interest. All of this is lacking in the current EU PNR Directive, which is at the moment being implemented in the EU.

The Austrian implementation of the PNR Directive

In Austria, the Passenger Information Unit (PIU) has processed PNR data since March 2019. On 9 July 2019, the Passenger Data central office (Fluggastdatenzentralstelle) issued a response to inquiries into the PNR implementation in Austria. According to the document, from February 2019 to 14 May 2019, 7 633 867 records had been transmitted to the PIU. On average, about 490 hits per day are reported, with about 3 430 hits per week requiring further verification. Out of the 7 633 867 reported records, there were 51 confirmed matches, and in 30 cases staff intervened at the airport concerned.

Impact on innocents

What this small show of success does not capture, however, is the damage inflicted on the thousands of innocent passengers who are wrongly flagged by the system and who can be subjected to damaging police investigations or denied entry into destination countries without proper cause. Mass surveillance that seeks to identify a small, select group within a population is invasive, inefficient, and counter to fundamental rights. It subjects the majority of people to extreme security measures that are not only ineffective at catching terrorists and criminals, but that undermine privacy rights and can cause immense personal damage.

Why is this happening? The rate fallacy

Imagine a city with a population of 1 000 000 people that implements surveillance measures to catch terrorists, of whom there are 100. This particular surveillance system has a failure rate of 1%, meaning that (1) when a terrorist is scanned, the system will register a hit 99% of the time and fail to do so 1% of the time, and (2) when a non-terrorist is scanned, the system will not flag them 99% of the time, but will register a hit 1% of the time. What is the probability that a person flagged by this system is actually a terrorist?

At first, it might look like there is a 99% chance of that person being a terrorist. Given the system’s failure rate of 1%, this prediction seems to make sense. However, this is an example of incorrect intuitive reasoning, because it fails to take into account that terrorists make up only a tiny fraction of the population.

This is the base rate fallacy: the tendency to ignore base rates – actual probabilities – in the presence of specific, individuating information. Rather than integrating general information and statistics with information about an individual case, the mind tends to ignore the former and focus on the latter. One manifestation of the base rate fallacy is the false positive paradox suggested above, in which false positive tests are more probable than true positive tests. This occurs when the true incidence of a condition in the overall population is lower than the test’s false positive rate. Deconstructing the false positive paradox shows that the true chance of a flagged person being a terrorist is closer to 1% than to 99%.

In our example, out of one million inhabitants there would be 999 900 law-abiding citizens and 100 terrorists. The city’s surveillance registers 99 true positives, but 9 999 false positives – a number that would overwhelm even the best system. In all, 10 098 people – 9 999 non-terrorists and 99 actual terrorists – will trigger the system. Due to the high number of false positives, the probability that a flagged person is actually a terrorist is not 99% but below 1% (99 out of 10 098, roughly 0.98%). Searching large data sets for a few suspects means that only a small fraction of hits will ever be genuine. This is a persistent mathematical problem that cannot be avoided, even with improved accuracy.
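The arithmetic of the example can be checked with a short script (the numbers are the article’s illustrative figures, not data from any real PNR system):

```python
# Reproducing the example: population of 1 000 000, 100 terrorists,
# and a system that errs 1% of the time in both directions.
population = 1_000_000
terrorists = 100
non_terrorists = population - terrorists        # 999 900
sensitivity = 0.99                              # P(hit | terrorist)
false_positive_rate = 0.01                      # P(hit | non-terrorist)

true_positives = terrorists * sensitivity                 # 99
false_positives = non_terrorists * false_positive_rate    # 9 999
total_hits = true_positives + false_positives             # 10 098

# Probability that a flagged person is actually a terrorist (Bayes' rule).
p_terrorist_given_hit = true_positives / total_hits
print(round(p_terrorist_given_hit * 100, 2))  # prints 0.98 (percent), not 99
```

Even doubling the system’s accuracy barely moves this figure, because the false positives are drawn from the overwhelmingly larger innocent population.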

Security and privacy are not incompatible; rather, there is a necessary balance that must be determined by society. The PNR system, by relying on faulty mathematical assumptions, ensures that neither security nor privacy is protected.

Epicenter.works
https://en.epicenter.works/

PNR – Passenger Name Record
https://en.epicenter.works/thema/pnr-0

Passenger surveillance brought before courts in Germany and Austria (22.05.2019)
https://edri.org/passenger-surveillance-brought-before-courts-in-germany-and-austria/

We’re going to overturn the PNR directive (14.05.2019)
https://en.epicenter.works/content/were-going-to-overturn-the-pnr-directive-0

NoPNR – We are taking legal action against the mass processing of passenger data!
https://nopnr.eu/en/home/

An Explainer on the Base Rate Fallacy and PNR (22.07.2019)
https://en.epicenter.works/content/an-explainer-on-the-base-rate-fallacy-and-pnr

(Contribution by Kaitlin McDermott, EDRi-member Epicenter.works, Austria)

25 Sep 2019

Facebook users blocked simply for mentioning a name?

By Dean Willis

Merely including two words, in this case “Tommy Robinson”, in a Facebook post or link is enough to get the post removed and the writer blocked. At least, that seems to be the case in Denmark and Sweden.

Writing the name of the English right-wing activist violates Facebook’s Community Rules, specifically a category aimed at so-called hate preachers, including “individuals or organizations that organize or incite violence”. Other Facebook users are not allowed to support, praise or represent the banned hate preachers. According to statements by Facebook’s Nordic Head of Communications, criticism of Tommy Robinson is allowed, but merely mentioning his name in a neutral context will be considered support of the banned hate preacher.

For example, a Facebook post by a member of the Danish right-wing political party the New Right, in which he complained that he runs the risk of being blocked by various social media if he writes the name, was removed. A Danish public broadcaster interviewed Facebook’s Head of Communications about the platform’s moderation policy and linked to the interview from their Facebook page; the link was initially taken down because it mentioned Tommy Robinson. Facebook users in Denmark and Sweden also report that posts mentioning Robinson were taken down within minutes of publication.

A blogger from a left-wing political party was banned from Facebook for 24 hours for writing that Tommy Robinson was an “idiot” in a blogpost that also criticised Facebook’s excessive moderation policies. This suggests the removals are automated and made without consideration of context, contrary to Facebook’s claim that only support and representation of hate preachers is banned. That raises questions about restrictions on freedom of expression and about how we discuss and debate online.

If you write this name, you will be blocked on social media (only in Danish, 17.09.2019)
https://jyllands-posten.dk/debat/blogs/larsmathiesen/ECE11623529/hvis-man-skriver-dette-navn-saa-bliver-man-blokeret-paa-de-sociale-medier/

Danish public broadcaster’s interview with Facebook taken down (only in Danish, 23.09.2019)
https://twitter.com/lottefolke/status/1176065883971739648

E-Commerce review: Opening Pandora’s box? (20.06.2019)
https://edri.org/e-commerce-review-1-pandoras-box/

(Contribution by Dean Willis, EDRi intern)

25 Sep 2019

Portugal: Data retention complaint reaches the Constitutional Court

By Guest author

September 2019 brought long-awaited developments regarding the situation of data retention in Portugal. The Justice Ombudsman decided to send the Portuguese data retention law to the Constitutional Court, following the Court of Justice of the European Union’s (CJEU’s) case law on blanket data retention that led to the invalidation of Directive 2006/24/EC. This decision comes after a complaint presented by EDRi observer Associação D3 – Defesa dos Direitos Digitais in December 2017.

The Ombudsman had first decided to issue an official recommendation to the government, urging it to propose a legislative solution for the problematic law that originated from the now invalidated Data Retention Directive. Faced with a refusal from the Minister of Justice to find a solution through legislative means, the Ombudsman has now decided to concede to D3’s original request and has referred the matter to the Constitutional Court, which will have to rule on the constitutionality of the Portuguese data retention scheme.

A few days later, the same Constitutional Court partially struck down, for the second time, a law granting the intelligence services access to retained data. In 2015, the Constitutional Court had already declared a similar law unconstitutional, after the president requested a preventive ruling by the Court before signing it into law. However, in 2017, a new law that addressed some of the problems raised by the Constitutional Court was approved by the Parliament. As the new president opted not to request a preventive decision, the law came into force. 35 Members of Parliament (MPs) from three parties then requested a Constitutional Court ruling on the law, which has now been issued.

The fundamental reasoning of this decision is that the Portuguese Constitution forbids public authorities from accessing citizens’ correspondence and telecommunications, except in the context of a criminal procedure. Given that the intelligence services have no criminal procedure competences, they cannot access such data within the existing constitutional framework. However, the Court did allow access to user location and identification data (in the context of the fight against terrorism and highly organised crime), as such data was not considered to be covered by the secrecy of communications.

This case also led to the resignation of the original judge rapporteur, due to disagreements over the reasoning reflected in the final version of the decision.

Associação D3 – Defesa dos Direitos Digitais
https://www.direitosdigitais.pt/

Portugal: Data retention sent to the Constitutional Court (07.03.2018)
https://edri.org/portugal-data-retention-constitutional-court/

European Court overturns EU mass surveillance law (08.04.2014)
https://edri.org/european-court-overturns-eu-mass-surveillance-law/

(Contribution by Eduardo Santos, Associação D3 – Defesa dos Direitos Digitais, Portugal)


23 Sep 2019

Your mail, their ads. Your rights?

By Andreea Belu
  • In the digital space, “postal services” often snoop into your online conversations in order to market services or products according to what they find out from your chats.
  • A law meant to limit this exploitative practice is being stalled by the Council of the European Union.

We all expect our mail to be safe in the hands of a mailman. We trust that neither the post office nor the mailmen working there will sneak a peek at our written correspondence. Nor do we expect mailmen to act like door-to-door salespersons.

When we say “postal services” snoop, it is important to understand that this refers both to traditional email services such as Yahoo and to instant messaging apps like WhatsApp. While targeted ads are no longer popular among email providers, the practice is gaining momentum in instant messaging after Facebook’s CEO announced plans to introduce ads in WhatsApp’s Status feature.

Not just shoe ads

You might think: ”Well, what’s the harm in having shoes advertised after they’ve read the shopping chats between my friend and me?” Short answer: it’s not just shoes.

Often, targeted ads are the result of you being profiled according to your age, location, gender, sexual orientation, political views or ethnicity. You may receive job ads based on your gender, or housing ads based on your ethnicity. Sometimes, you may be targeted because you feel anxious or worthless. Are you sure all of this benefits you? What is more, your online mailman might be required to retain all of your mail, just in case you get in trouble with the law in the future. We call this mass data retention.

Click to watch the animation

The need for encrypted mail in storage *and* in transit

The WhatsApp case is a good example. Currently, WhatsApp seals the message right after you press “send”. The message goes to WhatsApp’s servers, is stored encrypted, and is then sent to its recipient, still encrypted. This means that, technically, the mail is encrypted both in storage and in transit, and nobody can read its content. However, as Forbes points out, future ads plans might modify WhatsApp’s encryption so that apps “first identify key words in sentences, like “fishing” or “birthday,” and send them to Facebook’s servers to be processed for advertising, while separately sending the encrypted message”.
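The mechanism Forbes describes can be sketched in a few lines. This is a toy model, not WhatsApp’s actual protocol or code: the XOR cipher stands in for the real end-to-end encryption, and the keyword list is invented. The point is that keywords harvested client-side, before the message is sealed, let the server learn what you talk about even though the message itself stays encrypted:

```python
# Toy model of "keyword extraction before encryption" (not WhatsApp's code).
import secrets

AD_KEYWORDS = {"fishing", "birthday", "shoes"}  # hypothetical trigger words

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    # One-time-pad XOR as a stand-in for the real end-to-end cipher.
    return bytes(p ^ k for p, k in zip(plaintext, key))

def send_with_keyword_extraction(message: str, key: bytes):
    # Keywords are harvested on the client, BEFORE the message is sealed,
    # and would travel to the server alongside the ciphertext.
    leaked = {w for w in message.lower().split() if w in AD_KEYWORDS}
    ciphertext = encrypt(message.encode(), key)
    return ciphertext, leaked

key = secrets.token_bytes(64)
ct, leaked = send_with_keyword_extraction("lets go fishing for my birthday", key)
print(leaked)  # the server learns these topics despite the encryption
```

The message content is never sent in the clear, so the scheme can still be marketed as “encrypted”, yet the confidentiality of the conversation is lost.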

There’s a law for it, but it’s stalled by the EU Council

The ePrivacy Regulation, currently under negotiation, is aimed at ensuring the privacy and confidentiality of our electronic communications by complementing and particularising the rules introduced by the General Data Protection Regulation (GDPR). The European Parliament adopted a strong position on ePrivacy that would ensure your online messages are protected both in storage and in transit (Art. 5), consider consent the only legal basis for processing data (Art. 6), make privacy-by-design and privacy-by-default core principles in software design (Art. 10), and protect encryption from measures aimed at undermining it (Art. 17). However, the Council of the European Union is yielding under big tech lobby pressure and drafted an opinion that threatens our rights and freedoms. What is more, the text adopted by the European Parliament in October 2017 has been stuck in closed-door negotiations in the EU Council for almost two years. We have sent several letters (here, here and here) calling for the safeguarding of our communications and for the adoption of this much-needed ePrivacy Regulation.

Will our voices be heard? If you are worried about being targeted based on your private conversations, join our efforts and stay tuned for more updates coming soon.


Read more:

Your family is none of their business (23.07.2019)
https://edri.org/your-family-is-none-of-their-business/

Real-time bidding: The auction for your attention (4.07.2019)
https://edri.org/real-time-bidding-the-auction-for-your-attention/

e-Privacy Directive: Frequently Asked Questions
https://edri.org/epd-faq/

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

e-Privacy Mythbusting (25.10.2017)
https://edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

11 Sep 2019

Poland challenges copyright upload filters before the CJEU

By Centrum Cyfrowe Foundation

On 24 May 2019, Poland initiated a legal challenge (C-401/19) before the Court of Justice of the European Union (CJEU) against Article 17 of the Directive on copyright in the Digital Single Market. EDRi member Centrum Cyfrowe Foundation has previously tried to get access to the complaint using freedom of information (FOI) requests, without success. Now, the CJEU has finally published the application for this legal challenge.

Bringing the Directive to the Court of Justice is a positive step that can help clear controversies concerning its Article 17. An independent court will assess issues that in the policy debate preceding the adoption of the Directive were typically dismissed by representatives of rights holders as fear-mongering or disinformation.

The Republic of Poland seeks the annulment of Article 17(4)(b) and Article 17(4)(c) of the copyright Directive. Alternatively, should the Court find that the contested provisions cannot be deleted from Article 17 without substantively changing the rules contained in the remaining provisions of that article, Poland claims that the Court should annul Article 17 of the Directive in its entirety.

Poland claims that the Directive infringes the right to freedom of expression and information guaranteed by Article 11 of the Charter of Fundamental Rights of the European Union. The legal challenge argues that the obligations on online content-sharing service providers to “make best efforts to ensure the unavailability of specific works and other subject matter for which the rights holders have provided the service providers with the relevant and necessary information” and to “make best efforts to prevent the future uploads of protected works or other subject-matter for which the rights holders have lodged a sufficiently substantiated notice” “make it necessary for the service providers, in order to avoid liability, to carry out prior automatic verification (filtering) of content uploaded online by users, and therefore make it necessary to introduce preventive control mechanisms”. In other words, they oblige online platforms to filter all uploads by their users.

Unfortunately, the political context of the challenge has raised some questions. The complaint was submitted just two days before the European Parliament elections, and Poland’s ruling Law and Justice party (PiS) has been brandishing its opposition to upload filters against the biggest opposition party, Civic Platform.

The EU Member States (and Iceland, Liechtenstein and Norway) have until 2 October 2019 to submit an application to the CJEU to intervene in this case, as defined by Chapter 4 of the CJEU’s Rules of Procedure (RoP). Member States can intervene to support, in whole or in part, either Poland’s position on Article 17 or the Council and Parliament’s position on Article 17.

Centrum Cyfrowe Foundation
https://centrumcyfrowe.pl/en/

The Copyright Directive challenged in the CJEU by Polish government (01.06.2019)
https://www.communia-association.org/2019/06/01/copyright-directive-challenged-cjeu-polish-government/

CJEU Case C-401/19 – Poland v Parliament and Council
http://curia.europa.eu/juris/liste.jsf?language=en&num=C-401/19

Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market, Article 17
https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32019L0790#017

(Contribution by Natalia Mileszyk, EDRi member Centrum Cyfrowe Foundation, Poland)

11 Sep 2019

The Netherlands, aim for a more ambitious copyright implementation!

By Bits of Freedom

All EU Member States are obliged to implement the newly adopted EU Copyright Directive, including its controversial Article 17, but how to interpret it is up to them. In the Netherlands, there is currently a draft bill, which is unfortunately very disappointing. The government really needs to try much harder to protect the interests of internet users.

What was that again, Article 17 (or 13)?

Article 17 (formerly Article 13) includes a provision that makes platforms directly liable for copyright infringements in the content that users upload to their services. It does not solve the problem it is supposed to solve, but it does limit the freedom of internet users tremendously. It will oblige online companies such as Google and SoundCloud to scan and approve everything their users upload. To avoid legal liability, those companies will likely refuse many uploads in advance.

The Netherlands found European rules harmful

The Dutch government was crystal clear in the debate that took place at the European level prior to the adoption of the Directive: these rules do more harm than good. In 2017, the Netherlands put critical questions to the lawyers of the European Commission about the legal tenability of the proposal for the Directive. Much later in the process, the Netherlands voted against the text that was to serve as the basis for the negotiations between the Member States, the Commission, and the Parliament. Later still, the Dutch permanent representation stated that the adopted proposal “does not strike the right balance between the protection of right holders and the interests of EU citizens and companies”.

From European to Dutch rules

Since this is a Directive, all Member States must incorporate the rules into national legislation. Now that the Directive has been adopted and changes at the European level are no longer possible, transposition into national law is the place to limit the damage. In other words: with a minimal transposition, the rights of internet users are protected to the maximum extent. Given how critical the Netherlands was of the Directive, you would expect it to do its utmost to limit, as much as possible, the damage the Directive will cause when transposed into national legislation. But, unfortunately…

On 2 July 2019, the Dutch Ministry of Justice and Security published a draft bill for this transposition. That proposal is disappointing in its lack of ambition to protect the interests of internet users. It does not limit the harm through an adequately narrow implementation of Article 17, as it could, nor does it ensure that the guarantees for users provided in the Directive are properly spelled out.

A more ambitious proposal is desperately needed

EDRi member Bits of Freedom strongly urges the Dutch government to come up with a more ambitious bill: a transposition in which the damage of the Directive is limited as much as possible and the rights of internet users are protected as much as possible. Because this is a particularly complex legal matter, Bits of Freedom also recommends that, prior to the drafting of the bill, an investigation be carried out into the leeway a Member State has to limit the damage of the Directive. This research could be carried out by academics with expertise in the field of copyright.

The Netherlands must do better

In short, the Netherlands must do better. The fact that the Directive has been adopted does not mean that the battle is lost. There is still a lot that can be done to limit its potential negative impact. Here, too, hard work pays off: you reap what you sow.

Bits of Freedom
https://www.bitsoffreedom.nl/

Come on government, stand up for us! (only in Dutch, 29.08.2019)
https://www.bitsoffreedom.nl/2019/08/29/kom-op-kabinet-kom-op-voor-ons/

Come on government, stand up for us! (11.09.2019)
https://www.bitsoffreedom.nl/2019/09/11/come-on-government-stand-up-for-us/

Bits of Freedom’s advice to the Dutch government (only in Dutch, 26.08.2019)
https://www.bitsoffreedom.nl/wp-content/uploads/2019/08/20190826-inbreng-bitsoffreedom-consultatie-implementatie-auteursrechtrichtlijn.pdf

Does #article13 protect users against unjustified content blockages? (only in Dutch, 25.03.2019)
https://www.bitsoffreedom.nl/2019/03/25/beschermt-artikel13-gebruikers-tegen-onterechte-content-blokkades/

Bill of Implementation Directive Copyright in the Digital Single Market (only in Dutch)
https://www.internetconsultatie.nl/auteursrecht

NGOs call to ensure fundamental rights in copyright implementation (20.05.2019)
https://edri.org/ngos-call-to-ensure-fundamental-rights-in-copyright-implementation/

(Contribution by Rejo Zenger, EDRi member Bits of Freedom; translation from Dutch to English by Bits of Freedom volunteers Celeste Vervoort and Martin van Veen)
