On 26 February 2019, European Digital Rights and partner organisations from across Europe are re-launching the campaign SaveYourInternet.eu – with new items in the “toolbox”. Today, we add to our website the action prepared by our Austrian member epicenter.works: Pledge2019.eu. The campaign, managed by the EDRi network, has become the main platform for concerned citizens who want to contact EU policy makers about the proposed implementation of upload filters in the European Union.
Pledge2019.eu allows voters from all EU Member States to call their representatives free of charge and convince them to pledge to reject the upload filters included in Article 13 of the controversial proposal for the EU Copyright Directive. Citizens are encouraged to consider parliamentarians’ stance on Article 13 when voting in the European Parliament elections in May 2019.
The final vote in the European Parliament may take place as early as March, with the exact date yet to be announced. The 751 representatives (as of today) from all Member States will then have the option to reject upload filters in the copyright Directive.
However, misinformation fuelled by private interests has tried to depict these concerned citizens as “bots”.
We are very close to getting rid of upload filters and obtaining a more balanced copyright Directive. Citizens need to raise their voices one last time and use the EU elections in May to build a democratic echo around the #SaveYourInternet chorus.
On 28 January, EDRi member Panoptykon joined a complaint against Google and the Interactive Advertising Bureau (IAB) in Poland, after it had become clear that the advertising categories provided by these entities are enabling the processing of extremely sensitive data of European citizens. On 20 February, new evidence was published proving that the IAB was all along aware of the incompatibility of its systems with the General Data Protection Regulation (GDPR).
The background of the complaints
Besides Panoptykon’s complaint, proceedings have been launched with the national Data Protection Authorities (DPAs) in Ireland by Johnny Ryan of the browser company Brave, and in the United Kingdom by Jim Killock of EDRi member Open Rights Group (ORG) and by Michael Veale of University College London. The complainants agree that the “Real-Time Bidding” (RTB) standards that Google and the IAB define for the online advertising auction industry infringe Article 5(1)(f) of the General Data Protection Regulation (GDPR), because they broadcast highly sensitive personal data to thousands of companies. Bid requests are necessary in order to solicit bids from advertisers for the opportunity to show an ad to a person. However, the complainants argue that this can be accomplished safely with non-personal data. Instead, the IAB and Google standards permit labels such as “cancer”, “sexual health” (IAB), “substance abuse”, “eating disorder”, “right-” and “left-wing politics” (Google) to be broadcast along with unique identifiers and other personal data in bid requests. These data are protected as “special category” personal data under Article 9 of the GDPR.
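To make the complainants’ concern concrete, the sketch below mimics, in highly simplified form, the kind of record a bid request can carry and how it is broadcast to every bidder in the auction at once. The field names are illustrative stand-ins, not the actual OpenRTB schema, and the values are invented.

```python
# Illustrative sketch only: simplified stand-in fields, not the real
# OpenRTB bid request schema. Values are invented for demonstration.
bid_request = {
    "user_id": "a1b2c3d4-5678-90ab-cdef-1234567890ab",  # unique identifier
    "ip": "192.0.2.17",
    "url": "https://example.org/forums/health-support",
    "interest_categories": [
        "sexual health",       # IAB-style label
        "substance abuse",     # IAB-style label
        "left-wing politics",  # Google-style label
    ],
}

# Under the RTB model the request is sent to many bidders simultaneously;
# every bidder receives the full record, whether or not it wins the auction.
bidders = ["adco-1", "adco-2", "adco-3"]
received = {bidder: bid_request for bidder in bidders}
```

The point the complainants make is visible in the last two lines: once the record leaves the publisher, every recipient holds the sensitive labels alongside a unique identifier, with no technical mechanism to restrict what losing bidders do with the data.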
IAB Europe’s response has been that it merely provides a technical standard, which might or might not be used by their members to violate privacy laws. Their statement was immediately countered by the complainants, who said that the IAB cannot claim to be a mere bystander because it organises and encourages a system through which personal data is broadcast billions of times a day without adequate security. The online tracking industry has attracted heavy criticism from civil rights groups in the past for its lobbying against privacy enhancing technologies, for instance regarding their huge influence in the ePrivacy Regulation and in the context of the implementation of the Do Not Track Signal.
The AdTech Lobby’s myths that not even they themselves believe
On 20 February, new evidence was published proving that not even the IAB believes its own public statements regarding the GDPR compliance of its RTB system. E-mails disclosed through a freedom of information request included a document admitting that it is “technically impossible for the user to have prior information about every data controller involved in a real-time bidding (RTB) scenario” and that this would seem, “at least prima facie, to be incompatible with consent under GDPR”. Furthermore, the documents acknowledge that there is no technical way of limiting how personal data is used and shared after broadcasting it to thousands of vendors. This admission is further aggravated by concrete technical examples of how sensitive the data shared through the system can be, and of the extent to which pseudonymisation (meaning that data is kept separate from identifiable elements) is lacking in daily practice.
The evidence reveals a surprising openness within the AdTech industry about its likely lack of compliance with the GDPR. In any case, the argument that “only” organising the processing of personal data brings no responsibility for the subsequent uses of the system looked grossly over-simplistic from the start in light of European case law. Two recent decisions by the Court of Justice of the European Union (CJEU) suggest that the IAB’s counter-argument will not hold: Wirtschaftsakademie and Tietosuojavaltuutettu.
Tietosuojavaltuutettu is particularly relevant to the question of Google’s and the IAB’s responsibilities for the use of their RTB standards: the Court ruled that the global Jehovah’s Witnesses community is a joint controller of data processed solely by local member preachers, by virtue of its role as organiser and promoter of these activities. Clearly, this has implications for the IAB and Google.
It is difficult to foresee an exact timeline for the complaint procedures, but the authorities are expected to act as soon as possible. After all, the complaint concerns the core mechanism that enables the secretive profiling of every single person who sets foot online and the tracking of their private life.
Empowered through GDPR (and hopefully, soon, also by an ePrivacy Regulation), citizens and civil society now have the opportunity to reject the collection, broadcasting and ultimately capitalisation of the most private details of their lives. Surveillance Capitalism is starting to show signs of crumbling.
On 20 February, European Digital Rights (EDRi), along with ten civil society organisations from across the globe, responded to a public consultation on the Council of Europe’s Second Protocol to the Convention on Cybercrime (also known as the Budapest Convention).
The draft Protocol aims to establish international rules for cross-border access to personal data by law enforcement authorities from Council of Europe member countries. The Council’s Cybercrime Convention Committee (T-CY) sought contributions from stakeholders in particular on provisions relating to emergency mutual assistance as well as the languages to be used in such requests. The Second Protocol is to be adopted by the Parties to the Convention by December 2019.
EDRi and Electronic Frontier Foundation (EFF) coordinated a joint submission by civil society organisations to ensure that such emergency procedures would not be abused to circumvent legal safeguards protecting fundamental rights in the context of cross-border access to personal data. Our submission also upholds the right to an effective remedy by requiring the request to be translated into the language of the person whose data is being sought, so he or she can challenge the measure.
On 12 February 2019, the European Union Agency for Fundamental Rights (FRA) published an Opinion regarding the Regulation on preventing the dissemination of terrorist content online. On the same day, the European Data Protection Supervisor (EDPS) submitted its comments on the topic to the responsible committee in the European Parliament. These two texts complement EDRi’s analysis and the previous report prepared by three UN Special Rapporteurs on the proposal.
FRA: Substantial threats for freedom of expression
In its Opinion, FRA structures its criticism around four main areas.
First, it calls to improve the definition of “terrorist content”. The Opinion highlights the need to add to this definition the concept of “incitement” or giving specific instructions to commit terrorist offences. The definition of such instructions should be aligned with the Terrorism Directive and specific actions such as “providing specific instructions on how to prepare explosives or firearms”. Further, the text calls to limit the proposal to content disseminated to the public and to exclude from the Regulation’s scope certain forms of expression, such as content that relates to educational, journalistic, artistic or research purposes.
Second, FRA calls to ensure that fundamental rights safeguards are in place through “effective judicial supervision”. Currently, there is no mention in the proposal of any “independent judicial authority in the adoption or prior to the execution of the removal order”. FRA also points to the need to avoid a disproportionate impact on the freedom to conduct a business when having to react to notices for removals of terrorist content in a very short time-frame (up to one hour in the original proposal). FRA suggests instead a reaction time of 24 hours from the receipt of the removal order. Regarding safeguards in cross-border removal orders, the Opinion calls to ensure that the authorities of the Member State where the content is hosted are “empowered to review the removal order in cases where there are reasonable grounds to believe that fundamental rights are impacted within its own jurisdiction.” FRA thus encourages the EU legislator to require a notification by the issuing Member State to the host Member State – in addition to the notification to the hosting service provider – when the removal order is issued.
Third, FRA states that the proposal “does not sufficiently justify the necessity of introducing the mechanism of referrals”, and suggests distinguishing between content needing a removal order and content requiring a referral.
Fourth, the Opinion states that the proposed proactive measures of the Regulation come very close to a general monitoring obligation. This is not only prohibited by Article 15 of the EU’s eCommerce Directive, but also generally incompatible with individuals’ right to freedom of expression under Article 11 of the Charter of Fundamental Rights of the European Union. Thus, FRA proposes to delete from the Regulation text the obligation for Hosting Service Providers (HSPs) to introduce proactive measures.
EDPS: Concerns for the Regulation’s data retention and GDPR compliance
While the EDPS issued similar concerns regarding the definition of terrorist content and the “one hour rule”, it also issued some targeted comments on the concerns surrounding potentially privacy-intrusive elements of the Regulation proposal.
In the Regulation proposal, Hosting Service Providers have obligations to retain data of supposed terrorist content that they delete or disable access to on their platform. The EDPS expresses substantive doubts as to whether such obligations would be compliant with the case law of the Court of Justice of the European Union (CJEU). This assessment is based on the observation that the proposed measures, similarly to the Data Retention Directive that was struck down by the CJEU in 2014, do not lay down specific criteria regarding the time period and the access and use limitations for the retained data. The EDPS is further not convinced of the overall usefulness of data retention measures in the Terrorist Content Regulation, given that the text obliges HSPs to promptly inform the competent law enforcement authorities of any evidence regarding terrorist offences.
On the proposal’s foreseen proactive measures, the EDPS stated that automated tools for recognising and removing content would likely fall under Article 22 of the General Data Protection Regulation (GDPR), which regulates citizens’ rights in automated decision-making and profiling activities. This would, in turn, require more substantive safeguards than the ones provided in the Commission’s proposal, including case-specific information to the data subject, understandable information about how the decision was reached, and the right to obtain human intervention in any case.
The observations of the EU’s most important fundamental rights institutions feed into a steady stream of criticism of the proposal. These represent noteworthy positions for policy makers in the legislator institutions, particularly in the European Parliament’s LIBE, CULT and IMCO committees that are currently adopting their positions. It is now more evident than ever that the proposed Terrorist Content Regulation needs substantive reform to live up to the Union’s values, and to safeguard the fundamental rights and freedoms of its citizens.
The proposal for a new copyright Directive was originally aimed at modernising the copyright framework. However, it has fallen short of the initial expectations. Instead, the current proposal for the Directive text forces the implementation of upload filters and brings only minor improvements in other areas. In effect, the proposal could lead to unlawful restrictions on freedom of speech and reduce access to knowledge.
Read below a brief summary of the most significant developments in the Copyright reform.
September 2016 – EU Commission proposal
After years of public consultations (see here and here), the European Commission published a disappointing proposal in September 2016. European Digital Rights analysed the text.
June 2018 – EU Parliament Legal Affairs Committee (JURI) Report
Following months of intense debates and detailed analysis, the Report of the European Parliament’s lead committee – the Committee on Legal Affairs (JURI) – ignored key recommendations from the Committees on Internal Market and Consumer Protection (IMCO) and Civil Liberties (LIBE). By doing so, the JURI Committee missed a historic opportunity to put a workable compromise on the table.
Thanks to massive actions by civil society, the European Parliament denied JURI a mandate to negotiate the proposed text on behalf of the entire Parliament in the trilogues between the EU institutions. With JURI’s proposal threatening the open internet and citizens’ right to freedom of expression, the text was sent back to the drawing board and opened for re-drafting.
September 2018 – European Parliament plenary vote / Trilogue mandate
The European Parliament ignored attempts from Members of the European Parliament (MEPs) Catherine Stihler, Julia Reda and Marietje Schaake to bring a better version of Article 13 to the table. After presenting a “reformed text” which contained implicit upload filters, the leading JURI Committee MEP Axel Voss managed to push the European Parliament (in plenary) to vote in favour of them. The new text was passed with a relatively small margin of votes (366 for, 297 against).
October 2018 – Start of the trilogues
The European Parliament entered the trilogues, the closed-door negotiations with the Council of the European Union and the European Commission.
14 February 2019 – End of trilogues: Upload filters strike back
On 14 February 2019, the trilogue negotiations predictably concluded with an agreement on the final copyright Directive text that included Article 13’s upload filters. The agreement echoed the position of the two most powerful states, France and Germany, supported by the music industry. The text largely ignored the concerns of the public, internet luminaries, the UN Special Rapporteur on Freedom of Expression, civil society organisations, programmers, and academics.
14 February 2019 – “mob and bots”
The European Commission celebrated the trilogue text and called the adversaries of Article 13 and Article 11 “mob and bots”. The blog post was harshly criticised, and thousands of people reacted with protests. The European Commission deleted the blog post afterwards, justifying the deletion by saying that it had been “misunderstood”.
20 February 2019 – COREPER approval of the text agreed during trilogues
The Committee of Permanent Representatives (COREPER 1) approved the text agreed during trilogues.
What NEXT?
26 February 2019 – JURI Committee votes on trilogue text
It is expected that the JURI Committee will endorse the text negotiated in the trilogues.
15-18 April 2019 (TBC) – Final vote in the EU Parliament’s plenary
All 751 MEPs will vote on the entire Copyright Directive text. This will be a “Yes / No” vote.
During the past year, our work to defend citizens’ rights and freedoms online has gained an impressive visibility – we counted more than three hundred mentions! – in European and international media. Below, you can find our press review 2018.
EDRi’s General Assembly will elect this year three Board members to replace two outgoing Board members and fill in one vacant position. Once the three Board members are elected, the new Board will select its Treasurer and Vice-President. 2019 is a special year, as EDRi’s General Assembly will also decide which Board member becomes the EDRi President.
All EDRi members are invited to apply to become a member of the EDRi Board.
As an EDRi Board Member, you will help shape the future of the organisation and the network and advance our mission to promote and protect human rights in the digital environment. You will have a responsibility as an employer of the EDRi office and vis-à-vis the members.
When electing new Board members, the General Assembly will give due consideration to ensuring that the composition of the Board reflects the diversity of the EDRi membership and that there is a gender and geographic balance among the Board members. EDRi should seek to achieve representation of diverse cultural backgrounds on the Board. The GA will also consider the skills and experience needed to effectively deliver the functions of the Board. Such skills and experience include, but are not limited to: experience as an NGO board member or board chair or working with boards, fundraising knowledge, human resources insight, public relations, legal or technical skills, political experience, outreach/community experience, and similar.
To be eligible for election:
Any candidate must declare that they have permission to stand from their own organisation, if their organisation is a member of EDRi;
Any candidate joins the Board in their personal capacity and not as a representative of an organisation;
Any candidate must be endorsed. Members may set their own procedures for endorsements. When an individual steps forward to run for a position, he or she needs to seek endorsement from a member organisation. Organisations will have the opportunity to withdraw endorsement before the Board elections;
Any candidate shall have the opportunity to make a presentation for their candidacy in writing and in person at the General Assembly;
Any candidate must not have served on EDRi’s Board for more than three of the previous nine years, discounting any years prior to 2017;
Any candidate must declare any potential conflicts of interest, and anything else that might affect their eligibility, such as criminal convictions (as defined in EDRi’s Conflict of Interest Policy).
Required skills:
Finances: budgeting and accounting (only for the position of Treasurer);
Experience in advocacy, activism, outreach/community and campaigning for digital human rights;
Experience with NGO board membership, board chairing or working with boards, or a similar role;
Fluent command of spoken and written English.
Additional skills (to complement the current Board):
Fundraising (especially corporate donations, high net worth individuals, foundation grants);
Knowledge of politics and culture in Northern and/or Southern Europe;
Knowledge of very small digital rights organisations (volunteers only / staff under 3);
Human Resources and governance;
Legal, financial and policy oversight;
Response to emerging issues;
Network development, liaison with the General Assembly.
The Board:
Governs EDRi in compliance with its statutes and internal working regulations;
Meets remotely on a quarterly basis to discuss general management issues and twice in person for more detailed face-to-face discussions and strategic planning workshops (Board travel is reimbursed by the organisation); service on the Board should not require more than the equivalent of 10 to 12 days (80 – 100 hours) per year (excluding General Assemblies);
Focuses on strategy and accountability rather than day-to-day operations;
Participates in deliberations and decisions in matters of policy, finance, fundraising, programs, human resources and advocacy;
Supports the growth of the network and deals with membership issues as they arise;
Provides advice on how EDRi should respond to emerging issues outside of the strategic plan;
Assists in developing and maintaining positive relations among EDRi members, staff, donors and other stakeholders to enhance EDRi’s mission;
Provides all legal, policy and financial oversight to ensure compliance with applicable laws and regulations;
Reviews the internal regulations and policies and recommends any changes for approval by the Board and/or General Assembly as required;
Oversees the Executive Director(s), including supporting and participating in the annual evaluation of the Executive Director(s);
Reviews policy and other recommendations received from the Board and senior management staff, for decision.
How to apply:
To apply, please fill in the application form which includes a self-assessment of your skills and a brief biography (350 words max.) outlining your interest, experience and background. In addition, please include in your application a statement of endorsement by an EDRi-member and the signed Conflict of Interest Form.
The Election Committee would be happy to answer any questions you may have about what being a Board member entails, or to provide contact details of EDRi members. Write to elections.committee[at]edri.org for further info.
The closing date for applications is 3rd March 2019. The vote will take place during the General Assembly, on 7th April 2019 in London.
The behind-closed-doors discussions between the European Parliament negotiating team, EU Member States and the European Commission on the copyright Directive finalised last night with an agreement. The text, prepared by France and Germany, will be put to a vote between March and April in the European Parliament and could become law soon afterwards. The copyright Directive, originally aimed at “modernising” the copyright framework, has fallen short of those expectations. Instead, it forces the implementation of upload filters and brings only minor improvements in other areas. The proposal could lead to unlawful restrictions on freedom of speech and reduce access to knowledge.
The secret discussions have ended with the worst version of the “Censorship machine” we have seen so far. Citizens need to react, once again, to prevent these upload filters that threaten our freedom of expression from becoming reality.
– said Diego Naranjo, Senior Policy Advisor at European Digital Rights
If the unofficial text available is confirmed, it is in essence a transposition of the bilateral Franco-German deal reached last week. In its current version, Article 13 will bring direct liability for hosting providers.
Internet hosting services would automatically be considered to be performing a “communication to the public” whenever copyrighted material (or “other subject matter”) is hosted by them, regardless of whether it was uploaded by the company itself or by a user. The internet services shall then make “best efforts” to conclude licensing agreements with rightsholders for any piece of copyrighted material (potentially every article, image, audio file and video uploaded to the internet). It is unclear how that will work in practice. Nevertheless, the elimination of the intermediary liability exception will likely leave companies no choice but to monitor every piece of content that is shared and uploaded on their platforms.
The only services to be exempted from liability, as introduced in the final deal, would be the few platforms that fulfil all three cumulative criteria: the online platform is (a) less than three years old, (b) making less than 10 million euro in annual turnover, and (c) visited by fewer than 5 million unique visitors a month.
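Because the criteria are cumulative, failing any single one is enough to lose the exemption. A minimal sketch of that logic (the function name and thresholds follow this article's summary, not the legal text itself):

```python
def exempt_from_liability(age_years: float,
                          annual_turnover_eur: int,
                          monthly_unique_visitors: int) -> bool:
    """Hypothetical helper: a platform escapes the new liability regime
    only if ALL three cumulative conditions from the deal hold."""
    return (age_years < 3
            and annual_turnover_eur < 10_000_000
            and monthly_unique_visitors < 5_000_000)
```

For example, a two-year-old platform with 5 million euro in turnover and 1 million monthly visitors would qualify, but crossing any one threshold, such as turning three years old, removes the exemption entirely.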
One MEP and two big Member States have turned music investors into legislators, despite input from academics, the inventor of the World Wide Web, civil society and even the UN Special Rapporteur on Freedom of Expression. It is now up to people across the EU to set the record straight and make their voice heard.
– added Diego Naranjo (EDRi).
This proposal is worse than many of the versions previously discussed in the European Parliament. It further ignores the main critique against Article 13: upload filters empower (mostly US-based) Big Tech companies to decide on restrictions to freedom of speech in the EU.
The vote on the final text will likely be cast in the EP plenary in late March or early April. We will continue to push for a substantial reform of the flawed provisions in the run-up to the vote. EDRi calls on everyone committed to a free and uncensored internet to raise their voice and contact MEPs through the #SaveYourInternet campaign.
The 14th Big Brother Awards (BBA) in the Czech Republic will take place on 14 February 2019. Awards for the biggest privacy intruders in 2018 will be announced by EDRi member Iuridicum Remedium (IuRe). The Big Brother Awards, based on a concept created by EDRi member Privacy International, are intended to draw public attention to privacy issues and alarming trends in data privacy.
A jury comprising experts on new technologies, lawyers, human rights defenders and journalists will choose the winners out of nominations sent in by the general public. The Czech Big Brother Awards are granted in four categories:
the award for the biggest privacy intruder in the long-term perspective
the award for the biggest business privacy intruder
the award for the biggest administrative privacy intruder
the award for Big Brother’s Quote (for the most appalling quote on a privacy-related topic)
Among this year’s nominees are:
Facebook for large-scale data leaks
Google for continuously tracking users’ locations without their consent
the People’s Republic of China for using algorithms and big data to select people for re-education
companies for using chip payment methods at music festivals to collect clients’ data
initiatives requiring the introduction of CCTV in schools
In addition to the Awards for the privacy intruders, there is also a positive award, named after Edward Snowden, that goes to people or projects that stand up for privacy.
The 2017 Awards were given to the Ministry of Industry and Trade (for data retention), to Equa bank (for forcing its clients to agree to provide the so-called TelcoScore), and to Member of Parliament Jiří Běhounek (for his proposal for an amendment to the Act on Health Services that introduced unrestricted access to electronic healthcare documentation). The positive Edward Snowden Award went to Open Whisper Systems (for developing the open source Signal application for end-to-end encrypted mobile communication).
The winners of the 2018 Awards will be revealed during a press conference on the morning of 14 February at the Cross Club Café in Prague and announced on the Czech BBA website.
After a six-year hiatus, leading Bulgarian digital rights organisations have revived their country’s version of the Big Brother Awards. Originated by EDRi member Privacy International in 1998, the concept of the Big Brother Awards has been adopted by multiple civil society organisations in Europe and beyond. The event aims to increase awareness about the misuse of personal data and the harms that this can bring to individuals and society at large.
With companies and private institutions collecting, storing and processing enormous amounts of personal data in the pursuit of more efficient marketing and greater social control, the event committee received many nominations. EDRi member Internet Society Bulgaria (ISOC-Bulgaria) solicited suggestions from the public. These were evaluated by a jury of renowned public figures, including lawyers, academics, consultants, journalists, and civil rights activists. The jury subsequently presented two Awards: one to the state institution and the other to the private organisation that had “excelled” at violating citizens’ privacy.
The competition was fierce, with government institutions like the National Security Agency, the State Prosecutors Office, and the Anti-Corruption Commission being nominated. Finally, the 2018 Big Brother Award for a state organisation went to the Bulgarian Parliament, for its adoption of a new personal data law. In addition to setting new standards for the protection of citizens’ data against misuse by companies and institutions, the law sets specific rules for journalists and their reporting when it touches upon the lives and activities of individuals. The law sets limitations around the impact that disclosing data will have on the personal life of the subject, the circumstances in which personal data becomes known to a journalist, and the importance of the personal data or its public disclosure to the public interest. All of these stipulations, when subject to interpretation by media outlets and courts alike, could result in a threat to media freedom, especially when it comes to reporting on the activities of public officials.
“On their own, the criteria are okay. They indeed aid protection, and are in the direction of increasing sensitivity. The issue we have with them is that they can have side effects, and have an adverse effect on another human right — the right to free expression and media freedom. They can be used as rules for conducting the journalistic profession, which in practice can amount to actual censorship. Therefore, the award is not so much for the criteria, as it is for not taking into account the risk of turning them into a threat to media freedom,” explained jury member and media expert Georgi Lozanov.
Dimitar Ganchev, member of the ISOC-Bulgaria board, added that another reason for which the Parliament was granted the Award was that the vote on the law was taken without any debate and with very low attendance by lawmakers.
The private sector organisation that clinched the Big Brother Award is the Center for Education and Qualification of Pedagogical Specialists Ltd., thanks to a massive personal data leak that affected more than 9 000 students and more than 2 000 of their parents. Lists with their personal data were exposed to the public through various mechanisms, including posting the details on a major social network. Other nominees in the private sector category included Municipal Forestry from Elin Pelin, Sofia University St. Kliment Ohridski, and Trimoncium, a medical centre in Plovdiv.
None of the nominees responded to the invitation to attend the ceremony. However, the event enjoyed wide press coverage.