18 Jun 2018

We’re looking for policy interns to join our Brussels team. Is that you?

By Kirsten Fiedler

European Digital Rights (EDRi) is an international not-for-profit association of 39 digital human rights organisations from across Europe. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, freedom of expression, and access to information.


Join EDRi now and become a superhero for the defence of our rights and freedoms online!

The EDRi office is currently looking for two interns to support our Policy team in Brussels. This is your opportunity to get first-hand experience in EU policy-making and to contribute to promoting digital rights and freedoms across Europe. This six-month internship starts on 3 September 2018 and ends on 28 February 2019. The internship is paid 750 EUR per month.

Key tasks:

  • Research and analysis on data protection, privacy, copyright; or on surveillance & law enforcement, freedom of expression and intermediary liability, net neutrality and digital trade;
  • Monitoring and reporting on international, EU and national policy developments;
  • Organising and participating in meetings and events;
  • Writing articles for the EDRi-gram newsletter;
  • Assisting with the preparation of draft reports, position papers, presentations and other internal and external documents;
  • Development of public education materials;

Qualifications:

  • A demonstrated interest in and enthusiasm for human rights and technology-related legal or policy issues;
  • Good understanding of EU decision-making;
  • Experience in the fields of data protection, privacy, copyright, intermediary liability & freedom of expression, surveillance & law enforcement, net neutrality or digital trade would be an asset;
  • Excellent research and writing skills;
  • Fluent command of spoken and written English; other languages are a plus;
  • Computer literacy; advanced technical knowledge is a plus.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to julien.bencze(at)edri.org by 1 July 2018.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. People from all backgrounds are encouraged to apply and we strive to have a diverse and inclusive working environment.

The closing date for applications is 1 July 2018. Please note that due to scarce resources, only shortlisted candidates will be contacted.

Find out more about Policy internships at EDRi.

13 Jun 2018

12 days of digital rights in Brussels. Was it Christmas?

By ApTI

This article is a short story about my participation in the Brussels exchange programme. Thanks to the Digital Rights Fund and Wikimedia, I was able to spend two and a half weeks (12 working days) with like-minded people and organisations and bring a new blast of energy to my efforts to fight the copyright censorship machine and snippet tax.

Here is how it went. First the advocacy lessons, then the numbers.

----------------------------------------------------------------- Support our work - make a recurrent donation! https://edri.org/supporters/ -----------------------------------------------------------------

May 16 was an intense day. It was the first day of my exchange in Brussels and my first visit to the new EDRi office. However, I had to put the kind welcome from the EDRi team behind me quite quickly, because I got an impromptu call from an MEP to reschedule a meeting about the new copyright proposal for that day. This meeting turned out to be like no other. It lasted four hours, and the output was a letter, sent by the MEP to the JURI committee members, asking them to vote against Articles 11 and 13.

Victory? Well, looking back I’m thinking: wow, this was quite something! But while it happened I didn’t register it as a success. The main reason for this is that the meeting really was a power show. The MEP had an important message for me. No matter how well prepared you are, no matter how well you know all the details of this and that article, what always wins is the ability to explain your position in language so simple yet impactful that it inspires a positive reaction and support for what you are proposing. Therefore, all my in-depth analysis and all my legal arguments were distilled into blunt, stark messages wrapped in this short letter. This was the first reality check for my advocacy skills.

Did this letter influence or change anything? Hard to tell. Looking at Julia Reda’s vote count, there’s still a lot to be done. By the way, did you join our Action Day on 12 June? If not, even after 20 June it’s not too late to pick up the phone and Save Your Internet. Every call, email, post, video, and shout for your freedom of expression counts!

Coming back to the advocacy lessons learned, the second reality check for my advocacy skills was a piece of advice I received from one of the decision-makers I met: the most effective format in which to send amendments, and the one surest to be taken into consideration, is a simple table with two columns — one with the proposed legal text, and the other with how I want it to be changed. Simple and straightforward.

However, what I also know from previous experience is that decision-makers need long, detailed analyses to point to and to base their decisions on. In-depth analysis and formal opinions should therefore not be underestimated; they just need to be complemented with catchy, simple, distilled documents. How small organisations can find the resources to do both is something I still have to work out in my next quest to advance digital rights movements.

Now here’s what the exchange programme looked like in numbers:

    • 3 MEP meetings
    • 2 Member State Permanent Representation meetings
    • 3 copyright reform document reviews
    • 7 letters on copyright reform sent to decision makers
    • 2 European Parliament hearings on Cambridge Analytica
    • 1 ePrivacy meeting with Council attachés and civil society, plus lots of networking and Belgian fries 🙂

But this is just the content part. The other half of my exchange was focused on how to grow an organisation. On the admin side, Kirsten and Katarina fully immersed me in strategic planning and fundraising. As a concrete result, I built with their help a Case for Support document for ApTI, which will be used in fundraising activities. While there are a lot of tips and tricks that I can implement immediately, I am also more confident about how to take strategic planning by the horns once I’m back.

Big “Thank You” to the entire EDRi team for welcoming me into their busy office and to all EDRi members for supporting my Digital Rights Fund application and making this possible! Also, many thanks to Wikimedia for knowledge sharing and preparatory meetings!

ApTI tweets in English @ApTI_ro and started a Bucharest Digital Rights Meetup channel.

Read more:

ApTI
https://www.apti.ro/

EDRi’s “Brussels Exchange Programme” – turning theory into practice (07.02.2018)
https://edri.org/edris-brussels-exchange-programme-turning-theory-into-practice/

(Contribution by Valentina Pavel, EDRi member ApTI, Romania)

13 Jun 2018

EU – Japan trade agreement undermines algorithmic transparency

By Vrijschrift

The EU trade agreement with Japan undermines algorithmic transparency, Dutch EDRi member Vrijschrift wrote in a letter to the Dutch Parliament. In order to have regulatory supervision, we need access to source code and algorithms. The Volkswagen emissions scandal has shown that devices can be programmed to be misleading. In addition, algorithms in decision making software can be biased. Facebook’s role in elections and referendums shows that the use of personal data is not only a civil rights issue, but may compromise the integrity of our institutions.


Politicians call for algorithmic transparency and software audits. However, the EU-Japan trade agreement’s software code clause limits the possibilities to audit software and algorithms. Under the agreement’s Article 8.73, the EU and Japan may not require the transfer of, or access to, source code of software owned by a person of the other Party. The article provides some exceptions, but they have a limited scope or are subject to strict conditions. The clause is in conflict with important policy objectives; Vrijschrift calls for a parliamentary scrutiny reservation.

You can read Vrijschrift’s letter to the chairman of the trade committee Raymond de Roon below:

We would like to express our concerns regarding the trade agreements with Japan and Singapore. These agreements fall under the EU’s competence; no ratification by the Netherlands is necessary. The EU could decide to sign the treaties as early as 26 June. We believe that the House should make a parliamentary scrutiny reservation.

It has recently become clear that the protection of personal data is not just a matter of civil rights. The scandal surrounding Facebook has shown that also the integrity of our institutions is at stake. The European Commission and European politicians (e.g. Merkel and Verhoeven) rightly want greater algorithmic transparency. However, the EU-Japan trade agreement’s source code clause will undermine the investigation of algorithms. A clear conflict between an important policy objective and a trade agreement.

The European Commission recently proposed a stronger safeguard for the protection of personal data in trade agreements. This safeguard has not been included in the treaties with Japan and Singapore, even though these treaties require the parties to allow cross-border data traffic. The Commission has done only half the job, which we consider irresponsible in the light of the need to protect civil rights and the integrity of our institutions.

The treaties with Japan and Singapore limit the possibilities for reforming copyright and patent law. The treaty with Singapore contains higher damages than the ACTA treaty, which was rejected by the European Parliament.

The proposed treaties deserve serious scrutiny; we believe that the House should create room for this.

Read more:

Vrijschrift letter, English translation: EU trade agreement with Japan undermines algorithmic transparency
https://www.vrijschrift.org/serendipity/index.php?/archives/223-EU-trade-agreement-with-Japan-undermines-algorithmic-transparency.html

Vrijschrift letter (original in Dutch):
https://www.vrijschrift.org/serendipity/index.php?/archives/222-Handelsverdrag-met-Japan-ondermijnt-algoritmische-transparantie.html

EU-Japan trade agreement enables Internet of Cheating Things
https://people.vrijschrift.org/~ante/japan/eu-japan-e-commerce.pdf

EU-Japan trade agreement not compatible with EU data protection
https://people.vrijschrift.org/~ante/japan/vrijschrift-japan-data-flows.pdf

EU-Singapore trade agreement not compatible with EU data protection
https://people.vrijschrift.org/~ante/singapore/Singapore-data-flows.pdf

EU-Japan trade agreement’s intellectual property chapter limits options for reform
https://people.vrijschrift.org/~ante/japan/IP-EU-Japan2.pdf

ACTA-plus damages in EU-Singapore Free Trade Agreement
https://blog.ffii.org/acta-plus-damages-in-eu-singapore-free-trade-agreement/

(Contribution by Ante Wessels, EDRi member Vrijschrift, the Netherlands)

13 Jun 2018

ePrivacy for Children: What is Data Protection Culture?

By Alternatif Bilisim

The General Data Protection Regulation (GDPR) attracted widespread attention and comment in recent weeks when it came into force on 25 May 2018. Having taken several years to get from being proposed by the European Commission to entering into force, the GDPR has been designed as a concerted, holistic and unifying effort to regulate personal data protection in the digital age.


At a time when many public, private and third sector organisations have only recently ‘gone digital’ and when data has very rapidly come to be seen as ‘a new currency’, the scope of application of the GDPR is vast. Serious fines can be applied to firms that do not abide by the new rules. This is no coincidence, of course; the recent Cambridge Analytica and Facebook privacy violations forced the public debate to grow, and with it awareness of what is at stake.

It is not only the scandals on the surface that have piqued the interest of the average user, though; the capital and energy spent on the data-gathering fetish of social media platforms is also a key determinant of the process. The right to erasure is also more easily applicable from now on, signifying more meaningful control over data and the erosion of the post-capitalist surveillance society. However, in the decade of tl;dr (too long; didn’t read) and post-truth, this type of detailed regulation might be a little too complicated to understand for internet users of all ages.

Through the lens of a researcher-mother, one is quickly struck by the image of the hyper-socialised millennial generation on massive platforms like Facebook and Instagram. The GDPR brings special conditions for children’s data. Yet living in Turkey with your child right beside you is no comfort; you are still spending 16+ hours of your day connected to inter-networks.

The GDPR makes some specific requirements in respect of children’s data, for reasons set out in recital 38: “Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.”

While this statement has much merit, it is only an explanatory recital, guiding implementation of the GDPR but lacking the legal force of an article. In a recent London School of Economics Media Policy Project roundtable, it became clear that there is considerable scope for interpretation, if not confusion, regarding the legal basis for processing (including, crucially, when processing should be based on consent), the definition of an information society service (ISS) and the meaning of the phrase “directly offered to a child” in Article 8 (which specifies a so-called “digital age of consent” for children), the rules on profiling children, how parental consent is to be verified (for children younger than the age of consent), and when and how risk-based impact assessments should be conducted (including how they should cover intended or actual child users). It is also unclear in practice just how children will be enabled to claim their rights or seek redress when their privacy is infringed.

Already there are some surprises. WhatsApp, currently used by 24% of UK 12-15 year olds, announced that it will restrict its services to those aged 16+, despite the fact that in many European countries the digital age of consent is set at 13. Instagram is now asking its users whether they are under or over 18 years old, perhaps because this is the age of majority in the United Nations Convention on the Rights of the Child (UNCRC)? We will see how things unfold in the coming months.

In the meantime, a few suggestions are made by Sonia Livingstone of the London School of Economics in the light of a new project exploring how children themselves understand how their personal data is used, and how their data literacy develops between the ages of 11 and 16: (1) conducting focus group research with children; (2) organising child deliberation panels to formulate child-inclusive policy and educational/awareness-raising recommendations; and (3) creating an online toolkit to support and promote children’s digital privacy skills and awareness. The young generation reminds us once again of our responsibility to create a commons data culture at grassroots level.

Do such changes mean effective age verification will now be introduced (leading to social media collecting even more personal data?), or will the GDPR become an unintended encouragement for children to lie about their age to gain access to beneficial services, as part of their right to participate? How will this protect them better? And what does this increasingly complex landscape mean for media literacy education, given that schools are often expected to overcome regulatory failures by teaching children how to engage with the internet critically? As in the case of Turkey, teachers’ digital literacy skills need a serious and rapid boost, and, even more importantly, policies regarding internet governance and community education must be redrafted.

Translated from the Original Text by Asli Telli Aydemir, Alternative Informatics (Alternatif Bilisim)
You can read the original text in Turkish here.

Read more:

A Digestible Guide to Individual’s Rights under GDPR (29.05.2018)
https://edri.org/a-guide-individuals-rights-under-gdpr/

EDRi General Director Joe McNamee live interview on TRTWorld
https://www.youtube.com/watch?v=pKvcZz8TKuE

GDPR Explained Campaign
https://gdprexplained.eu

Time to Disagree Campaign
https://timetodisagree.eu

EDRi’s privacy for kids booklet
https://edri.org/privacy-for-kids-digital-defenders/

(Contribution by Alternatif Bilisim, EDRi member, Turkey)

13 Jun 2018

Litigation against the Danish government over data retention

By IT-Pol

Despite two rulings from the Court of Justice of the European Union (CJEU) in 2014 and 2016 against general and undifferentiated (blanket) data retention, a majority of EU Member States still have national data retention laws in place. Denmark is one of these Member States.


Two months after the Tele2 judgment of 21 December 2016 (joined cases C-203/15 and C-698/15), the Danish Minister of Justice told the Legal Affairs Committee of the Parliament that the Danish data retention framework does not comply with EU law because it covers every subscriber. At the same time, the Minister of Justice refused to repeal the illegal data retention provisions and argued that there was not a specific deadline for how quickly EU Member States are required to adapt their national laws to comply with a judgment from the CJEU.

As of June 2018, the Danish Ministry of Justice is officially waiting for guidance from the European Commission before a new data retention law can be proposed to the Danish Parliament. The European Commission has promised to deliver guidance on how Member States can comply with the Tele2 judgment, which bans blanket data retention but does not rule out targeted data retention. From the outside, it would appear that the Danish government has very little interest in resolving the current deadlock, since the existing data retention provisions are still in place. The same applies to the provisions for access to the retained data, which also require substantial changes to comply with the second part of the Tele2 judgment. For example, access to certain types of retained data, in particular location data, is not limited to cases involving serious crime.

A majority in the Danish Parliament has so far approved the government’s plan to postpone the revision of the now illegal data retention law. However, the Danish government is facing a new challenge, since the Association Against Illegal Surveillance filed a lawsuit on 1 June 2018 against the Minister of Justice demanding the immediate annulment of the data retention provisions. The Association Against Illegal Surveillance was formed in the beginning of 2018, shortly after the Minister of Justice announced that, contrary to his earlier expectations, there would not be a revision of the data retention law in the parliamentary year 2017-18. The association, led by spokesperson Rasmus Malver, who has a professional background in human rights law, initiated a very successful crowdfunding campaign on social media, which has so far collected 60,000 euros from more than 1000 individuals, as well as some larger donations from civil rights organisations, including Amnesty International Denmark.

On 4 June 2018, the data retention lawsuit against the Danish government received a considerable economic boost when the Danish Civil Rights Fund (Borgerretsfonden) backed the lawsuit with a guarantee to cover up to 50% of the legal expenses. This effectively doubles the contributions received through crowdfunding. The objective of the Civil Rights Fund is to promote the rights of the individual versus the state and to provide legal assistance to individual citizens whose rights have been violated by public authorities. Law professor and board member of the Civil Rights Fund Hanne Marie Motzfeldt gave the following explanation for the large economic donation to the data retention lawsuit, in an interview with the Danish newspaper Politiken: ”The foundation of our form of government is that public authorities comply with the law. If they fail to do so, we must use the courts.”

The Minister of Justice has not yet responded to the lawsuit. Denmark does not have a constitutional court or a specialised legal system for challenging the validity of laws or administrative provisions. Such legal challenges are handled by the ordinary court system as civil litigation procedures, starting at the lowest court level (the District Courts). The Association Against Illegal Surveillance has applied to have the case transferred to the High Court, from which the court of appeal would be the Danish Supreme Court. If approved, this is likely to save time.

Since legal challenges against laws or government decisions occur very infrequently in Denmark, it is difficult to predict how long the legal proceedings will take. The Ministry of Justice has various options for delaying the case. One of these options is to ask the court to rule on whether the plaintiff has legal standing in the case. The plaintiff has carefully addressed the issue of legal standing in the complaint to the court, but a dispute over legal standing will inevitably delay the proceedings and the possibility of getting a court ruling against data retention in Denmark.

Read more:

Denmark: Our data retention law is illegal, but we keep it for now (08.03.2017)
https://edri.org/denmark-our-data-retention-law-is-illegal-but-we-keep-it-for-now/

Eurojust: No progress to comply with CJEU data retention judgements (29.11.2017)
https://edri.org/eurojust-no-progress-to-comply-with-cjeu-data-retention-judgements/

Website of the Association Against Illegal Surveillance (”Foreningen mod ulovlig logning”)
https://ulovliglogning.dk/en/

Website of the Civil Rights Fund (”Borgerretsfonden”), in Danish
http://borgerretsfonden.dk/

The citizens’ lawsuit against the Minister of Justice receives a large economic boost, Politiken (in Danish, 04.06.2018)
https://politiken.dk/indland/art6559885/Borgeres-retssag-mod-S%C3%B8ren-Pape-f%C3%A5r-stort-%C3%B8konomisk-boost

(Contribution by Jesper Lund, EDRi member IT-Political Association of Denmark (IT-Pol), Denmark)

13 Jun 2018

Wiretapping & data access by foreign courts? Why not!

By Anamarija Tomicic

After the European Commission published two new legislative proposals to allow law enforcement authorities to reach across EU borders and access data directly from service providers, the EU Member States started working on this new “e-evidence” package. The proposal has so far drawn widespread criticism from service providers and civil society organisations, including EDRi, because it raises serious questions concerning privacy, data protection, and basic principles such as the right to defence and access to effective remedies. At the request of a very few Member States, the EU Council discussed adding two new elements to the already broad scope of the proposals: direct access to data and real-time interception of data. These additions raise yet more concerns regarding individuals’ rights.


While the objective of this legislation is to improve criminal investigations by facilitating law enforcement authorities’ cross-border access to data in another EU Member State, the proposal creates a shortcut, giving private companies a role previously carried out by judicial authorities. According to the existing proposal, companies that store individuals’ data – including big companies such as Facebook, Google and Microsoft, but, more crucially, small companies without the resources or expertise of the internet giants – will be obliged to give access to individuals’ data if demanded by law enforcement authorities, in some cases without the intermediation of a court.

The new proposal of some EU Member States to have direct access to data takes this practice a step further. It would mean that authorities could access data stored on private companies’ servers at any time. In combination with real-time interception of data, the proposed law could enable mass surveillance of individuals across Europe without appropriate safeguards. On 4 June the EU Council decided to postpone the discussions on whether to include direct access to and real time interception of data in the Regulation to October 2018.

In addition to working on its own legislation, the EU is considering an executive agreement with the US government based on the flawed US CLOUD Act, which would enable the US to directly access data from European companies and vice versa, and could potentially include real-time surveillance as in the US CLOUD Act. The agreement would grant the EU and the US mutual access to data.

In conclusion, some EU Member States are making these proposals even more extreme than they already are by pushing for real-time interception of and direct access to citizens’ data without appropriate safeguards, and for an agreement with the US that could have even further implications for mass surveillance and individuals’ rights such as data protection and privacy. EDRi warned against these proposals even before the drafts were published and will keep working with other stakeholders and policy-makers to change this worrisome situation.

Read more:

Council Presidency Note on “E-evidence” (28.05.2018)
http://data.consilium.europa.eu/doc/document/ST-9418-2018-INIT/en/pdf

Outcome of Council meeting on Justice and Home Affairs (04-05.06.2018)
http://www.consilium.europa.eu/media/35542/st09680-en18.pdf

EU “e-evidence” proposals turn service providers into judicial authorities (17.04.2018)
https://edri.org/eu-e-evidence-proposals-turn-service-providers-into-judicial-authorities/

EDRi’s response and annex to the European Commission’s consultation on cross-border access to e-evidence (16-27.10.2017)
https://edri.org/files/consultations/e-evidence_edriresponse_20171027.pdf

https://edri.org/files/consultations/annexconsultatione-evidence_20171026.pdf

(Contribution by Anamarija Tomicic, EDRi Communications and Community Officer)

13 Jun 2018

Civil society calls for protection of communications confidentiality

By Diego Naranjo

On 31 May, EDRi, Access Now, and Privacy International met attachés to the EU Council (representatives of EU Member States) who work on the ePrivacy Regulation proposal. Following up on our two recent letters on ePrivacy (here and here), the Dutch Permanent Representation in Brussels and the Bulgarian EU Council Presidency kindly hosted us to discuss the ePrivacy proposal.


During our meeting with the attachés we expressed, first of all, the need to adopt a strong ePrivacy Regulation in 2018. In an interconnected world, where our online behaviour and our private communications can be tracked by a duopoly of advertisers and a murky cloud of data brokers, the EU needs to take a step forward and ensure a high level of protection for the confidentiality of our electronic communications.

Second, we highlighted the need to clarify that privacy and confidentiality should cover our devices and information about them (location, type of software, etc.). We also stressed the need to protect communications metadata and to clarify the rules on the use of offline tracking for “measuring” purposes.

In addition to this, we stressed the threat to confidentiality that tracking walls represent, and how they are the opposite of informed consent. The current situation, where users are required to agree to an unlimited “take it or leave it” amount of unnecessary processing of their personal data, needs to be corrected by the ePrivacy Regulation.

Finally, we called for a strong provision requiring privacy by design and by default in software and hardware used for electronic communications. In order for our devices (IoT objects, laptops, smartphones…) to operate securely, the settings of all components of terminal equipment placed on the market should be configured by design and by default to prevent third parties from storing information, from processing information already stored in the terminal equipment, and from using the equipment’s processing capabilities.

The EU Council needs to finalise negotiations during the Austrian Presidency (starting July 2018) while taking care of the details that need to be improved. Given that the GDPR has entered into application in the European Union, the ePrivacy Regulation needs a final commitment from EU policy makers in order to ensure legal certainty, enhanced privacy protections and a ban on pervasive tracking of individuals.

Read more:

Document presented to TELE Council attachés as our key points during the meeting with them on 31.05.2018
https://edri.org/files/eprivacy/20180530-TELE-EUCouncil-EDRi-AN-PI.pdf

Mythbusting – Killing the lobby myths that are polluting the preparation of the e-Privacy Regulation
https://edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

EU Member States fight to retain data retention in place despite CJEU rulings (02.05.2018)
https://edri.org/eu-member-states-fight-to-retain-data-retention-in-place-despite-cjeu-rulings/

ePrivacy: Civil society letter calls to ensure privacy and reject data retention (24.04.2018)
https://edri.org/eprivacy-civil-society-letter-calls-to-ensure-privacy-and-reject-data-retention/

Cambridge Analytica access to Facebook messages a privacy violation (18.04.2018)
https://edri.org/cambridge-analytica-access-to-facebook-messages-a-privacy-violation/

(Contribution by Diego Naranjo, EDRi Senior Policy Adviser)


13 Jun 2018

Censorship – don’t look left or right. Look ahead, look behind!

By EDRi

Arbitrary censorship of our freedom of expression is being discussed in every possible policy area these days. While the issue is intensely political, it is crucial to understand that arbitrary censorship is not a matter of left-wing or right-wing politics, but a threat to democracy as a whole.


Human rights law in Europe and internationally provides that restrictions on freedom of expression can be justified under certain conditions. However, such restrictions need to be genuinely necessary and provided for by an accessible law that can be challenged before an independent court.

We can see how this principle works in practice in the US and Europe. The US Supreme Court ruled that speech that tends “to incite an immediate breach of the peace” is not protected by free speech rules. Similarly, in the well-known Handyside v. the United Kingdom case, the European Court of Human Rights ruled that the applicant’s free speech rights had not been breached by a fine imposed for the publication of an “obscene” book, but set a high bar for such restrictions to be imposed.

The need for protection of freedom of expression represents a long-standing democratic consensus about safeguarding our fundamental freedoms from abuses, regardless of the political motivation of the abuse. Without freedom of expression, there is less political accountability. Less accountability means more abuse and more corruption.

Arbitrary censorship: An issue of human rights, not left and right.

We know from history that oppressive regimes and ideologies, whether they claim to be from the left or from the right, have always sought to undermine freedom of expression. This is done overtly through censorship laws and/or insidiously through intimidation of the media.

Arbitrary censorship has resulted in pro-choice channels being repeatedly blocked on YouTube.
Arbitrary censorship has resulted in LGBT channels being “hidden” by YouTube.
Arbitrariness threatens the weakest.
Arbitrariness threatens democracy and accountability.

Looking back – history teaches us all we need to know.
Looking forwards – we must never repeat the mistakes of the past.

Fighting arbitrary censorship is about decency, equality and truth, not politics.

(Contribution by Joe McNamee, EDRi Executive Director)


13 Jun 2018

Answering guide for European Commission’s “illegal” content “consultation”

By EDRi

The European Commission has published a short “consultation” on countering “illegal” content online, with a deadline of 25th June to respond. In order to ensure at least a little balance in the outcome of the consultation, EDRi has prepared an answering guide to help you respond as an individual. We suggest opening the consultation in one browser tab and our answering guide in another, as the most user-friendly way of making use of the guide.*

Responding should take about 15 to 20 minutes and could have a long-lasting impact in Europe on anti-racism, hate speech, child protection, counter-terrorism, freedom of expression, and privacy, among other important topics.

The consultation follows increasingly frequent demands from the European Commission for arbitrary, unaccountable policing of the internet by service providers, including a Communication in September 2017 and a Recommendation in March 2018. Now, France and Germany are demanding legislation to impose still further restrictions – in the total absence of any evidence that this is necessary, proportionate… or even that it wouldn’t be counterproductive.

Techdirt provided some good background on the consultation in an article entitled “EU Commission asks public to weigh in on survey about just how much they want the Internet to be censored”.

Click below to access the guide. Each response counts – please play your part.


Read more:

Commission’s position on tackling illegal content online is contradictory and dangerous for free speech (28.09.2017)
https://edri.org/commissions-position-tackling-illegal-content-online-contradictory-dangerous-free-speech/

EU Commission’s Recommendation: Let’s put internet giants in charge of censoring Europe (28.09.2017)
https://edri.org/eu-commissions-recommendation-lets-put-internet-giants-in-charge-of-censoring-europe/

*We would prefer to use frames to make it easier to see both the consultation and the guide at the same time. This is currently not possible due to the way the consultation is coded. We have asked the Commission to change this.

11 Jun 2018

EU Censorship Machine: Legislation as propaganda?

By EDRi

The European Parliament’s Legal Affairs Committee will vote on 20 June on a proposal that would require internet companies to monitor and filter all uploads to web hosting services.

The provisions are so controversial that supporters in the European Parliament have resorted to including purely political – and legally meaningless – “safeguards” in the text as a way of getting the proposal adopted.

For example:

“the measures referred to in paragraph 1 should not require the identification of individual users and the processing of their personal data.”

The proposal requires internet companies to provide an “effective and expeditious complaints and redress mechanism”. It is logically impossible to have a filtering system that neither identifies users nor processes their personal data, but that still allows them to complain when content is removed. What would they complain about, when there is no record of the content uploaded by that specific person having been deleted?

“ensure the non-availability”

This is simply a more complicated, less comprehensible way of saying “upload filtering”.

“1.b Member States shall ensure that the implementation of such measures shall be proportionate and strike a balance between the fundamental rights of users and rightholders”.

The Charter of Fundamental Rights binds governments and EU institutions such as the European Commission, not private companies. The “agreements” to block and filter content would be commercial decisions and therefore outside the reach of fundamental rights legislation.

The Parliament and Member States already agreed (in the recently concluded Audiovisual Media Services Directive) to reject proposals for specific laws to protect fundamental rights in this field.

“and shall in accordance with Article 15 of Directive 2000/31/EC, where applicable not impose a general obligation on online content sharing service providers to monitor the information which they transmit or store”

Article 15 of Directive 2000/31/EC prohibits Member States from imposing a general obligation on internet companies to monitor the information that they store. This text suggests upload filters indirectly, in order to circumvent the Charter and the EU courts. The reasoning behind it is that an obligation to enter into “voluntary” commercial agreements between two private parties “to prevent the availability” of online content would formally respect EU legislation, even though the practices derived from its implementation can only lead to de facto general monitoring of uploads.

“The definition on online content sharing service providers under this directive does not cover services acting in a non-commercial purpose capacity such as online encyclopaedia, and providers of online services where the content is uploaded with the authorisation of all concerned rightholders, such as educational or scientific repositories.”

The fact that they had to include this text proves how wide the effects of Article 13 can be. The problem with this carve-out is in the details: what does “acting in a non-commercial purpose capacity” mean for foundations accepting donations? Similarly, how could future uses of such services be monetised without ceasing to be “non-commercial”? Furthermore, these carve-outs (allegedly targeting individual organisations like Wikipedia and GitHub) are written so vaguely that they may not leave sufficient room for those organisations – depending on how each court in each Member State interprets them – or for future similar services.

The vote is on 20 June. If you want to have your say and tell Parliamentarians what you think about this, go to www.saveyourinternet.eu to find out how.
