05 Jun 2019

Our dependency on Facebook – life-threatening?

By Bits of Freedom

What is your priority when a terrorist attack or a natural disaster takes place close to where your parents live or where your friend is on holiday? Obviously, you immediately want to know how your loved ones are doing. You call and text them until you get through.

Or, imagine that you happen to be close to an attack yourself. You have little or no information, and you see a person with weapons running down the road. You would urgently call the police, right? You try to call, but you can’t connect to the mobile network. Your apps are not working either. You can’t inform your loved ones, you can’t find information about what’s going on, and you can’t call the police. Right at the moment that communication and knowledge are vital, you can’t actually do anything. Afterwards, it turns out that the telecom providers switched off their mobile networks directly after the attack, obeying police orders. The measure was deemed necessary for safety, because the perpetrators were suspected of using the mobile network.

This scenario isn’t that far-fetched. A few years ago, the telephone network in the San Francisco underground was partially disconnected. The operator of the metro network wanted to disrupt a demonstration against police violence, after an earlier protest had disturbed the timetable. The intervention was considered justified on grounds of passenger safety: as a consequence of the previous demonstrations, the platforms had become overcrowded with passengers who couldn’t continue their journeys. However, the intervention was harshly criticised, as the deactivation of the phone network had itself endangered the passengers – because how do you, for example, alert the emergency services in an emergency when nobody’s phone is working?

Immediately after the terrorist attacks in Sri Lanka in April 2019, the government did something similar: it made services like Facebook unavailable, to prevent the flood of speculation spreading through such platforms from worsening the chaos.

In Sri Lanka, Facebook is practically a synonym for “the internet” – it’s the main communication platform in a country where the practice of zero-rating flourishes. As a result of Facebook’s dominance, content published on the platform can very quickly reach an enormous audience. And it is exactly the posts that capitalise on fear, discontent, and anger that have a huge potential to go viral, whether they are true or not. Facebook in itself has no incentive to limit the impact of these posts. On the contrary: the most extreme messages contribute to the addictive nature of the social network. The posts themselves aren’t a threat to people’s physical safety, but in the context of terrorist attacks, they can be lethal.

The distribution of false information is apparently such a huge problem that the Sri Lankan government saw no other option than to disconnect the main communication platform in the country. It’s a decision with far-reaching consequences: people are isolated from their main source of information and from the only communication tool with which to reach their family and friends. We find ourselves in a situation in which the harmful side-effects of such a platform are perceived to outweigh the enormous importance of open communication channels and the provision of information – rather no communication than Facebook communication.

This shows how dangerous it is when a society is so dependent on one online platform. This dependency also makes it easier for a government to gain control by denying access to that platform. The real challenge is to ensure a large diversity of news sources and means of communication. In the era of information, dependency on one dominant source of information can be life-threatening.

This article was first published at https://www.bitsoffreedom.nl/2019/05/29/life-threatening-our-dependency-on-facebook/

Life-threatening: Our dependency on Facebook (only in Dutch, 06.05.2019)
https://www.bitsoffreedom.nl/2019/05/06/levensgevaarlijk-onze-afhankelijkheid-van-facebook/

BART Pulls a Mubarak in San Francisco (12.08.2011)
https://www.eff.org/deeplinks/2011/08/bart-pulls-mubarak-san-francisco

Social media temporarily blocked (21.04.2019)
https://news.lk/news/sri-lanka/item/25077-social-media-temporarily-blocked

Sri Lanka blocks social media, fearing more violence (21.04.2019)
https://www.nytimes.com/2019/04/21/world/asia/sri-lanka-social-media.html

(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands; translation from Dutch to English by Bits of Freedom volunteers Winnie van Nunen and Amber Balhuizen)

05 Jun 2019

Czech Constitutional Court rejects complaint on data retention

By Iuridicum Remedium

Czech EDRi member Iuridicum Remedium (IuRe) has fought for 14 years against the Czech implementation of the controversial EU Data Retention Directive, which was declared invalid by the Court of Justice of the European Union (CJEU). After years of campaigning and many hard legislative battles, this fight has now come to an end: on 22 May 2019, the Czech Constitutional Court rejected IuRe’s proposal to declare the Czech data retention law unconstitutional, despite the complaint being supported by 58 members of parliament from across the political spectrum.

In the Czech Republic, data retention legislation was first adopted in 2005. In March 2011, the Constitutional Court upheld IuRe’s first complaint against the original data retention legislation and annulled it. In 2012, however, a new legal framework was adopted, both to implement the EU Data Retention Directive – which the CJEU found to contravene European law in the Digital Rights Ireland case in 2014 – and to comply with the Constitutional Court’s decision. This new legislation still contained the problematic general and indiscriminate data retention obligation, along with a number of other flaws. Therefore, also in the light of the CJEU’s decisions, IuRe decided to prepare a new constitutional complaint.

IuRe’s complaint challenged the very principle of bulk data retention: the massive collection and storage of people’s data without any link to individual suspicion of criminal activity, extraordinary events, or terrorist threats. The CJEU already declared this general and indiscriminate data retention principle inadmissible in two of its decisions (Digital Rights Ireland and Tele2). Although the Czech Constitutional Court refers to both judgments several times, its decision does not properly engage with their conclusions – especially when analysing why general data retention is not in line with the Czech Constitution.

The Constitutional Court’s main argument for declaring data retention constitutional is that, as communications increasingly occur in the digital domain, so does crime. Even though this may be true, it is regrettable that the Constitutional Court did not develop this reasoning further and explain why it is, in itself, a basis for bulk data retention. The Court also ignored that greater use of electronic communications implies a greater interference with privacy resulting from general data retention.

The Court further argued that personal data, even without an obligation to retain it, is kept in any case for other purposes, such as invoicing for services, responding to claims, and behavioural advertising. In the Court’s opinion, the fact that people give operators their “consent” to process their personal data reinforces the argument that data retention is legal and acceptable. Unfortunately, the Constitutional Court does not take into consideration that the volume, retention period, and sensitivity of personal data held by operators for other purposes are quite different from the obligatory data retention prescribed by the Czech data retention law. Furthermore, the fact that operators already need to keep some data (for billing purposes, for example) shows that the police would not be completely left in the dark without a legal obligation to store data.

In addition to the proportionality of data retention, which has not been clarified by the Court, another issue is how “effective” data retention is at reducing crime. Statistics from 2010 to 2014 show that there was no significant increase in crime or reduction in crime detection in the Czech Republic after the Constitutional Court abolished the obligation to retain data in 2011. Police statistics presented to the Court showed that data retention is not helping to combat crime in general, nor facilitating the investigation of serious crimes (such as murder) or other types of crime (such as fraud or hacking). In the arguments submitted by police representatives and by the Ministry of the Interior, some examples of individual cases where the stored data helped (or where its absence hampered an investigation) were repeatedly mentioned. However, no evidence shown to the Court proved that general and indiscriminate data retention improves the ability of the police to investigate crimes.

The Court also did not annul the partially problematic parts of the legislation, such as the data retention period (six months), the volume of data to be retained, or the overly broad range of criminal cases in which data may be requested. Furthermore, the Court did not remedy the provisions of the Police Act that allow data to be requested without court authorisation in searches for wanted or missing persons or in the fight against terrorism.

In its decision, the Constitutional Court acknowledges that the stored data is very sensitive and that in some cases the sensitivity of so-called “metadata” may even be greater than that of the content of the communications. The retention of communications data thus represents a significant threat to individuals’ privacy. Despite all of this, the Court dismissed IuRe’s claim to declare the data retention law unconstitutional.

IuRe disagrees with the outcome of this procedure, in which the Court has concluded that the existing Czech data retention legislation conforms to the constitution. Considering the wide support for the complaint, IuRe will work on getting at least part of the existing arrangements changed by legislative amendments. In addition, we will consider the possibility for the European Commission to launch infringement proceedings, or to initiate other judicial cases, since we strongly believe that the existing bulk retention of communications data in Czech law still contravenes the aforementioned CJEU decisions on mass data retention.

Czech constitutional decision (only in Czech)
https://www.usoud.cz/fileadmin/user_upload/Tiskova_mluvci/Publikovane_nalezy/2019/Pl._US_45_17_vcetne_disentu.pdf

Proposal to revoke data retention filed with the Czech Court (10.01.2018)
https://edri.org/proposal-to-revoke-data-retention-filed-with-the-czech-court/

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium, Czech Republic)

05 Jun 2019

Facebook fails to avoid CJEU judgment on NSA case

By noyb

On 31 May 2019, the Irish Supreme Court decided on an unprecedented application by Facebook. The decision is part of an ongoing procedure before the Irish Data Protection Commission (DPC) and the Irish High Court on Facebook’s involvement with the United States National Security Agency (NSA) under the so-called “PRISM” surveillance program.

The Supreme Court denied Facebook’s application in substance as the company was unable to substantiate its appeal. As a result, the Supreme Court decided not to take the actions requested by Facebook.

“Facebook likely invested, once again, millions to stop this case from progressing. It is good to see that the Supreme Court has not followed Facebook’s arguments that were in total denial of all existing findings so far. We are now looking forward to the hearing at the Court of Justice in Luxembourg next month,” said Max Schrems, complainant and chairperson of noyb.

The case follows a complaint by privacy lawyer Max Schrems against Facebook in 2013. More than six years ago, Edward Snowden revealed that Facebook allowed the US secret services to access personal data of Europeans under surveillance programs like “PRISM”. So far, the Irish DPC has not taken any concrete actions, despite the clear demands of the complaint to stop the EU-US data transfers by Facebook.

The case was first rejected by the Irish Data Protection Commissioner in 2013. It was then subject to a judicial review by the Irish High Court, which made a reference to the Court of Justice of the European Union (CJEU). The latter ruled in 2015 (judgment in C-362/14) that the so-called “Safe Harbor” agreement that allowed EU-US data transfers was invalid, and that the Irish DPC must investigate the case.

The investigation lasted only a couple of months, between December 2015 and spring 2016. Instead of deciding on the complaint, the DPC filed a lawsuit against Facebook and Mr. Schrems at the Irish High Court in 2016, with a view to sending further questions to the CJEU. After more than six weeks of hearings, the Irish High Court found that the US government had engaged in “mass processing” of Europeans’ personal data and, in 2018, referred eleven questions to the CJEU in a second reference.

In an unprecedented application made thereafter, Facebook tried to stop the reference by asking the Irish Supreme Court to review the High Court’s decision to refer. The CJEU announced that it plans to hear the case (now C-311/18) on 9 July 2019 – about six years after the filing of the original complaint.

After a judgment by the CJEU, the DPC will finally have to decide on the complaint for the first time. This decision could again be subject to appeals by Facebook or Mr. Schrems before the Irish courts.

noyb
https://noyb.eu

Press Release: Irish Supreme Court dismisses Facebook’s final attempt to block CJEU reference on involvement with NSA mass surveillance (31.05.2019)
https://noyb.eu/wp-content/uploads/2019/05/SC_PA.pdf

Europe vs. Facebook
http://www.europe-v-facebook.org/prism/facebook.pdf

(Contribution by EDRi member noyb, Austria)

05 Jun 2019

BEREC workshop: Regulatory action by NRAs and consumer empowerment

By IT-Pol

On 29 May 2019, EDRi was invited to participate in a workshop of the Body of European Regulators for Electronic Communications (BEREC) on the planned update of its Net Neutrality Guidelines. Thomas Lohninger from Austrian EDRi member Epicenter.works and Jesper Lund from Danish EDRi member IT-Pol represented our network. Lund provided the following input to the regulators on regulatory action by the National Regulatory Authorities (NRAs).

Epicenter.works published a report in January 2019 which, among other things, surveys regulatory action based on the annual net neutrality reports by the NRAs. Port blocking is a severe form of traffic management since entire services, such as hosting of email or web servers by the end-user, are suppressed. This may be justified in certain situations, but requires a rigorous assessment under Article 3(3) third subparagraph, point b (preserve the integrity of the network) of Europe’s Net Neutrality Regulation (2015/2120).

Port blocking is generally quite easy to detect with network measurement tools. This is also noted in section 4.1.1 of BEREC’s Net Neutrality Regulatory Assessment Methodology (BoR (17) 178). Other forms of discriminatory traffic management are harder to detect. Based on this, it seems reasonable to take NRA enforcement action on port blocking as an indicator of how rigorous wider enforcement practices regarding traffic management are. Unfortunately, most NRAs’ net neutrality reports do not contain detailed information on port blocking cases.
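To illustrate why port blocking is comparatively easy to detect, the Python sketch below simply attempts TCP connections to a few commonly blocked ports from inside an access network. The probe host and port list are illustrative assumptions, not part of BEREC’s methodology, and a single failed connection cannot by itself distinguish ISP-level blocking from an unreachable server.

    # Minimal sketch: probe a few TCP ports that ISPs sometimes block and report
    # which ones cannot be reached. PROBE_HOST and PROBE_PORTS are assumptions.
    import socket

    PROBE_HOST = "measurement.example.org"   # hypothetical test server assumed to listen on all probed ports
    PROBE_PORTS = [25, 80, 443, 8080]        # SMTP, HTTP, HTTPS, alternative HTTP

    def port_reachable(host, port, timeout=3.0):
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    for port in PROBE_PORTS:
        status = "reachable" if port_reachable(PROBE_HOST, port) else "not reachable (possibly blocked)"
        print(f"TCP port {port}: {status}")

Only when such probes are repeated across many subscribers and compared against a reference network does a simple check like this become evidence of ISP-level port blocking.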

Since the publication of the Net Neutrality Guidelines in August 2016, BEREC has launched a project to create an EU-wide network measurement tool, expected in late 2019. The measurement tool is based on the core principles of open methodology, open data, and open source. This means that the tool can be deployed on many devices, used by many end-users, and that the data generated through “crowdsourcing” by end-users (subscribers of internet access services, IAS) can be analysed by NRAs and other interested parties. In the opinion of EDRi, effective use of the forthcoming measurement tool, with crowdsourced measurement by end-users, will be a milestone in supervision and enforcement actions for traffic management practices.

Among other things, the measurement tool can be used for detection of unreasonable traffic management practices, establishing the real performance and Quality of Service (QoS) parameters of an IAS, assessing whether IAS are offered at quality levels that reflect advances in technology, and assessing whether the provision of specialised services risks deteriorating the available or general quality of IAS for end-users.

All of these tasks are specific obligations for NRAs under the Open Internet Regulation. As EDRi has highlighted before, the crowdsourcing aspect of the deployment of the measurement tool is very important, as single measurements can contain a large element of noise, for example because of characteristics of the specific testing environment. In the aggregate, the noisy element can be expected to “wash out”, leaving the effect of the IAS traffic management practices or other network design choices by IAS providers.
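A toy illustration of this “washing out” argument, in Python: each simulated measurement below is distorted by local factors (home Wi-Fi, cross-traffic, device load), but an aggregate statistic over many crowdsourced samples lands much closer to the performance actually delivered by the IAS. The throughput value and the symmetric noise model are invented for the example.

    # Simulate noisy crowdsourced speed measurements and show that the aggregate
    # is far more stable than any single sample. All values are made up.
    import random
    import statistics

    random.seed(1)
    delivered_mbps = 80.0                                    # assumed "true" throughput delivered by the IAS
    samples = [max(0.0, random.gauss(delivered_mbps, 15.0))  # each sample distorted by local noise
               for _ in range(500)]

    print(f"single measurement: {samples[0]:.1f} Mbit/s")
    print(f"median of 500 samples: {statistics.median(samples):.1f} Mbit/s")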

When a measurement tool developed by BEREC is freely available to NRAs, the Guidelines on Article 5 of the Regulation should be updated to contain specific requirements and recommendations for the use of network measurement tools in the NRA supervision tasks. NRAs should, of course, be free to choose between their own measurement tools and methodology and the one offered by BEREC to all NRAs.

The Regulation does not per se require NRAs to establish or certify a monitoring mechanism. Needless to say, the Guidelines cannot change that. Therefore, most provisions in the Guidelines related to network measurement tools will have to be recommendations for NRAs.

However, the Regulation specifically requires NRAs to closely monitor and ensure compliance with Articles 3 and 4 of the Regulation. While NRAs should be free to choose their own regulatory strategies, allowing these strategies to be adapted to the local “market” conditions and the need for enforcement action, some proactive element is required on the part of NRAs. Simply responding to end-user complaints cannot be sufficient to satisfy the obligation under Article 5.

In the opinion of EDRi, it will be very difficult for NRAs to fulfil their monitoring obligations under Article 5 without some form of quantitative measurement in the IAS network. The last sentence of recital 17 of the Regulation concretely requires network measurements of latency, jitter and packet loss by NRAs to assess the impact of specialised services.
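As a rough sketch of what such quantitative measurement involves, the Python fragment below derives latency, jitter and packet loss from a series of round-trip-time probes. The metric definitions used here (mean RTT, mean absolute difference between consecutive replies, fraction of unanswered probes) are common simplifications and are assumptions of this example, not BEREC’s exact methodology.

    # Derive basic QoS metrics from round-trip-time probes. Each entry in rtts_ms
    # is one probe's RTT in milliseconds; None marks a probe that got no reply.
    import statistics

    rtts_ms = [21.3, 22.1, None, 20.8, 35.4, 21.0, None, 22.6]   # example data

    replies = [r for r in rtts_ms if r is not None]
    latency_ms = statistics.mean(replies)                         # average round-trip time
    jitter_ms = statistics.mean(abs(a - b)                        # mean absolute delay variation
                                for a, b in zip(replies, replies[1:]))
    packet_loss = (len(rtts_ms) - len(replies)) / len(rtts_ms)    # fraction of unanswered probes

    print(f"latency {latency_ms:.1f} ms, jitter {jitter_ms:.1f} ms, packet loss {packet_loss:.0%}")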

BEREC’s Guidelines, with recommendations on the use of crowdsourced network measurements, will have two positive implications for the net neutrality landscape in Europe. For the NRAs that follow the recommendations and actively use the BEREC measurement tool, we will have quantitative monitoring of compliance with Articles 3 and 4 that is harmonised and comparable across EU Member States. This will, in itself, be hugely beneficial and contribute to a consistent application of the Net Neutrality Regulation.

In Member States where the NRA decides not to use the BEREC measurement tool (or one of its own), the recommendations in the Net Neutrality Guidelines could potentially facilitate shadow monitoring reports by civil society or consumer organisations. Of course, this can also be done without recommendations in the BEREC Guidelines, or with measurement tools other than the one developed by BEREC, but adhering to the BEREC recommendations would produce results that can be more easily compared with, for example, NRA net neutrality reports in Member States where the BEREC measurement tool is actively used.

EDRi will be pleased to contribute draft amendments to the Guidelines in order to formally incorporate a network measurement tool and crowdsourced measurements in the IAS network by end-users.

IT-Pol
https://itpol.dk/

Epicenter.works
https://epicenter.works/

BEREC Workshop on the update of its Net Neutrality Guidelines
https://berec.europa.eu/eng/events/berec_events_2019/202-berec-workshop-on-the-update-of-its-net-neutrality-guidelines

Europe’s Net Neutrality Regulation (2015/2120)
https://eur-lex.europa.eu/legal-content/EN/ALL/?uri=CELEX%3A32015R2120

BEREC Net Neutrality Regulatory Assessment Methodology
https://berec.europa.eu/eng/document_register/subject_matter/berec/regulatory_best_practices/methodologies/7295-berec-net-neutrality-regulatory-assessment-methodology

BEREC Guidelines on the Implementation by National Regulators of European Net Neutrality Rules
https://berec.europa.eu/eng/document_register/subject_matter/berec/regulatory_best_practices/guidelines/6160-berec-guidelines-on-the-implementation-by-national-regulators-of-european-net-neutrality-rules

Two years of net neutrality in Europe – 31 NGOs urge to guarantee non-discriminatory treatment of communications (30.04.2019)
https://edri.org/two-years-of-net-neutrality-in-europe-29-ngos-urge-to-guarantee-non-discriminatory-treatment-of-communications/

NGOs and academics warn against Deep Packet Inspection (15.05.2019)
https://edri.org/ngos-and-academics-warn-against-deep-packet-inspection/

(Contribution by Jesper Lund, EDRi member IT-Pol, Denmark)

05 Jun 2019

Facebook and Google asked to appoint representatives in Serbia

By SHARE Foundation

Three months before the new Serbian Law on Personal Data Protection becomes applicable, EDRi member SHARE Foundation asked 20 data companies from around the world – including Google and Facebook – to appoint representatives in Serbia as required by the new law. This is crucial for providing Serbian citizens and competent authorities with a contact point for all questions around the processing of personal data.

The new Law on Personal Data Protection in Serbia is modelled after the EU’s General Data Protection Regulation (GDPR) and creates an obligation for almost all large data companies to appoint representatives in the country. As soon as companies such as Google, Facebook, Amazon, Netflix or other IT giants offer products and services in Serbia for which they collect or process personal data, they must appoint a representative. This can be a natural or legal person to which citizens can address their questions regarding their personal data rights. The representative must also cooperate with the Commissioner for Information of Public Importance and Personal Data Protection of the Republic of Serbia.

Google, for instance, has long recognised Serbia as a significant market and has adapted many services such as Gmail, YouTube, Google Chrome and Google Search to the local market. Additionally, Google targets Serbian citizens with localised advertisements and monitors their behaviour through cookies and other tracking technologies. Facebook, too, is available in Serbian, has about three million users in Serbia, and collects and processes huge amounts of personal data to profile them and show them targeted ads, as described in SHARE Lab’s Facebook algorithmic factory research.

But because Serbia is not yet a member of the EU, these companies do not grant Serbian users the same privacy protections as EU citizens. With permanent company representatives in Serbia, however, it would be easier for Serbian citizens to exercise their rights or initiate proceedings before the competent authorities. This is why SHARE Foundation sent open letters demanding the appointment of representatives in Serbia to the following companies: Google, Facebook, Amazon, Twitter, Snap Inc – Snapchat, AliExpress, Viber, Yandex, Booking, Airbnb, Ryanair, Wizzair, eSky, Yahoo, Netflix, Twitch, Kupujem prodajem, Toptal, GoDaddy, Upwork.

SHARE calls Facebook and Google to appoint their representatives in Serbia (21.05.2019)
https://www.sharefoundation.info/en/share-calls-facebook-and-google-to-appoint-their-representatives-in-serbia/

Will Serbia adjust its data protection framework to GDPR? (24.04.2019)
https://edri.org/will-serbia-adjust-its-data-protection-framework-to-gdpr/

Running an algorithmic empire: The human fabric of Facebook (14.06.2017)
https://edri.org/running-an-algorithmic-empire-the-human-fabric-of-facebook/

Letter sent to Google
https://www.sharefoundation.info/wp-content/uploads/Law-on-Data-Protection-in-Serbia-New-legal-obligation-for-Google.pdf

Letter sent to Facebook
https://www.sharefoundation.info/wp-content/uploads/Law-on-Data-Protection-in-Serbia-New-legal-obligation-for-Facebook.pdf

(Contribution by EDRi member SHARE Foundation, Serbia)

23 May 2019

Captured states – e-Privacy Regulation victim of a “lobby onslaught”

By Chloé Berthélémy

Compared to non-governmental organisations and trade unions, private corporations are far better equipped to influence European-level decision-making. The report “Captured states: when EU governments are a channel for corporate interests” by Corporate Europe Observatory (CEO) describes the various ways in which corporations approach the Member States of the European Union to maximise their impact.

When adopting EU laws and policies, Member States are key actors, along with the European Parliament and the Commission. Lobbyists representing private corporations therefore consider Member States primary targets for influencing decisions at the European level in favour of their interests. The CEO report exemplifies how national governments become channels for corporate interests by relating numerous lobbying successes, including on the e-Privacy Regulation.

The report maps out the various channels and decision-making fora that EU- and national-level trade associations and multinational corporations target to push their private interests. These include the European Council, the rotating presidencies of the Council of the EU, the EU technical and scientific committees, and officials working at the permanent representations of Member States. Corporate lobbies also use the services of Brussels-based lobby consultancy firms to receive advice and to multiply lobbying opportunities and access points. As a result, and in comparison with the influence of NGOs, Corporate Europe Observatory finds a massive asymmetry in lobbying capacity and resources.

ePrivacy Regulation – a case story of “corporate hyperbole”

As the case of the e-Privacy Regulation proposal shows, the deeply problematic issue of corporate capture also threatens citizens’ fundamental rights in the digital sphere. Regulating the use of personal data by advertisers, publishers, and social media platforms, the proposal has been the victim of “a veritable lobby onslaught” by corporate lobbies with an interest in Big Data. An official following the e-Privacy file said that “99 per cent of the lobbying” had been from industry. These lobbying efforts have so far been successful in delaying the negotiations and the adoption of the update to the EU’s existing ePrivacy rules – the only piece of EU legislation specifically protecting the privacy of communications. As a result of this pressure from private interests, the proposal is stalled by Member States, and EU citizens do not enjoy the full protection of their private communications online.

The report focuses on the German position and reveals the imbalance in meetings with German officials between NGOs and industry lobbyists such as the publishing corporation Axel Springer, Deutsche Telekom, Facebook, and Google. The German government has been keen on defending the demands of its key telecom operator, Deutsche Telekom, in particular the demand to allow the processing of personal data on a pseudonymous basis and without consent.

Countering corporate influence and saving democracy

The report lays down primary ideas to reduce the impact of corporate lobbying on European legislative outcomes. These include:

  • Adopting national rules to prevent privileged access for corporate lobbies and to promote full lobby transparency.
  • Strengthening national parliamentary pre-decision scrutiny and post-decision accountability on government decision-making at EU level.
  • Reforming the ways of working of the Council of Ministers, the European Council and the European Commission’s committees and expert groups to address the democratic deficit.
  • Introducing new models of participation for citizens, such as participatory hearings on upcoming pieces of EU legislation, and improving and increasing key online consultations.

In addition to the issues raised by CEO, EDRi has repeatedly voiced criticism with regard to the transparency of trilogues – informal, non-democratic and non-transparent negotiations used to fast-track the adoption of legislation – and the transparency of the Council of the EU, whose “confidential documents” are difficult to access and whose working party discussions still benefit from significant opacity.

Without greater transparency and fairness of the process, civil society work will remain difficult, and corporate interests will continue to reign over public interests.

Infographics: Corporate lobbying & EU Member States
https://edri.org/files/Corporate-lobbying_EU-MS_web.pdf

Council continues limbo dance with the ePrivacy standards (24.10.2018)
https://edri.org/council-continues-limbo-dance-with-the-eprivacy-standards/

How the online tracking industry “informs” policy makers (12.09.2018)
https://edri.org/how-the-online-tracking-industry-informs-policy-makers/

European Ombudsman shares EDRi’s concerns on Council transparency (21.02.2018)
https://edri.org/european-ombudsman-shares-edris-concerns-on-council-transparency/

EDRi’s response to the European Ombudsman consultation on transparency of legislative work within Council preparatory bodies (20.12.2017)
https://edri.org/files/consultations/euombudsman_counciltransparency_20171212.pdf

(Contribution by Chloé Berthélémy, EDRi)

22 May 2019

EDRi is looking for a new Head of Policy

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital human rights organisations. We defend and promote rights and freedoms in the digital environment, such as the right to privacy, personal data protection, freedom of expression, and access to information.

EDRi is looking for an experienced, strategic and dedicated Head of Policy to join EDRi’s team in Brussels. This is a unique opportunity to be part of the growth of a well-respected network of NGOs making a tangible difference in the defence and promotion of online rights and freedoms in Europe and beyond. This is a full-time, permanent position. The deadline to apply has been extended until 16 June 2019.

The Head of Policy will provide strategic leadership to EDRi’s Policy Team and design policy and advocacy strategies in line with EDRi’s strategic objectives and in consultation with member organisations. S/he is expected to bring a strategic vision on human rights in the digital environment as well as solid experience in human rights advocacy and digital rights. The successful candidate will have a strong track record in policy development and strategic planning, in addition to an excellent understanding of working in an EU or national policy/advocacy environment.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of racism or other forms of discrimination to apply for this post.

Job title: Head of Policy
Reports to: Executive Director
Location: EDRi Office, Brussels, Belgium
Line management: The Head of Policy leads the advocacy effort of the Policy Team (4 persons) while the team is line managed by the Executive Director. The Head of Policy will participate in the Policy staff members’ appraisal and objective setting meetings. With the future growth of the organisation, and in consultation with employees, the position can include line management responsibilities.

RESPONSIBILITIES:

As Head of Policy, your main tasks will be to:

  • Advocate for the protection of digital rights, such as in the areas of data protection, privacy, freedom of expression, platform regulation, surveillance and law enforcement, telecommunications and digital trade;
  • Contribute to and evaluate progress towards EDRi policy strategic outcomes and develop activities in response to the external environment and in partnership with the team, members and the Board;
  • Provide the Policy Team with strategic advice and lead on advocacy strategies, including by coordinating, designing, and executing policy strategies and workplans in line with EDRi’s overall strategic objectives;
  • Draft and oversee the production of all policy documents, such as briefings, position papers, amendments, advocacy one-pagers, letters, blogposts, and EDRi-gram articles;
  • Support and work closely with EDRi colleagues including policy, communications, and campaigns – ensuring smooth working relations between the Policy Team and other teams – and report to the Executive Director;
  • Coordinate and collaborate with EDRi members on relevant legislative processes in the EU, including coordinating working groups, developing policy positions and campaign messages;
  • Collaborate with the EDRi team to communicate to the public about relevant legislative processes and EDRi’s activities;
  • Provide policy-makers with expert, timely, and accurate input and organise and participate in expert meetings;
  • Develop and strengthen relationships with civil society partners, EU institutions, government and institutional officials, academics and industry representatives working on related issues;
  • Represent – when relevant and in collaboration with the Executive Director and the Policy Team – the organisation as a spokesperson at public events, meetings and to the media.

QUALIFICATIONS AND EXPERIENCE:

  • Passionate about digital rights and enthusiasm to work within a small team to make a big difference;
  • Minimum 6 years of relevant experience in a similar role;
  • A university degree in law, EU affairs, policy, human rights or related field or equivalent experience;
  • Demonstrable knowledge of, and interest in human rights, in particular privacy, net neutrality, digital trade, surveillance and law enforcement, freedom of expression, as well as other internet policy issues;
  • Knowledge and understanding of the EU, its institutions and its role in digital rights policies;
  • Experience in leading advocacy efforts and creating networks of influence;
  • Exceptional written and oral communications skills;
  • Technical IT skills and knowledge of free and open source operating systems and software are a plus;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Experience of organising events and/or workshops;
  • Ability to work in English. Other European languages an advantage.

How to apply:

To apply, please send a maximum one-page cover letter and a maximum two-page CV in English and in .pdf format to applications(at)edri.org by 16 June 2019.

Please note that only shortlisted candidates will be contacted.

22 May 2019

Hey Google, where does the path lead?

By Bits of Freedom

If you do not know the way to a certain place, you use a digital device to find it. With our noses glued to the screen, we blindly follow the instructions of Google Maps or one of its competitors. But do you know which way you are being led?

Mobility is a social issue

Mobility is an ongoing debate in the Netherlands. Amsterdam is at a loss on how to deal with large cars on its narrow canals, while smaller municipalities such as Hoogeveen are constructing a beltway to relieve the Hollandscheveld area. Administrators want to direct the traffic on the roads, and as a result they deliberately send us either right or left.

If all is well, all societal interests are weighed in that decision. If it is decided that a fragile village centre should be relieved, the road signs along the verge direct drivers around it. If the local authorities want to prevent cars from rushing past an elementary school, the cars are routed along a different path.

Being led by commercial interests

However, we are not only being led by societal interests. More and more, we use navigation systems to move from A to B. Those systems are developed by an ever smaller group of companies, of which Google seems to be the frontrunner. Nowadays, hardly anyone navigates using a map and the traffic signs on the side of the road; we simply listen to the instructions from the computer on the dashboard.

In this way, a commercial enterprise determines which route we take – and it has other interests than the local authorities. It wants to serve its customers in the best possible way. But who are these customers? For some companies, that’s the road users; for others – often those where the navigation is free for the users – the customers that really matter are the invisible advertisers.

Too much of a short cut

And even that is too limited, of course, because which considerations the developer of the navigation system really weighs is rarely transparent. When asked for a route from the Westerpark to the Oosterpark in Amsterdam, Google leads you around the canal belt instead of through it. That doesn’t seem to be the shortest route for someone on a bicycle.

Why would that be? Maybe Google’s algorithm is optimised for the straight street patterns of San Francisco and it’s unable to work with the erratic nature of the Amsterdam canals. Maybe it’s the fastest route available. Or maybe it’s a very conscious design choice so that the step-by-step description of the route does not become too long. Another possibility is that the residents of the canal belt are sick of the daily flood of cycling tourists and have asked Google, or maybe paid for it, to keep the tourists out of the canal belt. We simply don’t know.

Being misled

Incidentally, the last-mentioned reason is less far-fetched than you might think at first. When you are in Los Angeles, you can’t miss the letters of the Hollywood Sign. A lot of tourists want to take a picture with it. Those living on the hill underneath the monumental letters are sick of it. They have, sometimes illegally, placed signs on the side of the road stating that the huge letters are not accessible through their street.

With the rise of digital maps, that tactic became less and less successful. Pressured by a municipal councillor, Google and Garmin, a tech company specialising in GPS technology, adjusted their maps so that tourists are no longer led to the actual letters, but to a vantage point with a view of them. Both mapmakers changed their service under pressure from a single, effectively lobbied councillor.

Serving a different interest

It’s very rarely transparent which interests companies take into consideration. We don’t know which decisions those companies make, nor on which underlying data and rules they are based. We don’t know by whom they are being influenced. We can safely assume that the interests of such companies are not always compatible with public interests. This has a major impact on the local situation. If a company like Google listens to retailers but not residents, the latter will be disadvantaged. The number of cars in and around the shopping streets grows – which sucks if you happen to live there, and even more so if the local authorities are trying to route the cars differently.

Again, this is another good example of how the designer of a technology impacts the freedom of the user of the technology. It also impacts society as a whole: we lose the autonomy to shape our living environment with a locally elected administration.

Moreover, this story is not only about the calculated route, but also about the entire interface of the software. The Belgian scientist Tias Guns described that very aptly: “There is, for example, an option to avoid highways, but an option to avoid local roads is not included.” As a driver, just try to spare the local neighbourhood then.

The platform as a “dead end”

Adding to that – ironically – is the fact that the major platforms are not always reachable. Where do you turn if you want Google Maps to route less traffic through your street? Or, if you are a retailer, more? On a local level, this is different: there is a counter at the city hall where you can go, and there is a city council where you can put traffic problems on the agenda. Even that, by itself, is already very difficult to coordinate. The Chief Technology Officer of the city of Amsterdam recently said in an interview about the use of artificial intelligence by the local authority:

“In some areas, residents have a larger capability to complain. Think of the city centre or the ‘Oud-Zuid’ area, both more affluent areas and home to a large number of lawyers. It’s general knowledge that in those areas a complaint is made far more easily than in, for example, the less affluent Amsterdam area of “Noord”. This is not difficult for trained operators. They can handle experienced grumblers and can judge for themselves whether a complaint is valid. A computer cannot.”

Another issue is that some digital mapmakers are so large – and will continue to grow – that they can afford to listen selectively.

Who determines the path?

So, who decides how our public space is being used? Is that a local city council or a commercial enterprise? This makes quite a difference. In the first case, citizens can participate, decisions are made democratically, and there is a certain amount of transparency. In the second case, you have no information on why you were led left or right, or why shopping streets have become desolate overnight. Most likely the rule is: whoever pays, gets to decide. The growing power of commercial enterprises in the issue of mobility is threatening to put local administrations – and with that us, the citizens and small companies – out of play.

Bits of Freedom
https://www.bitsoffreedom.nl/

Hey Google, which way are we being led? (15.05.2019)
https://www.bitsoffreedom.nl/2019/05/15/hey-google-which-way-we-being-led/

Hey Google, which way are we being led? (in Dutch, 15.05.2019)
https://www.bitsoffreedom.nl/2019/04/15/hey-google-waarheen-leidt-de-weg/

Why people keep trying to erase the Hollywood sign from Google Maps (21.11.2014)
https://gizmodo.com/why-people-keep-trying-to-erase-the-hollywood-sign-from-1658084644

(Contribution by Rejo Zenger, EDRi member Bits of Freedom, the Netherlands; translation from Dutch to English by Bits of Freedom volunteers Alex Leering and Amber Balhuizen)

22 May 2019

Passenger surveillance brought before courts in Germany and Austria

By Gesellschaft für Freiheitsrechte

EDRi members Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) and Epicenter.works have taken legal action against the mass retention and processing of Passenger Name Records (PNR) before German and Austrian courts and authorities. The European PNR Directive (Directive 2016/681) requires airlines to automatically transfer their passengers’ data to state authorities. There, the data is stored and automatically compared with pre-determined “criteria” that describe, for example, the flight behaviour of known criminals. The data can also be distributed to other authorities and even to non-EU countries.

Since May 2018, the European PNR Directive has obliged EU Member States to have legislation in place for the retention of passenger data from airlines. For each passenger who takes a flight, a record is created. It contains at least 19 data items, including the date of birth, details of accompanying persons, payment information, and the IP address used for online check-in. Together with information on the flight time and duration, booking class and baggage details, PNR data provides a detailed picture of the trip and the passenger.

PNR data is stored centrally at the respective Passenger Information Unit (PIU). These PIUs are usually located at national police authorities. The data can then be accessed by numerous other authorities and even transmitted to other countries. In addition, an automated comparison of the data records with pre-determined “criteria” is conducted.

This is a way of identifying new suspects in the mass of previously unsuspicious passengers – and a new level of dragnet action, collecting data on all citizens in order to “catch a few fish”. Each individual, whether previously suspected of a crime or not, can thus be subjected to stigmatising investigations, just for coincidentally having flight patterns similar to those of past offenders.
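To make the mechanism concrete, here is a deliberately simplified Python sketch of rule-based matching against pre-determined “criteria”. The record fields, the criterion and the destination codes are invented for illustration and bear no relation to the actual PIU systems or their rules; the point is only that a passenger can be flagged purely because their booking resembles a pattern, not because of any individual suspicion.

    # Toy illustration only: flag passenger records that match a pre-determined
    # "criterion" pattern. Fields, thresholds and destinations are invented.
    from dataclasses import dataclass

    @dataclass
    class PassengerRecord:
        name: str
        destination: str
        paid_cash: bool
        booked_days_before_departure: int

    def matches_criterion(record):
        # hypothetical pattern distilled from past cases: last-minute cash booking
        # to a destination that appears on a watch list
        return (record.paid_cash
                and record.booked_days_before_departure <= 2
                and record.destination in {"XXX", "YYY"})

    records = [
        PassengerRecord("Passenger A", "XXX", paid_cash=True, booked_days_before_departure=1),
        PassengerRecord("Passenger B", "ZZZ", paid_cash=False, booked_days_before_departure=30),
    ]

    flagged = [r.name for r in records if matches_criterion(r)]
    print("flagged for further checks:", flagged)   # Passenger A, despite no prior individual suspicion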

GFF and epicenter.works argue that the PNR Directive in its current form violates the Charter of Fundamental Rights of the European Union, in particular the right to respect for private and family life (Article 7), as well as the right to the protection of personal data (Article 8). The Court of Justice of the European Union (CJEU) already took a similar view in its 2017 Opinion on the draft PNR agreement between the EU and Canada.

Since it isn’t possible to challenge the PNR Directive directly before the CJEU, GFF and epicenter.works have brought legal actions before civil and administrative courts as well as the data protection authorities (DPAs) in Germany and Austria. The complaints lodged argue that the storage and processing of the data by the police authorities violate the Charter of Fundamental Rights. Due to the case’s evident implications for EU law and the CJEU’s aforementioned Opinion, it is expected that the national courts will eventually refer the question to the CJEU.

The basic funding for the project is provided by the Digital Freedom Fund.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/

Epicenter.works
https://epicenter.works

No PNR campaign
https://nopnr.eu

Directive 2016/681 (PNR Directive)
https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016L0681&from=DE

CJEU Opinion on the Draft PNR agreement between Canada and the European Union
http://curia.europa.eu/juris/document/document.jsf?docid=193216&text=&dir=&doclang=EN&part=1&occ=first&mode=DOC&pageIndex=0&cid=7992604

(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)

22 May 2019

Google-Huawei case highlights the importance of free software

By Free Software Foundation Europe - FSFE

Google is denying the Chinese IT giant Huawei access to Google’s proprietary components of the Android mobile operating system, a move that threatens IT security. The case highlights the importance of free software for technology users, public bodies, and businesses.

Following the US administration’s decision to effectively ban American companies from doing trade with the Chinese company Huawei, Google suspended all business with the company. This affects all software that is not covered by free software licences. In practice, Huawei’s upcoming and potentially also current phones will no longer get support and updates for the Android operating system. They will also not have access to proprietary Google apps and services like Gmail and Google Play. The latter in particular will put Huawei users at risk, because without access to the default app store on most Android phones, they will miss important security updates for the apps installed through it.

Google offers only a base version of Android under a free software licence, but bundles it together with proprietary apps and services. The non-free components of most stock Android devices have numerous downsides for users, as EDRi member Free Software Foundation Europe (FSFE) has documented since 2012. The current case demonstrates that even tech giants like Huawei face dependencies and vendor lock-in effects similar to those of any individual user if they rely on proprietary software.

The following lessons can be drawn from this case:

  1. Users should favour free software operating systems and applications on their computing devices. With proprietary software, they are on the receiving end only, and vendors may deny them access to crucial security updates. Free software enables control of technology, and the more important that technology becomes in our daily lives, the more relevant free software becomes for users. For Android, the FSFE helps users to regain more control with its Free Your Android initiative.
  2. Governments and especially the European Union should invest more resources in free software to gain independence from large enterprises and other states. The current case highlights the lack of influence the EU has on outside technology providers. Instead of waiting for a future European IT monopolist to enter the stage, the EU and its member states should invest in free software development and focus on supporting local free software organisations as well as businesses. This would effectively foster the inner-European market and enable independence for European citizens and the EU economy. This step is essential to avoid exposing European infrastructure to shutdowns controlled by external factors.
  3. Companies should use as much free software as possible in their supply chains. Proprietary software makes a company dependent on its vendor and this vendor’s government. The current case shows that the US was able to force Google to stop delivery of its proprietary products – but could not stop delivery of the free software components of Android. Had Huawei invested more resources in free software apps and services, the US strategy would not have hit them as hard. Although the current events are linked to the scrutiny the Chinese company is under right now, it is obvious that this could happen to any other company based in any other country as well.

The earlier allegations against Huawei already showed that code for all critical infrastructure should be published under a free software licence. The latest episode of the Huawei affair illustrates that the same applies to apps and services. Just days before the European elections, this should be a wake-up call for the newly constituted Parliament to ask the European Commission for European Directives that foster the independence of European technical infrastructure and that build on free software, starting with the demand to release publicly funded software as public code.

Free Software Foundation Europe (FSFE)
https://fsfe.org/

Three conclusions to draw from Google denying Huawei access to software (20.05.2019)
https://fsfe.org/news/2019/news-20190520-01.en.html

Free Your Android!
https://freeyourandroid.org

Public Money, Public Code
https://publiccode.eu

Huawei case demonstrates importance of Free Software for security (05.02.2019)
https://fsfe.org/news/2019/news-20190205-01.en.html

(Contribution by EDRi member Free Software Foundation Europe – FSFE, Europe)
