13 May 2020

Ban biometric mass surveillance!

By EDRi

Across Europe, highly intrusive and rights-violating facial recognition and biometric processing technologies are quietly becoming ubiquitous in our public spaces. As the European Commission consults the public on what to do, EDRi calls on the Commission and EU Member States to ensure that such technologies are comprehensively banned in both law and practice.

Keep walking. Nothing to see here…

By the end of 2019, at least 15 European countries had experimented with invasive biometric mass surveillance technologies, such as facial recognition. These are designed to watch, track or analyse people, score them, and make judgements about them as they go about their daily lives.

Worse still, many governments have done this in collaboration with secretive tech companies, in the absence of public debate, and without having demonstrated that the systems meet even the most basic thresholds of accountability, necessity, proportionality, legitimacy, legality or safeguarding.

A few thousand cameras to rule them all

Without privacy, you do not have the right to a private chat with your friends, your family, your boss or even your doctor. Your activism to save the planet becomes everyone’s business. You will be caught when blowing the whistle on abuse and corruption, or when attending a political march that your government does not want you to attend. You lose the right to go to a religious service or Trade Union meeting without someone keeping an eye on you; to hug your partner without someone snooping; or to wander freely without someone thinking you are being suspicious.

With constant mass surveillance, you lose the ability to ever be truly alone. Instead, you are permanently watched and controlled.

COVID-1984?

Since the start of the Coronavirus pandemic, apps and other technical measures have been proposed to rapidly expand bodily and health surveillance under the guise of public health. However, there is a real risk that the damage caused by widening surveillance measures will last long after the pandemic is over. For example, will employers remove the cameras doing temperature checks in offices after the pandemic?

Biometric mass surveillance systems can exacerbate structural inequalities, accelerate unlawful profiling, have a chilling effect on people’s freedoms of expression and assembly, and limit everyone’s ability to participate in public and social activities.

Fanny Hidvégi, Europe Policy Manager at EDRi member Access Now (AN) explains:

Human rights apply in emergencies and health crises. We don’t have to choose between privacy and health: protecting digital rights also promotes public health. The suspension of data protection rights in Hungary shows why the EU needs to step up to protect fundamental rights.

Biometric surveillance – an architecture of oppression

Described as an “architecture of oppression”, the untargeted capture or processing of sensitive biometric data allows governments and companies to build up incredibly detailed, permanent records of who you meet, where you go, and what you do. Moreover, it allows these actors to use all these records against you – whether for law enforcement, public authority or even commercial uses. Because they are linked to faces and bodies, these permanent records become quite literally carved into your skin. The increased capacity of states to track and identify individuals through facial recognition and other biometric processing is likely to disproportionately impact populations which are already highly policed, surveilled and targeted by abuse, including people of colour, Roma and Muslim communities, social activists, LGBTQ+ people and people with irregular migration status. There can be no place for this in a democratic, rights-based, rule-of-law-respecting society.

Ioannis Kouvakas, Legal Officer at EDRi member Privacy International (PI) warns that:

The introduction of facial recognition into cities is a radical and dystopic idea which significantly threatens our freedoms and poses fundamental questions about the kind of societies we want to live in. As a highly intrusive surveillance technique, it can provide authorities with new opportunities to undermine democracy under the cloak of defending it. We need to permanently ban its roll out now before it’s too late.

EDRi is therefore calling for an immediate and indefinite ban on biometric mass surveillance across the European Union.

Biometric mass surveillance is unlawful

This ban is grounded in the rights and protections enshrined in the Charter of Fundamental Rights of the European Union, the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED) which are currently under the spotlight for their two-year anniversary reviews. Together, these instruments guarantee that the people of the EU can live without fear of arbitrary treatment or abuse of power; with respect for their autonomy and self-development; and in safety and security by setting strong data protection and privacy standards. Biometric mass surveillance constitutes a violation of the essence of these instruments, and a contravention of the very heart of the EU’s fundamental rights.

Once systems are in place that normalise and legitimise the 24/7 watching of everyone, all the time, it’s a slippery slope towards authoritarianism. The EU must ensure, therefore, through legislative and non-legislative means, that biometric mass surveillance is comprehensively banned in law and in practice. Lotte Houwing, Policy Advisor at EDRi member Bits of Freedom (BoF) cautions that:

We are shaping the world of tomorrow with the measures we are taking today. It is of utmost importance that we keep this in mind and do not let the COVID-19 crisis scare us into a (mass) surveillance state. Surveillance is not a medicine.

The EU regulates everything from medicines to children’s toys. It is unimaginable that a drug which has not been shown to be effective, or a toy which poses significant risks to children’s wellbeing, would be allowed onto the market. However, when it comes to biometric data capture and processing, in particular in an untargeted way in public spaces (i.e. mass surveillance), the EU has been a haven for unlawful biometric experimentation and surveillance. This has happened despite the fact that a 2020 study demonstrated that over 80% of Europeans are against sharing their facial data with authorities.

EDRi calls on the EU Commission, European Parliament and Member States to stick to their values and protect our societies by banning biometric mass surveillance. Failing to do so will increase the risk of an uncontrolled and uncontrollable descent into a digital dystopia.

Read more:

EDRi paper: Ban Biometric Mass Surveillance (13.05.2020)
https://edri.org/wp-content/uploads/2020/05/Paper-Ban-Biometric-Mass-Surveillance.pdf

Explainer: Ban Biometric Mass Surveillance (13.05.2020)
https://edri.org/wp-content/uploads/2020/05/Explainer-Ban-Biometric-Mass-Surveillance.pdf

EDRi calls for fundamental rights-based responses to COVID-19 (20.03.2020)
https://edri.org/covid19-edri-coronavirus-fundamentalrights/

Emergency responses to COVID-19 must not extend beyond the crisis (15.04.2020)
https://edri.org/emergency-responses-to-covid-19-must-not-extend-beyond-the-crisis/

COVID-19 & Digital Rights: Document Pool (04.05.2020)
https://edri.org/covid-19-digital-rights-document-pool/

04 May 2020

COVID-19 & Digital Rights: Document Pool

By EDRi

The Coronavirus (COVID-19) pandemic poses a global public health challenge of unprecedented proportions. In order to tackle it, countries around the world need to engage in coordinated, evidence-based responses grounded in solidarity, support and respect for human rights. This means that measures cannot lead to disproportionate and unnecessary actions. It is also vital that measures are not extended once we are no longer in a state of emergency. Otherwise, the actions taken under exceptional circumstances today can have significant repercussions on human rights both today and tomorrow.

In this document pool we will be listing relevant articles and documents related to the intersection of the COVID-19 crisis and digital rights. This will allow you to follow the developments of surveillance measures, content moderation, tracking and privacy-threatening actions in Europe as they relate to the coronavirus pandemic, as well as offer the set of perspectives and recommendations put forth by a host of digital rights watchdog organisations across Europe and the world. The document pool is updated regularly to ensure the delivery of the most up-to-date information.

  1. EDRi’s Analysis and Recommendations
  2. EDRi’s Articles, blog posts and press releases
  3. Mapping Exercises
  4. Official EU Documents
  5. Other Useful Resources

1. EDRi’s Analysis and Recommendations

Official EDRi statement on COVID-19 and Digital Rights

EDRi Members’ Responses and Recommendations on COVID-19

Analysing Tracking & Tracing Apps


2. EDRi’s Articles, blog posts and press releases

EDRi Reporting

#COVIDTech – An EDRi Blog Series


3. Mapping Exercises

EDRi Members Mapping

Other Mapping Exercises


4. Official EU Documents


5. Other Useful Resources

With huge thanks to the individuals and organisations across the EDRi network who have shared resources for this document pool.

29 Apr 2020

#WhoReallyTargetsYou: DSA and political microtargeting

By Panoptykon Foundation

Europe is about to overhaul its 20-year-old e-Commerce Directive and it is a once-in-a-decade chance to correct the power imbalance between platforms and users. As part of this update, the Digital Services Act (DSA) must address the issue of political microtargeting (PMT).

Microtargeting, and PMT in particular, has the alarming power to derail democracy, and should be regulated. According to self-assessment reports, political advertisers spent €31 million on Facebook and only €5 million on Google between March and September 2019 (excluding the UK). Facebook’s role in developing and targeting adverts goes far beyond that of a simple presentation medium: its tools for optimising ad delivery, targeting audiences and defining delivery criteria are far beyond the capacity of most political parties alone. A detailed report by Panoptykon and partners, based on data collected during two Polish election campaigns in 2019, sheds critical light on the role of the company, and what it reveals is extremely informative:

The study found that the transparency and control tools Facebook offers to researchers and users to explain how ad targeting works are “insufficient and superficial”. Users are targeted by Facebook’s algorithm on the basis of potentially thousands of distinct selectors, following a set of criteria that only the company knows. Advertisers on Facebook can opt to select audiences on obvious factors such as age, gender, language spoken and location. But the Facebook machine also steers them towards increasingly narrow criteria such as interests (political affiliation, sexual orientation, musical tastes, etc.), “life events” and behaviour, as well as more than 250,000 free-text attributes – including, for example, “Adult Children of Alcoholics” or “Cancer Awareness” – which constitute a deeper privacy concern.

Facebook is not merely a passive intermediary; its algorithms interpret the criteria selected by advertisers and deliver ads in a way that fulfils advertisers’ objectives, and actively curate the content that users see in their timelines based on those assumptions. In 2016, the company introduced a feature allowing advertisers to target “lookalikes” – profiles similar to a target audience. It also offers A/B testing, so advertisers can compare which ads are more effective.

But Facebook’s “why am I seeing this ad?” transparency tool can be misleading, revealing only the “lowest common denominator” attribute. For example, according to the report, during the European election campaign in Poland in May 2019, a person who was pregnant saw a political ad referring to prenatal screenings and perinatal care. “Why am I seeing this ad?” informed her that she was targeted because she was interested in “medicine” (potential reach of 668 million) rather than “pregnancy” (potential reach of 316 million). Users can only verify (check, delete, or correct) a short list of interests that the platform is willing to reveal.
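The “lowest common denominator” pattern described in the report can be illustrated with a minimal sketch: of all the attributes an ad actually matched on, the tool discloses only the one with the widest potential reach, which is typically the least revealing. This is purely illustrative, not Facebook’s actual code; the reach figures are the two examples quoted above.

```python
# Illustrative sketch (not Facebook's actual logic): the transparency tool
# reveals only the matched attribute with the largest potential reach,
# hiding narrower, more sensitive selectors from the user.

def disclosed_reason(matched_attributes):
    """Return the matched attribute with the widest potential reach."""
    return max(matched_attributes, key=lambda a: a["potential_reach"])

# Attributes the ad matched on, per the example in the report:
matched = [
    {"interest": "pregnancy", "potential_reach": 316_000_000},
    {"interest": "medicine", "potential_reach": 668_000_000},
]

print(disclosed_reason(matched)["interest"])  # -> medicine
```

Under this logic, the user is told she was targeted for “medicine”, while the far more sensitive “pregnancy” selector stays hidden.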

Here is where upcoming regulation comes into play: At the very least, the Digital Services Act should prohibit PMT based on characteristics which expose our mental or physical vulnerabilities (e.g. depression, anxiety, addiction, illness). But if the EU wants to be ambitious and tackle many of the problems associated with the current business model, the DSA should go further and regulate any sort of advertising aimed at profiling users, particularly as there appears to be a gap between ads labelled as “political” by the platform, and ads perceived as political by researchers.

Regulating targeted ads, requiring greater transparency for researchers and users, opt-in rather than opt-out, tighter requirements for political advertising and recognising PMT as an application of AI that poses serious risks for human rights will not solve all the problems of political disinformation in society, but they would certainly eliminate some of the worst practices today.

Read more:

Who (really) targets you? Facebook in Polish election campaigns
https://panoptykon.org/political-ads

Annual self-assessment reports of signatories to the Code of Practice on Disinformation 2019 (29.10.2019)
https://ec.europa.eu/digital-single-market/en/news/annual-self-assessment-reports-signatories-code-practice-disinformation-2019

(Contribution by Karolina Iwańska, from EDRi member Panoptykon)

29 Apr 2020

Member in the spotlight: Homo Digitalis

By EDRi

This is the tenth article of the series “EDRi member in the spotlight”, in which our members introduce themselves and their work in an in-depth interview.

Today we introduce our Greek member: Homo Digitalis.

1. Who are you and what is your organisation’s goal and mission?

Homo Digitalis is the only digital rights civil society organisation in Greece. Our goal is the protection of human rights and freedoms in the digital age. We strive to influence legislators and policy makers on a national level, and to raise awareness amongst the people of Greece regarding digital rights issues. Moreover, when digital rights are jeopardised by public or private actors, we carry out investigations, conduct studies and take legal action.

2. How did it all begin, and how did your organisation develop its work?

Homo Digitalis was founded in 2018 by six tech lawyers with a strong passion for the protection and promotion of digital rights. No digital rights organisation existed in Greece before, so we wanted to create an organisation that could bring like-minded people together and shake things up. After two years of voluntary work, we have managed to grow into an organisation with more than 100 members, who bring together a wide variety of disciplines, such as law, computer science, the humanities and the social sciences.

We aim to transform Homo Digitalis from an organisation based on voluntary work into a strong watchdog with a long-term strategy and full-time staff. It will be a long and difficult path, but we have started acquiring our first grants and are confident that we will grow, gaining more recognition and support for us and our vision.

3. The biggest opportunity created by advancements in information and communication technology is…

…facilitating access to information all around the globe, and building bridges between people. These advancements constitute a driver for positive change in our societies, and could lead to enhanced equality and transparency.

4. The biggest threat created by advancements in information and communication technology is…

…mass surveillance of our societies and power asymmetry in the information economy.

5. Which are the biggest victories/successes/achievements of your organisation?

Becoming a full member of EDRi is certainly a great success of Homo Digitalis so far!

Additionally, Homo Digitalis has achieved important accomplishments over the last two years. We have increased public awareness of digital rights issues by generating media interest in our actions, visiting educational institutions, participating in events and campaigns, and giving talks all around Greece. Moreover, we were instrumental in influencing the public debate around data protection reform in Greece by cooperating with relevant stakeholders, and by filing complaints and requests before EU and national authorities.

Also, through access-to-information requests, complaints and investigations, we have attained a high level of scrutiny of projects on technology-led policing and border management activities in Greece. In addition, we have collaborated with investigative journalists to reveal important facts. Even though we are an organisation based solely on volunteers, we do our best to respond quickly to the challenges that arise.

Furthermore, we have been fortunate enough to participate shoulder to shoulder with powerful digital rights organisations in EU-wide projects and campaigns and to learn from their expertise and knowledge. Finally, we also had the great opportunity to present our views and opinions in important fora, such as the UN Human Rights Council 39th session in Geneva or the European Parliament in Brussels.

All these accomplishments over the last two years give us the strength to continue our work towards the protection and promotion of human rights in the digital age.

6. If your organisation could now change one thing in your country, what would that be?

Active participation of people in collective activities such as digital rights activism. If individuals could devote a part of their knowledge and time to such activities, we would have a stronger voice to influence policy makers and legislators towards political decisions that respect our rights and freedoms rather than violate them.

7. What is the biggest challenge your organisation is currently facing in your country?

After 10 years of financial crisis and austerity measures in Greece that limited public spending, we have witnessed in recent years an increase in funds for technology-led policing and border management projects. Thus, we must remain vigilant in order to challenge and fight back against the deployment of intrusive tools and technologies that limit our rights and freedoms.

8. How can one get in touch with you if they want to help as a volunteer, or donate to support your work?

You can visit our website to help us as a volunteer or to donate and support our work.

Also, we always appreciate a good conversation, so feel free to reach out to info@homodigitalis.gr. Last but not least, you can subscribe to our newsletter here.

Read more:

EDRi member in the spotlight series
https://edri.org/member-in-the-spotlight/

Join Homo Digitalis as member/supporter/volunteer
https://www.homodigitalis.gr/en/join-us

Donate to Homo Digitalis
https://www.homodigitalis.gr/en/donations/help-us-grow

29 Apr 2020

Why COVID-19 is a Crisis for Digital Rights

By Guest author

The COVID-19 pandemic has triggered an equally urgent digital rights crisis.

New measures being hurried in to curb the spread of the virus, from “biosurveillance” and online tracking to censorship, are potentially as world-changing as the disease itself. These changes aren’t necessarily temporary, either: once in place, many of them can’t be undone.

That’s why activists, civil society and the courts must carefully scrutinise questionable new measures, and make sure that – even amid a global panic – states are complying with international human rights law.

Human rights watchdog Amnesty International recently commented that human rights restrictions are spreading almost as quickly as coronavirus itself. Indeed, the fast-paced nature of the pandemic response has empowered governments to rush through new policies with little to no legal oversight.

There has already been a widespread absence of transparency and regulation when it comes to the rollout of these emergency measures, with many falling far short of international human rights standards.

Tensions between protecting public health and upholding people’s basic rights and liberties are rising. While it is of course necessary to put in place safeguards to slow the spread of the virus, it’s absolutely vital that these measures are balanced and proportionate.

Unfortunately, this isn’t always proving to be the case. What follows is an analysis of the impact of the COVID-19 pandemic on the key subset of policy areas related to digital rights:

a) The Rise of Biosurveillance

A panopticon world on a scale never seen before is quickly materialising.

“Biosurveillance”, which involves the tracking of people’s movements, communications and health data, has already become a buzzword used to describe certain worrying measures being deployed to contain the virus.

The means by which states, often aided by private companies, are monitoring their citizens are increasingly extensive: phone data, CCTV footage, temperature checkpoints, airline and railway bookings, credit card information, online shopping records, social media data, facial recognition, and sometimes even drones.

Private companies are exploiting the situation and offering rights-abusing products to states, purportedly to help them manage the impact of the pandemic. One Israeli spyware firm has developed a product it claims can track the spread of coronavirus by analysing two weeks’ worth of data from people’s personal phones, and subsequently matching it up with data about citizens’ movements obtained from national phone companies.

In some instances, citizens can also track each other’s movements, leading to not only vertical, but also horizontal sharing of sensitive medical data.

Not only are many of these measures unnecessary and disproportionately intrusive, they also give rise to secondary questions, such as: how secure is our data? How long will it be kept for? Is there transparency around how it is obtained and processed? Is it being shared or repurposed, and if so, with whom?

b) Censorship and Misinformation

Censorship is becoming rife, with many arguing that a “censorship pandemic” is surging in step with COVID-19.

Oppressive regimes are rapidly adopting “fake news” laws. This is ostensibly to curb the spread of misinformation about the virus, but in practice, this legislation is often used to crack down on dissenting voices or otherwise suppress free speech. In Cambodia, for example, there have already been at least 17 arrests of people for sharing information about coronavirus.

At the same time, many states have themselves been accused of feeding disinformation to their citizens to create confusion, or of arresting those who criticise the government’s response.

As well as this, some states have restricted free access to information on the virus, either by blocking access to health apps, or cutting off access to the internet altogether.

c) AI, Inequality and Control

The deployment of AI can have consequences for human rights at the best of times, but now, it’s regularly being adopted with minimal oversight and regulation.

AI and other automated learning technologies underpin many surveillance and social control tools. Because of the pandemic, they are increasingly being relied upon to fight misinformation online and to process the huge increase in applications for emergency social protection, which are, naturally, more urgent than ever.

Prior to the COVID-19 outbreak, the digital rights field had consistently warned about the human rights implications of these inscrutable “black boxes”, including their biased and discriminatory effects. The adoption of such technologies without proper oversight or consultation should be resisted and challenged through the courts, not least because of their potential to exacerbate the inequalities already experienced by those hardest hit by the pandemic.

d) Eroding Human Rights

Many of the human rights-violating measures that have been adopted to date are taken outside the framework of proper derogations from applicable human rights instruments, which would ensure that emergency measures are temporary, limited and supervised.

Legislation is being adopted by decree, without clear time limitations, and technology is being deployed in a context where clear rules and regulations are absent.

This is of great concern for two main reasons.

First, this type of “legislating through the back door” of measures that are not necessarily temporary avoids going through a proper democratic process of oversight and checks and balances, resulting in de facto authoritarian rule.

Second, if left unchecked and unchallenged, this could set a highly dangerous precedent for the future. This is the first pandemic we are experiencing at this scale – we are currently writing the playbook for global crises to come.

If it becomes clear that governments can use a global health emergency to introduce rights-infringing measures without being challenged, and without having to reverse those measures once the crisis has passed, we will essentially be handing authoritarian regimes a blank cheque to wait until the next pandemic to impose whatever measures they want.

Therefore, any and all measures that are not strictly necessary, sufficiently narrow in scope, and of a clearly defined temporary nature need to be challenged as a matter of urgency. If they are not, we will be unable to push back against an otherwise certain path towards a dystopian surveillance state.

e) Litigation: New Ways to Engage

In tandem with advocacy and policy efforts, we will need strategic litigation to challenge the most egregious measures through the court system. Going through the legislature alone will be too slow and, with public gatherings banned, public demonstrations will not be possible at scale.

The courts will need to adapt to the current situation – and are in the process of doing so – by offering new ways for litigants to engage. Courts are still hearing urgent matters, and questions concerning fundamental rights and our democratic system will fall within that remit. This has already been demonstrated by the first cases requesting oversight of government surveillance in response to the pandemic.

These issues have never been more pressing, and it’s abundantly clear that action must be taken.

If you want to read more on the subject, follow EDRi’s new series #COVIDTech here: https://edri.org/emergency-responses-to-covid-19-must-not-extend-beyond-the-crisis/

This article was originally published at: https://digitalfreedomfund.org/why-covid-19-is-a-crisis-for-digital-rights/

Read more:

Tracking the Global Response to COVID-19
https://privacyinternational.org/examples/tracking-global-response-covid-19

Russia: doctor who called for protective equipment detained (03.04.2020)
https://www.amnesty.org.uk/press-releases/russia-doctor-who-called-protective-equipment-detained

A project to demystify litigation and artificial intelligence (06.12.2019)
https://digitalfreedomfund.org/a-project-to-demystify-litigation-and-artificial-intelligence/

Making Accountability Real: Strategic Litigation (30.01.2020)
https://digitalfreedomfund.org/making-accountability-real-strategic-litigation/

Accessing Justice in the Age of AI (09.04.2020)
https://digitalfreedomfund.org/accessing-justice-in-the-age-of-ai/

(Contribution by Nani Jansen Reventlow, Digital Freedom Fund)

29 Apr 2020

Everything you need to know about the DSA

By Chloé Berthélémy

In her political guidelines, the President of the European Commission Ursula von der Leyen has committed to “upgrade the Union’s liability and safety rules for digital platforms, services and products, with a new Digital Services Act” (DSA). The upcoming DSA will revise the rules contained in the E-Commerce Directive of 2000 that affect how intermediaries regulate and influence user activity on their platforms, including people’s ability to exercise their rights and freedoms online. This is why reforming those rules has the potential to be either a big threat to fundamental rights or a major improvement of the current situation online. It is also an opportunity for the European Union to decide how central aspects of the internet will look in the coming ten years.

A public consultation by the European Commission is planned to be launched in May 2020 and legislative proposals are expected to be presented in the first quarter of 2021.

In the meantime, three different Committees of the European Parliament have announced or published Own Initiative Reports as well as Opinions in view of setting the agenda of what the DSA should regulate and how it should achieve its goals.

We have created a document pool in which we will be listing relevant articles and documents related to the DSA. This will allow you to follow the developments of content moderation and regulatory actions in Europe.

Read more:

Document pool: Digital Services Act (27.04.2020)
https://edri.org/digital-service-act-document-pool/

29 Apr 2020

Digital Services Act: Document pool

By EDRi

In her political guidelines, the President of the European Commission Ursula von der Leyen has committed to “upgrade the Union’s liability and safety rules for digital platforms, services and products, with a new Digital Services Act” (DSA). The upcoming DSA will revise the rules contained in the E-Commerce Directive of 2000 that affect how intermediaries regulate and influence user activity on their platforms, including people’s ability to exercise their rights and freedoms online. This is why reforming those rules has the potential to be either a big threat to fundamental rights or a major improvement of the current situation online. It is also an opportunity for the European Union to decide how central aspects of the internet will look in the coming ten years.

A public consultation by the European Commission is planned to be launched in May 2020 and legislative proposals are expected to be presented in the first quarter of 2021.

In the meantime, three different Committees of the European Parliament have announced or published Own Initiative Reports as well as Opinions in view of setting the agenda of what the DSA should regulate and how it should achieve its goals.

In this document pool we will be listing relevant articles and documents related to the DSA. This will allow you to follow the developments of content moderation and regulatory actions in Europe.

Last update: 27 April 2020

Table of contents

EDRi’s analysis and recommendations
Legislative documents
Blogposts and press releases
EDRi members’ publications
Key policymakers
Key dates


EDRi’s analysis and recommendations


Legislative documents

European Commission

  • Public consultation announced for May 2020
  • Legislative proposals announced for Q1/2021

European Parliament


EDRi’s blogposts and press releases


EDRi members’ publications


Key policymakers


Key dates

  • European Commission’s consultation: May 2020
  • European Commission’s legislative proposal: Q1 2021
28 Apr 2020

COVID-19: A Commission hitchhiker’s tech guide to the App Store

By EDRi

“We’re being asked what do we want these systems to look like. If we don’t make the decision it will be made for us (…) This virus will pass, but the measures will last”

Edward Snowden

According to the World Health Organisation (WHO), closely watching contacts during a pandemic “will prevent further transmission of the virus”. In response to the COVID-19 crisis, many technical responses (or acts of techno-solutionism) arose shortly after the pandemic was declared by the WHO. Contact-tracing applications are one of the most notable solutions brought forward, and currently occupy the centre of the public debate in Europe.

Whether contact-tracing technology will help or not, however, is still contested. Technology is not a silver bullet, as Carly Kind, director of the AI research institute Ada Lovelace Institute, puts it. Moreover, Dr. Michael Ryan, a key advisor for the WHO, warned that “when collecting information on citizens or tracking their movements there are always serious data protection and human rights principles involved”. Several voices in the EDRi community also question whether the risks of using apps may outweigh the benefits (La Quadrature du Net) and whether apps are just “we-did-something” political responses (FIPR – Ross Anderson).

That said, if apps (and technology in general) are proven to be useful in any significant way, they need to fully protect fundamental rights, since the risks created by these technologies could outlast the pandemic itself.

European Digital Rights, as the voice of 44 organisations working to advance and uphold human rights in the digital space, warned early on of the potential problems that a rushed technological solution could create.

In reaction to the debate regarding the safeguards potential technical solutions must provide, the European Commission (EC) has published a toolbox and guidelines for ensuring data protection standards. The two instruments aim to guide the responses that Member States are already preparing nationally, sometimes in very different directions.

In this article, we aim to provide insight into the European Commission’s proposals and how they fit with civil society views on this subject.

A techie toolbox

“A fragmented and uncoordinated approach to contact tracing apps risks hampering the effectiveness of measures aimed at combating the COVID-19 crisis, whilst also causing adverse effects to the single market and to fundamental rights and freedoms”

European Commission Common EU Toolbox for Member States

The EC argued for the need for a toolbox, as national authorities are developing mobile applications (apps) to monitor and mitigate the COVID-19 pandemic. The Commission agrees that contact tracing, usually done manually by public health authorities, is a time-consuming process, and that the “promising” technology – and apps in particular – could be a useful tool for Member States.

However, the EC points out that, in order for apps to be effective, they need to be adopted by 60-75% of the population – a very high threshold for a voluntary app. By comparison, in the much-discussed case of Singapore, only 20% of the population downloaded the app.

The toolbox calls for a series of concrete requirements for these apps: they must be interoperable (apps must work well with each other in order to trace transnational cases), voluntary, approved by the national health authorities, privacy-preserving, and dismantled as soon as they are no longer needed.

The time principle was a key point in our statement laying out fundamental rights-based recommendations for COVID-19 responses. On apps in particular, EDRi member Access Now advocates that access to health data shall be limited to those who need the information to conduct treatment, research, and otherwise address the crisis. Finally, EDRi members Chaos Computer Club (CCC), Free Software Foundation Europe (FSFE) and noyb are among those that agree on the need for the apps to be voluntary.

Decentralised or centralised, that is the question

The Toolbox describes two categories of apps: those that operate via decentralised processing of personal data, which would be stored only on a person’s own device; and those operating via a centralised back-end server which would collect the data. The EC argues that this data should be reduced to the “absolute minimum” necessary, with technical requirements compiled by ENISA (encryption, communications security, user authentication…), and that “preferably” the Member State should be the controller for the processing of personal data. The Annexes list key recommendations, background information on contact tracing, background on symptom checker functionalities and an inventory of existing mobile solutions against COVID-19.
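To see why the decentralised model keeps data on the person’s own device, consider a minimal DP-3T-style sketch in Python. This is an illustration only, not the Commission’s or any real project’s specification: the function names, the 32-byte daily key and the 96 broadcast slots per day are all assumptions made for this example.

```python
import hashlib
import hmac
import secrets

def daily_key() -> bytes:
    # A fresh random secret generated on the phone each day; it never leaves
    # the device unless the user tests positive and consents to publish it.
    return secrets.token_bytes(32)

def ephemeral_ids(key: bytes, slots: int = 96) -> list:
    # Derive short-lived broadcast identifiers from the daily key. Observers
    # who overhear them cannot link them to each other without the key.
    return [hmac.new(key, str(i).encode(), hashlib.sha256).digest()[:16]
            for i in range(slots)]

# Device A broadcasts its ephemeral IDs over Bluetooth; device B simply
# stores whatever it overhears, locally.
key_a = daily_key()
heard_by_b = set(ephemeral_ids(key_a))

# If A later tests positive, A publishes key_a. B re-derives the IDs on its
# own device and checks for overlap - no central server ever sees B's data.
matches = heard_by_b & set(ephemeral_ids(key_a))
print(len(matches))  # prints 96: B was near A in every slot of this toy run
```

In the centralised variant, by contrast, the observed identifiers would be uploaded to a back-end server for matching, which is why groups like CCC argue the decentralised design better satisfies the data minimisation principle.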

Our member noyb agrees with the Commission’s requirement for strong encryption, an essential element of secure technologies for which we have also advocated before. Moreover, EDRi member CCC favours the decentralised option over a centralised one, along with strong communication security and privacy requirements.

Readers who liked the Toolbox… also liked the Guidelines

People must have the certainty that compliance with fundamental rights is ensured and that the apps will be used only for the specifically-defined purposes, that they will not be used for mass surveillance, and that individuals will remain in control of their data.

European Commission Guidance on Apps supporting the fight against COVID 19 pandemic in relation to data protection

The Commission guidance summarises some of the key points of the Toolbox but provides more insight into some of the features, as well as details on ensuring data protection and privacy safeguards. The guidance focuses on apps which are voluntary and which offer one or more functionalities: providing accurate information to individuals about the pandemic, or providing questionnaires for self-assessment and guidance for individuals (symptom checker). Other functionalities could include alerting individuals if they have been in close contact with an infected person (contact tracing and warning functionality) and/or providing means of communication between patients and doctors.

The guidance relies heavily on references to the ePrivacy Directive (which EU Member States have blocked from becoming an updated Regulation for 3 years and 4 months) and the General Data Protection Regulation (GDPR). The references include data minimisation, purpose limitation, time limitation (apps deactivated after the pandemic is over) and state-of-the-art security protections.

Our member Access Now has thoroughly gone through the data protection and privacy requirements of purpose limitation, data minimisation and time limitation, largely coinciding with the Commission, while Bits of Freedom has also mentioned the minimal use of data needed and time limitation, in addition to the apps being based on scientific insight and demonstrable effectiveness.

Location data is not necessary, decentralisation is

The Commission states that location data is not necessary for the purpose of contact tracing functionalities, that it would even be “difficult to justify in light of the principle of data minimisation”, and that it can create “security and privacy issues”. Regarding the centralisation vs decentralisation debate, the Commission believes that decentralisation is more in line with the minimisation principle and that, as Bits of Freedom, CCC and many other groups have suggested, only “health authorities should have access to proximity data [which should be encrypted]” – and therefore no law enforcement agencies can access the data. What about the well-intended but risky use of data for “statistics and scientific research”? The Commission says no, unless it is necessary, included in the general list of purposes and clearly communicated to users.

Get us some open code. And add good-old strong encryption to go with it, please

The Commission asks for the source code to be made public and available for review. In addition, the Commission calls for the use of encryption when transmitting the data to national health authorities, if that is one of the functionalities. Both of these conclusions have been among the key requests from EDRi members such as FSFE, for transparency and security purposes but also as an appeal for solidarity. We consider the call for openness a positive request from the Commission.

Finally, the guidance brings back the forgotten Data Protection Authorities (DPAs) who, as we have also suggested, should be the ones consulted and fully involved when developing and implementing the apps.

Moving forward

There are many uncertainties regarding the current pandemic, especially regarding whether any technical solution will help or not. Furthermore, it is unclear how these technologies should be designed, developed and deployed in order to avoid mass surveillance of citizens, stigmatisation of those who are sick, and reinforced discrimination against people living in poverty, people of colour and other individuals from groups at risk who are already disproportionately affected by the pandemic.

The voices of experts and civil society must be taken into consideration before taking the road of an endless “war on the virus” that normalises mass surveillance. If technologies are proven to be genuinely helpful in combating this crisis, technological solutions need to comply with very strong core principles. Many of these strong principles are already present in the Commission’s two documents and in many of the civil society views in this ongoing debate.

In the meantime, strong public health systems, strong human rights protections (including extra protections for key workers), a patent system that puts humans at its core, and open access to scientific knowledge are key principles that should be implemented now.

Read more:

Press Release: EDRi calls for fundamental rights-based responses to COVID-19 (01.04.2020)
https://edri.org/edri-calls-for-fundamental-rights-based-responses-to-covid-19/

noyb: Active overview of projects using personal data to combat SARS-CoV-2
https://gdprhub.eu/index.php?title=Data_Protection_under_SARS-CoV-2

Privacy International: Extraordinary powers need extraordinary protections (20.03.2020)
https://privacyinternational.org/news-analysis/3461/extraordinary-powers-need-extraordinary-protections

Access Now: Protect digital rights, promote public health: toward a better coronavirus response (05.03.2020)
https://www.accessnow.org/protect-digital-rights-promote-public-health-towards-a-better-coronavirus-response/

Ada Lovelace Institute: Exit through the App Store? (20.04.2020)
https://www.adalovelaceinstitute.org/wp-content/uploads/2020/04/Ada-Lovelace-Institute-Rapid-Evidence-Review-Exit-through-the-App-Store-April-2020-1.pdf

European Commission: Mobile applications to support contact tracing in the EU’s fight against COVID-19 – Common EU Toolbox for Member States (15.04.2020)
https://ec.europa.eu/health/sites/health/files/ehealth/docs/covid-19_apps_en.pdf

European Commission (Communication): Guidance on Apps supporting the fight against COVID 19 pandemic in relation to data protection (16.04.2020)
https://ec.europa.eu/info/sites/info/files/5_en_act_part1_v3.pdf

22 Apr 2020

EDRi is looking for a Communications and Media Manager (Permanent position)

By EDRi

European Digital Rights (EDRi) is an international not-for-profit association of 42 digital rights organisations from across Europe and beyond. We advocate for robust and enforced laws, inform and mobilise people, promote a healthy and accountable technology market and build a movement of organisations and individuals committed to digital rights and freedoms in a connected world.

EDRi is looking for an experienced Communications and Media Manager to join EDRi’s team in Brussels. This is a unique opportunity to help shape and lead on the communications of a well-respected network of NGOs at a time of numerous challenges to our rights and freedoms in the digital age. The deadline to apply is 22 May 2020. This is a full-time, permanent position and the start date is expected to be July 2020.

The Communications and Media Manager leads and is responsible for EDRi’s strategic communications and engagement with the media. We are looking for an individual that will bring a strategic and creative mindset to communicate about complex human rights and technology issues in a diverse, fast-changing political environment. The successful candidate will have a strong track record in working with European and national media, excellent storytelling and drafting skills, as well as the ability to establish a network of journalists.

We are an equal opportunities employer with a strong commitment to transparency and inclusion. We strive to have a diverse and inclusive working environment. We encourage individual members of groups at risk of discrimination to apply for this post.

Job title: Communications and Media Manager
Start date (expected): July 2020
Reports to: Executive Director
Location: EDRi Office, Brussels, Belgium

RESPONSIBILITIES:
As Communications and Media Manager, working closely with EDRi’s Policy, Campaigns and Network colleagues, you will:

  • Develop EDRi’s communications and media engagement long term strategies and short term plans;
  • Promote EDRi’s work and narrative to the media, raising the profile of the EDRi network, including by helping to disseminate their work;
  • Establish and maintain a robust network of media contacts and relationships;
  • Support the production and secure the publication of EDRi op-eds and other materials in leading outlets;
  • Write, edit and send press releases and manage media inquiries;
  • Support the production and editing of content for EDRi’s website and of the EDRi-gram bi-monthly newsletter;
  • Oversee EDRi’s social media presence;
  • Oversee the editing and design of EDRi’s publications;
  • Analyse media coverage and contribute to reporting on EDRi’s media exposure and communication work.

QUALIFICATIONS AND EXPERIENCE:

  • Minimum 3 years of relevant experience in a similar role;
  • A university degree in journalism, communications, media studies, EU affairs, public relations or related field or equivalent experience;
  • Demonstrable knowledge of the European media landscape and of European institutions, and an interest in communicating about and framing of human rights, in particular privacy, surveillance and law enforcement, freedom of expression, as well as other internet policy issues;
  • Experience in media relations, leading networks of journalists and creating networks of influence;
  • Exceptional writing skills in particular for op-eds and press releases;
  • Ability to create visuals for social media, work on simple graphic designs and formatting;
  • Strong multitasking abilities and ability to manage multiple deadlines;
  • Experience of working with and in small teams;
  • Excellent level of English;
  • Knowledge of another European language is an advantage;
  • Knowledge of database management systems, mailing list and free and open software is an advantage.

HOW TO APPLY:
To apply, please send a maximum one-page cover letter and a maximum two-page CV in English (including two professional references) and in .pdf format to applications(at)edri.org by 22 May 2020.

Please note that only shortlisted candidates will be contacted.

15 Apr 2020

COVID-Tech: Emergency responses to COVID-19 must not extend beyond the crisis

By Ella Jakubowska

In EDRi’s new series on COVID-19, we will explore the critical principles for protecting fundamental rights while curtailing the spread of the virus, as outlined in the EDRi network’s statement on the virus. Each post in this series will tackle a specific issue at the intersection of digital rights and the global pandemic in order to explore broader questions about how to protect fundamental rights in a time of crisis. In our statement, we emphasised the principle that states must “[i]mplement exceptional measures only for the duration of the crisis”. In this first post of the series, we take a look at what experiences in the UK, Poland and Hungary could teach states as they work out the most effective ways of stopping the spread of coronavirus – without leaving the door ajar for future fundamental rights violations to creep in.

Public health responses to the coronavirus pandemic are putting unprecedented limits on the daily lives of people across the world. A range of important rights, including to life and health, are of course threatened by COVID-19 – but the responses that we are seeing across Europe also threaten fundamental liberties and freedoms, both freedoms to do things (express ourselves, associate in public) and freedom from things (government surveillance, abuses of power, discrimination). In some cases, fundamental rights in the EU are not just under threat, but are already being unjustifiably, disproportionately and unlawfully violated under the guise of public health.

The state of play in Hungary:

On 30 March 2020, the Prime Minister of Hungary was granted sweeping powers to rule the country by decree. Hungary has been under the EU’s spotlight over the last two years for failing to comply with the EU’s core values, with the European Parliament launching infringement proceedings about the deteriorating respect for the rule of law, and civil society raising serious concerns including lack of respect for privacy and data protection, evidence of widespread state surveillance, and infringements on freedom of expression online. Following the enactment of a state of emergency on 11 March and the tabling of the indefinite decree on 23 March, ostensibly in response to the coronavirus pandemic, EU Parliament representatives issued a clear warning to Hungary, stating that “extraordinary measure[s] adopted by the Hungarian government in response to the pandemic must respect the EU’s founding values.” This warning did little to temper Orbán’s ambitions, and leaves the people of Hungary vulnerable to expanded powers which can be abused long after the spread of coronavirus has been checked.

The state of play in Poland:

The European Parliament has also raised concerns about the worsening rule of law in Poland, in particular threats to the independence of the judiciary, with investigations activated in 2016. On 19 March 2020, the country’s efforts to tackle the spread of coronavirus received widespread attention when the government announced the use of a ‘Civil Quarantine’ app which, they explained, would require people in quarantine to send geo-located selfies within 20 minutes of receiving an alert – or face a visit from the police. According to the announcement, the app even uses controversial facial recognition technology to scan the selfies.

Early in April, the Polish government looked to make the use of the app mandatory, in a move which, as reported by EDRi member Panoptykon, was not proportionate [PL] (due to factors like people’s images being sent to government servers) and additionally not compliant with Poland’s constitution [PL]. Panoptykon emphasise important rules [PL] when implementing technological applications to combat COVID-19 such as minimising the data collected and having strict time periods for its retention, which states must follow in order to comply with fundamental rights.

The state of play in the UK:

The UK’s Coronavirus Act was passed on 25 March 2020, giving the UK government a suite of extraordinary powers for a period of two years. Following pressure from civil society, who called the proposed Bill “draconian”, the disproportionately long period for the restriction of people’s rights was adjusted to include parliamentary checks every 6 months. Yet NGOs have continued to question why the Bill is not up for renewal every 30 days, given the enormous potential for abuse of power when people’s fundamental rights protections are suspended or reduced. This is especially important given that, as EDRi member Privacy International has pointed out, the UK already has worryingly wide surveillance powers, including bulk data interception and retention.

The UK has also come under fire for the sharp rise in disproportionate police responses since the introduction of the Bill, including stopping people from using their own gardens or using drones to chastise dog walkers. If not properly limited by law, these powers (and their abuse) have the potential to continue in ordinary times, further feeding the government’s surveillance machine.

POLITICO reports that UK authorities are not alone, with countries across Europe exploiting the climate of fear to encourage people to spy on and report their neighbours, alongside a rise in vigilante attacks, public shamings and even support for police violence. Such behaviour indicates an increasingly hostile, undemocratic and extra-judicial way of enforcing lockdowns. And it is frighteningly reminiscent of some of the most brutal, repressive twentieth-century police states.

What an open door could mean for the future of digital rights:

Allowing states to dispense with the rule of law in times of crisis risks putting those rights in a position of vulnerability in ordinary times, too. The legitimation and normalisation of surveillance infrastructures creates a sense that being watched and analysed all the time is normal (it is not) and contributes to societies filled with suspicion, abuse and mistrust. Before coronavirus was a household name, for example, Hungary’s secretive Szitakötő project was preparing to install 35,000 facial recognition cameras across the country for mass surveillance. This would allow the state to undermine the principle of investigatory warrants, and instead watch and analyse everyone, all the time. The current, indefinite state of emergency could remove any potential checks and balances on the Szitakötő plans, and allow Orbán to push through a wide range of increasingly rights-violating measures, such as repression of the free media, freedom of expression and political dissent.

Throughout 2019, Poland made its aspirations for global AI leadership clear, having experimented with automating state welfare since at least 2015. As UN Special Rapporteur Philip Alston has emphasised, the global uptake of “automated welfare” is a direct result of government goals to spend less on welfare, control people’s behaviour and punish those who do not conform. There is a risk that the Polish state could exploit new tech, like its quarantine app, to expand an undemocratic agenda and make technology the go-to solution for any public or societal issue. And the UK is already infamous for having one of the highest rates of surveillance cameras per capita in the world. Combined with the fact that the UK’s health service has enlisted the help of notorious personal-data-exploiting software company Palantir to manage coronavirus data, this suggests that the UK’s pre-existing public-private surveillance economy is the one area profiting from this crisis.

Conclusion:

Desperate times call for desperate measures – or so the saying goes. But this should not undermine the core values of our societies, especially when we have many reasons to be positive: compassionate and brave health workers treating patients; civil society working to protect rights in coronavirus apps and help governments make the right decisions; and global health organisations working to prevent future incidences of the virus and develop vaccines.

As the European Parliament’s Civil Liberties Committee has stated, mass surveillance does not make us safer. It puts undue limits on our liberties and rights which will continue long after the current emergency has eased. As a result, we will all be less safe – and that really would be desperate times. In the words of Yuval Noah Harari:

[T]emporary measures have a nasty habit of outlasting emergencies, especially as there is always a new emergency lurking on the horizon […] Centralised monitoring and harsh punishments aren’t the only way to make people comply with beneficial guidelines. […] A self-motivated and well-informed population is usually far more powerful and effective than a policed, ignorant population.

Read more:
Fundamental rights implications of COVID-19 (various dates)
https://fra.europa.eu/en/themes/covid-1

Extraordinary powers need extraordinary protections (20.03.2020)
https://privacyinternational.org/news-analysis/3461/extraordinary-powers-need-extraordinary-protections

Use of smartphone data to manage COVID-19 must respect EU data protection rules (07.04.2020)
https://www.europarl.europa.eu/news/en/press-room/20200406IPR76604/use-of-smartphone-data-to-manage-covid-19-must-respect-eu-data-protection-rules

Contact Tracing in the Real World (12.04.2020)
https://www.lightbluetouchpaper.org/2020/04/12/contact-tracing-in-the-real-world/

(Contribution by Ella Jakubowska, EDRi Policy Intern)
