Europe is about to overhaul its 20-year-old e-Commerce Directive and it is a once-in-a-decade chance to correct the power imbalance between platforms and users. As part of this update, the Digital Services Act (DSA) must address the issue of political microtargeting (PMT).
Microtargeting, and PMT in particular, has the alarming power to derail democracy, and it should be regulated. According to the platforms' own self-assessment reports, between March and September 2019 political advertisers spent €31 million on Facebook (excluding the UK) and only €5 million on Google. Facebook's role in developing and delivering targeted adverts goes far beyond that of a simple presentation medium: its tools for optimising ad delivery, targeting audiences and defining delivery criteria offer capabilities that most political parties could never build on their own. A detailed report by Panoptykon and partners, based on data collected during two Polish election campaigns in 2019, sheds critical light on the company's role, and what it revealed is extremely informative:
The study found that the transparency and control tools Facebook offers to researchers and users to explain how ad targeting works are “insufficient and superficial.” Users are targeted by Facebook’s algorithm based on potentially thousands of distinct selectors, following a set of criteria that only the company knows. Advertisers on Facebook can select audiences on obvious factors such as age, gender, language spoken and location. But the Facebook machine also steers them towards increasingly narrow criteria such as interests (political affiliation, sexual orientation, musical tastes, etc.), “life events” and behaviour, as well as more than 250,000 free-text attributes including, for example, “Adult Children of Alcoholics” or “Cancer Awareness”, which raise deeper privacy concerns.
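To make the mechanics concrete, here is a minimal sketch in Python of the kind of layered audience definition the report describes. Every field name here is hypothetical; this is not Facebook’s actual advertising API, only an illustration of how criteria narrow from broad demographics down to sensitive free-text attributes.

```python
# Hypothetical audience definition: field names are illustrative only,
# not Facebook's real API. The layering mirrors the report's description.
audience = {
    # Broad, "obvious" selectors the advertiser picks explicitly
    "age_range": (25, 45),
    "languages": ["pl"],
    "locations": ["Warsaw, Poland"],
    # Narrower inferred categories the platform steers advertisers towards
    "interests": ["political party X", "pregnancy"],
    # Free-text attributes: the report counts more than 250,000 of these
    "free_text_attributes": ["Adult Children of Alcoholics", "Cancer Awareness"],
}

def matches(profile: dict, audience: dict) -> bool:
    """Naive matching: a profile qualifies if it satisfies every layer."""
    age_ok = audience["age_range"][0] <= profile["age"] <= audience["age_range"][1]
    interest_ok = any(i in profile["inferred_interests"] for i in audience["interests"])
    attr_ok = any(a in profile["inferred_attributes"] for a in audience["free_text_attributes"])
    return age_ok and interest_ok and attr_ok

profile = {"age": 31,
           "inferred_interests": {"pregnancy"},
           "inferred_attributes": {"Cancer Awareness"}}
print(matches(profile, audience))  # True
```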
Facebook is not merely a passive intermediary: its algorithms interpret the criteria selected by advertisers, deliver ads in a way that fulfils advertisers’ objectives, and actively curate the content that users see in their timelines based on those assumptions. In 2016, the company introduced a feature allowing advertisers to target “lookalikes”: profiles similar to an existing target audience. It also supports A/B testing, so advertisers can compare which ads are more effective.
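The underlying idea of lookalike targeting can be sketched in a few lines: represent each user as a vector of inferred attributes and rank other profiles by their similarity to a seed audience. Facebook’s actual model is proprietary; the cosine-similarity approach below is an assumption used purely to illustrate the general technique.

```python
# Illustrative "lookalike" ranking over a tiny, invented attribute space.
from math import sqrt

ATTRIBUTES = ["medicine", "pregnancy", "party_x", "hiking"]

def vectorise(interests: set[str]) -> list[int]:
    return [1 if a in interests else 0 for a in ATTRIBUTES]

def cosine(u, v) -> float:
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def lookalikes(seed: list[set[str]], pool: dict[str, set[str]], k: int = 2):
    # Average the seed audience into a single "centroid" profile,
    # then rank everyone else by similarity to it.
    centroid = [sum(col) / len(seed) for col in zip(*(vectorise(s) for s in seed))]
    ranked = sorted(pool, key=lambda uid: cosine(vectorise(pool[uid]), centroid),
                    reverse=True)
    return ranked[:k]

seed_audience = [{"pregnancy", "medicine"}, {"pregnancy", "party_x"}]
candidates = {"u1": {"medicine"}, "u2": {"hiking"}, "u3": {"pregnancy", "hiking"}}
print(lookalikes(seed_audience, candidates))  # most similar profiles first
```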
But Facebook’s “why am I seeing this ad?” transparency tool can be misleading, revealing only the “lowest common denominator” attribute. For example, according to the report, during the May 2019 European elections campaign in Poland, a person who was pregnant saw a political ad referring to prenatal screenings and perinatal care. “Why am I seeing this ad?” informed her that she had been targeted because she was interested in “medicine” (potential reach of 668 million) rather than “pregnancy” (potential reach of 316 million). Users can only verify (check, delete or correct) the short list of interests that the platform is willing to reveal.
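The behaviour the report observed can be approximated as follows: of all the attributes that actually matched a user, the explanation surfaces only the one with the largest potential reach, which is also the least revealing. The reach figures below mirror the report’s example; the selection logic itself is an assumption about the observed behaviour, not Facebook’s documented code.

```python
# Sketch of "lowest common denominator" disclosure: show only the broadest
# matched attribute, hiding narrower, more sensitive criteria that also applied.
POTENTIAL_REACH = {
    "medicine": 668_000_000,   # figure cited in the report
    "pregnancy": 316_000_000,  # figure cited in the report
}

def explain_ad(matched_attributes: list[str]) -> str:
    broadest = max(matched_attributes, key=lambda a: POTENTIAL_REACH.get(a, 0))
    return f"You are seeing this ad because you are interested in '{broadest}'."

print(explain_ad(["pregnancy", "medicine"]))
# -> You are seeing this ad because you are interested in 'medicine'.
```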
Here is where the upcoming regulation comes into play: at the very least, the Digital Services Act should prohibit PMT based on characteristics that expose our mental or physical vulnerabilities (e.g. depression, anxiety, addiction, illness). But if the EU wants to be ambitious and tackle many of the problems associated with the current business model, the DSA should go further and regulate any sort of advertising that relies on profiling users, particularly as there appears to be a gap between ads labelled as “political” by the platform and ads perceived as political by researchers.
Regulating targeted ads, requiring greater transparency for researchers and users, making targeting opt-in rather than opt-out, tightening the requirements for political advertising and recognising PMT as an application of AI that poses serious risks to human rights will not solve all the problems of political disinformation in society, but these measures would certainly eliminate some of the worst practices seen today.
Read more:
Who (really) targets you? Facebook in Polish election campaigns
https://panoptykon.org/political-ads
Annual self-assessment reports of signatories to the Code of Practice on Disinformation 2019 (29.10.2019)
https://ec.europa.eu/digital-single-market/en/news/annual-self-assessment-reports-signatories-code-practice-disinformation-2019
(Contribution by Karolina Iwańska, from EDRi member Panoptykon)