Facebook’s New Data Use Policy (December 11th, 2012) – What Has Changed and What Do These Changes Mean?

On December 11th, 2012, Facebook published a revised version of its Data Use Policy. Comparing the new policy to the previous one, I have made three observations.

The length of the policy has further increased, from 8896 to 9379 words, an increase of 483 words. The longer a policy, the less likely it becomes that users read it in full and thereby take notice of how their data are being processed. The question is whether one can expect Facebook’s more than a billion users[1] to give unambiguous consent to a policy if reading it takes about an hour (see the estimate below). According to the European Union’s Data Protection Directive 95/46/EC, personal data may only be processed if “the data subject has unambiguously given his consent” (Art. 7(a)). In this context, “’the data subject’s consent’ shall mean any freely given specific and informed indication of his wishes by which the data subject signifies his agreement to personal data relating to him being processed” (Art. 2(h)). The basic question is whether it constitutes unambiguous consent when a user enters some basic data and then clicks a “sign up” button, above which s/he reads the text: “By clicking Sign Up, you agree to our Terms and that you have read our Data use policy, including our Cookie Use”. Observers such as the Norwegian Data Inspectorate (Årnes, Skorstad and Paarup Michelsen 2011) have expressed strong doubts in this context and ask whether pressing a button that says one agrees to certain terms constitutes consent under European data protection legislation: “– Is the Facebook consent not, at least partly, a retrospective one? – Is the user really ‘informed’ after having read the Policy and the Statement? – What if he or she didn’t read the statement at all?” (Årnes, Skorstad and Paarup Michelsen 2011, 40).
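
To make the word count tangible, here is a rough estimate of the reading time. The reading speeds are assumptions, not measured values: adults typically read ordinary expository prose at about 200 words per minute, and dense legal text more slowly, at perhaps 150 words per minute:

\[ 9379 / 200 \approx 47 \text{ minutes}, \qquad 9379 / 150 \approx 63 \text{ minutes}. \]

A single careful reading of the policy thus plausibly takes between three quarters of an hour and a full hour.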

Facebook has abolished a governance mechanism that allowed users to vote on changes to the privacy policy and the terms of use. The formulation in the previous policy was: “Unless we make a change for legal or administrative reasons, or to correct an inaccurate statement, we will give you seven (7) days to provide us with comments on the change. If we receive more than 7000 comments concerning a particular change, we will put the change up for a vote. The vote will be binding on us if more than 30% of all active registered users as of the date of the notice vote”. In the new policy, this passage has been replaced by the following one: “Unless we make a change for legal or administrative reasons, or to correct an inaccurate statement, we will give you seven (7) days to provide us with comments on the change. After the comment period, if we adopt any changes, we will provide notice (for example, on the Facebook Site Governance Page or in this policy) of the effective date”.

If one interprets the Facebook governance system as a political system, then this means that election results were only accepted if voter turnout was higher than 30%. In elections to the European Parliament, voter turnout has in the past been lower than 30% in some European countries: 29.2% in Bulgaria (2007), 28.2% (2009) and 28.3% (2004) in the Czech Republic, 24.5% (2009) and 20.9% (2004) in Poland, 27.7% (2009) and 29.5% (2007) in Romania, 19.6% (2009) and 17.0% (2004) in Slovakia, 28.3% (2009) and 28.4% (2004) in Slovenia, and 24.0% (1999) in the United Kingdom (data source: International Institute for Democracy and Electoral Assistance, http://www.idea.int/vt/viewdata.cfm). Nonetheless, in those cases the outcome of the elections was accepted and these countries sent representatives to the European Parliament. Facebook’s practice was to decide itself which changes were adopted whenever voter turnout was low, i.e. it only allowed democracy if a certain turnout threshold was reached and practiced dictatorship in all other cases. If the old governance mechanism was a strange mix of dictatorship and democracy, then Facebook now leaves no doubt in the new policy that decision making on the part of users is not welcome. They are allowed to comment, but decision power is entirely controlled by Facebook itself. 668 872 users participated in the vote on the privacy policy change; 589 141 of them (88.1%) opposed the changes.

Facebook stated on its Site Governance page that this is less than 1% of all its users and that it is therefore changing the policy. It furthermore wrote: “We understand that many of you feel strongly about maintaining the participatory nature of our site governance process. We do too. We believe that having a meaningful dialogue with our community through our notice and comment process is core to that effort moving forward. We also plan to explore and implement new, innovative and effective ways to enhance this process in order to maximize user engagement”. What Facebook makes clear here is that it favours listening and talking without any significant decision power for users. Political communication in such a process becomes mere ideology: the dictator listens to citizens’ concerns, and they are allowed to voice an opinion, but they have no right to decide. The policy change shows that one should not have any illusions that capitalist companies have any significant interest in democracy.
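
For orientation, the two shares behind these figures can be made explicit, taking the approximately 1.04 billion users estimated in footnote [1] as the base:

\[ \frac{589\,141}{668\,872} \approx 88.1\%, \qquad \frac{668\,872}{1.04 \times 10^{9}} \approx 0.06\%. \]

Turnout was thus not merely below 1% of the user base; it was more than two orders of magnitude below the 30% threshold that the old policy required for a binding vote.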

The new privacy policy does not contain any surprises in the context of targeted advertising: just like the old one, it legitimates the use of profile data, browsing data and social network data for the purpose of targeting ads. This practice means that user data is commodified and that users create this commodity, i.e. they work without payment to produce a data commodity that is sold to advertisers, who then present ads to users:

* “Sometimes we get data from our affiliates or our advertising partners, customers and other third parties that helps us (or them) deliver ads, understand online activity, and generally make Facebook better. For example, an advertiser may tell us information about you (like how you responded to an ad on Facebook or on another site) in order to measure the effectiveness of – and improve the quality of – ads”.

* “We use the information we receive, including the information you provide at registration or add to your account or timeline, to deliver ads and to make them more relevant to you. This includes all of the things you share and do on Facebook, such as the Pages you like or key words from your stories, and the things we infer from your use of Facebook. Learn more at: https://www.facebook.com/help/?page=226611954016283

When an advertiser creates an ad, they are given the opportunity to choose their audience by location, demographics, likes, keywords, and any other information we receive or can tell about you and other users. For example, an advertiser can choose to target 18 to 35 year-old women who live in the United States and like basketball. An advertiser could also choose to target certain topics or keywords, like “music” or even people who like a particular song or artist”. Facebook exploits the digital labour of its users:

* “Sometimes we allow advertisers to target a category of user, like a ‘moviegoer’ or a ‘sci-fi fan’. We do this by bundling characteristics that we believe are related to the category. For example, if a person ‘likes’ the ‘Star Trek’ Page and mentions ‘Star Wars’ when they check into a movie theater, we may conclude that this person is likely to be a sci-fi fan. Advertisers of sci-fi movies, for example, could ask us to target ‘sci-fi fans’ and we would target that group, which may include you. Or if you ‘like’ Pages that are car-related and mention a particular car brand in a post, we might put you in the ‘potential car buyer’ category and let a car brand target to that group, which would include you”.

One change in the privacy policy is the addition of the following sentence to the section that covers “Personalized ads”: “If you indicate that you are interested in topics, such as by liking a Page, including topics such as products, brands, religion, health status, or political views, you may see ads related to those topics as well”. Religious and political views and health data are, according to the EU Data Protection Directive, sensitive data that must be processed with particular caution. In its new Data Use Policy, Facebook unambiguously makes clear that it uses sensitive data for targeted advertising. The EU Data Protection Directive says in this context: “1. Member States shall prohibit the processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life. 2. Paragraph 1 shall not apply where: (a) the data subject has given his explicit consent to the processing of those data” (Art. 8). Therefore Facebook is only allowed to use health data and political and religious views in the EU if users give “explicit consent”. So the question is whether clicking on a button that says that one accepts the terms of use and the privacy policy constitutes “explicit consent”.

The Article 29 Data Protection Working Party (2010) says that any “possible targeting of data subjects based on sensitive information opens the possibility of abuse. Furthermore, given the sensitivity of such information and the possible awkward situations which may arise if individuals receive advertising that reveals, for example, sexual preferences or political activity, offering/using interest categories that would reveal sensitive data should be discouraged” (19). The Article 29 Data Protection Working Party (2010, 20) also holds that “in no case would an opt-out consent mechanism meet the requirement of the law” and one would here require “separate prior opt-in consent” for the processing of sensitive data.

This means that there are doubts whether mentioning in a single sentence, within a policy of more than 9000 words, that sensitive data is used for targeted advertising, and telling users that they agree by clicking the “sign up” button, constitutes explicit consent. The Article 29 Data Protection Working Party’s opinion implies that the use of sensitive data for targeted advertising is in general problematic and violates users’ privacy.

So overall, how can Facebook’s new privacy policy best be characterised? By the exploitation of users, increased complexity, and the violation of privacy in the processing of sensitive personal data.

References

Article 29 Data Protection Working Party. 2010. Opinion 2/2010 on online behavioural advertising. Adopted on 22 June 2010. http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2010/wp171_en.pdf

Årnes, Atle, Jørgen Skorstad and Lars-Henrik Paarup Michelsen. 2011. Social network services and privacy. A case study of Facebook. Oslo: Datatilsynet.


[1] Facebook use within a 3-month period: 43.328% of all Internet users (alexa.com, accessed on December 14th, 2012); number of worldwide Internet users: 2 405 518 376 (internetworldstats.com, accessed on December 14th, 2012); resulting number of Facebook users: approx. 1.04 billion.
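
The footnote’s estimate follows from multiplying the two figures; this is a rough approximation, since Alexa’s reach metric and the Internet-user count come from different measurement methods:

\[ 0.43328 \times 2\,405\,518\,376 \approx 1.04 \times 10^{9}. \]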
