Facebook and Twitter's real sin goes beyond fake news

Facebook CEO Mark Zuckerberg. Several major U.S. tech companies have announced steps to rein in fabricated news. Bloomberg

Social media companies are taking heat for influencing the outcomes of the U.S. presidential election and Brexit referendum by allowing fake news, misinformation campaigns and hate speech to spread.

But Facebook and Twitter's real sin was an act of omission: they failed to contribute to the data that democracy needs to thrive. While sitting on huge troves of information about public opinion and voter intent, social media firms watched as U.S. and UK pollsters, journalists, politicians and civil society groups made bad projections and poor decisions with the wrong information.

The data these companies collect, for example, could have told us in real time whether fake news was having an impact on voters. Information garnered from social media platforms could have boosted voter turnout as citizens realised the race was closer than the polls showed - and that their votes really would matter. Instead, these companies let the United States and UK tumble into a democratic deficit, with political institutions starved of quality data on public opinion.

Legally, social media companies aren't obligated to share data in the public interest. And what they can share is always shaped by users' privacy settings, country-specific rules about selling personal information, and the particular deals companies like Facebook and Twitter make with third-party businesses. But they are now the primary platforms for political conversation. As such, they should act in ways that support democratic practices, especially around sensitive political moments like elections.

Facebook and Twitter have the ability to reach, and target, millions of voters. From the minute you sign up on one of these platforms, the companies use data about your behaviour, interests, family and friends to recommend news and new social connections. And they sell this data to other companies for even deeper analysis on what you might buy and what you think about important social issues.

By examining data about the connections you make and content you share, social media companies can make powerful inferences about whether you are likely to vote, how you are likely to vote, and what kinds of news or advertisements might encourage you to engage as a citizen - or discourage you from doing so.

Social media firms regularly study the news consumption habits of users, producing fine-grained analysis of the causes and consequences of political polarisation on their platforms. As a result, only Facebook and Twitter know how pervasive fabricated news stories and misinformation campaigns have become during referendums and elections. They know who clicked on what links, how much time each user spent reading an "article," and where the user was physically located.

If the companies merged user data with other datasets - say, from credit card records or voter registration files - they might even know the user's voting history and which political groups the user has donated to. These companies know enough about voter attitudes to serve up liberal news to liberals and conservative news to conservatives, or fake news to undecided voters.

During the recent U.S. presidential election, there was a worrying amount of false information on both Facebook and Twitter, and research suggests that many users can't distinguish between real and fabricated news. My own research on this "computational propaganda" shows that Facebook and Twitter can be easily used to poison political conversations. Trump campaigners were particularly good at using bots - basic software programs with communication skills - to propagate lies. Bogus news sites were started just to make money for their founders, but they undoubtedly influenced some voters' views when manipulated images and false reports went viral.

Several major U.S. tech companies have since announced steps to rein in fabricated news. In response to criticism about the spread of misinformation on Facebook, Mark Zuckerberg described in a post some of the projects the company already has underway, including making it easier for users to report fake news. Facebook has also updated its advertising policies to spell out that its ban on deceptive and misleading content applies to fake news. Google has said it is working to prevent websites that spread bogus news from using its advertising platform. But more can be done.

Polls miss the full picture

While social media use has been on the rise, our systems for measuring public opinion have been breaking down. Telephone- and internet-based surveys are increasingly inaccurate. With so many people on mobile phones, consuming political content that comes to them through friends, family and Facebook, traditional polling companies no longer get a full picture of what the public knows and wants.

For modern democracies to work, three kinds of polling systems need to be up and running. First, nationwide exit polls, which identify mistakes in how elections are run, helping to confirm or refute claims of fraud. For several decades exit polling was coordinated by major news outlets, but the coalition broke down in the United States in 2002 and in the UK in 2005. Today, exit polls are run haphazardly, and are more about predicting winners and outcomes than systematically checking the results.

Second, democracies need a regular supply of public policy polls so that journalists, public policy makers, civic groups and elected officials can understand public opinion before and after voting day.

Third, democracies need "deliberative polls" that put complex policy questions to representative groups of voters who are given time to evaluate the possible solutions. These kinds of polls engage citizens about public policy options through extended conversations with experts and each other. They lead to more informed decision-making.

Companies like Facebook and Twitter manage the platforms over which most citizens in advanced democracies now talk about politics, and they could become the critical new venues for these polling systems. They could never completely replace existing techniques for measuring public opinion. But our existing polling systems are weakening, and social media platforms have an obvious role to play.

With the data at their disposal and the platforms they maintain, social media companies could raise standards for civility by refusing to accept ad revenue for placing fake news. They could let others audit and understand the algorithms that determine who sees what on a platform. Just as important, they could be the platforms for doing better opinion, exit and deliberative polling.

This year, Facebook and Twitter watched as ways of measuring public opinion collapsed. Allowing fake news and computational propaganda to target specific voters is an act against democratic values. But withholding data about public opinion is the major crime against democracy.

Philip N. Howard is a professor of sociology, information and international affairs at Oxford University. He is the author, most recently, of "Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up."


Reuters