Has Facebook learned anything from Trump and Brexit?

The social platform launched new tools to curb false information influencing Germany's federal election, but did they work?

Facebook and Twitter invented the tools and wrote the rules of social media. But, after being gamed by dark forces with an eye on disrupting global politics, they're being forced to rip up and rewrite the playbook.

In response to the backlash around the spread of fake news, or more accurately false information, during Brexit and the US presidential election, Facebook has taken a different approach to the recent federal election in Germany. In a blog post, Richard Allan, Facebook's vice president of public policy for EMEA, listed the ways the social network was targeting the spread of fake news and disinformation and eliminating fake accounts from its service, which has two billion users.

Last week, Facebook CEO Mark Zuckerberg backtracked on what he had previously called a “crazy idea”: that Russian hackers had influenced the 2016 US presidential election through social media. In an abrupt volte-face, Zuckerberg now says his company is “taking steps to protect election integrity and make sure Facebook is a force for good in democracy”.

He explained that the “amount of problematic content we’ve found so far remains relatively small”. A week earlier, chief security officer Alex Stamos said the company had uncovered $100,000 (£74,380) in ad spending between June 2015 and May 2017 associated with 3,000 ads connected to 470 inauthentic accounts and pages that violated Facebook’s policies. These accounts, Stamos confirmed, were affiliated with each other and operated out of Russia.

In response, the US House Intelligence Committee has announced it will hold a public hearing on the matter of Russian election influence and has invited Facebook, Google and Twitter to testify.


Twitter has also come under fire for its role in allowing bot accounts to spread false information on divisive issues. In addition, there’s evidence of Russian-linked accounts that posed as Americans – highlighting the network’s struggles to get a grip on fake accounts. Around 600 Twitter accounts linked to Russian influence operations are being tracked by the Alliance for Securing Democracy, a bipartisan initiative of the German Marshall Fund, a public policy research group in Washington.

In the lead-up to the general election in the United Kingdom, Facebook and Google part-funded a team of fact checkers, economists and statisticians to thwart fake news – a move that followed the lessons of Brexit, when sites such as YourBrexit.uk were found to be spreading misinformation through the network and influencing undecided voters.

By the time of the German election, investigations into how Facebook, Twitter and others handled the infiltration of fake news, fake accounts, and fake advertising were in full swing. And, to an extent, Facebook has responded to this.

In his post, Allan wrote that the social network had seen good results in Germany from the new measures it had implemented. These included removing tens of thousands of fake accounts from Facebook in the month before the election and testing a new related articles feature to give people access to different perspectives.

But some will be left wondering how reliable the results from Germany really are. Germans tend to be more suspicious of social media than Americans – five out of ten get their news from Facebook, compared with eight out of ten in America. The country’s two public broadcasters and other leading news outlets took the same approach as the UK – assembling teams of fact checkers to examine and trace inflammatory material flagged by the public. Germany also has stricter laws regulating data privacy and political speech.

“These actions did not eliminate misinformation entirely in this election – but they did make it harder to spread, and less likely to appear in people’s news feeds. We learned a lot, and will continue to apply those lessons in other forthcoming elections,” Allan concluded.

While Facebook’s new rules attempt to solve the problem in ways that still encourage people to spend more time and money on Facebook, the company does seem to be learning, or at least acknowledging, its own vulnerabilities and influence. Time will tell how effective these measures will be, and the next US presidential election is only three years away.

This article was originally published by WIRED UK