For as long as human beings have told each other stories, fake news has existed; Roman emperors, Greek politicians and Egyptian pharaohs all disseminated fake news about their enemies. Leaders could choose from a variety of methods: commission a poem, deploy powerful rhetoric, erect a symbolic statue. As the saying goes, “a lie is halfway around the world before the truth has got its pants on”.
Figure 2 Statue of Augustus. The Roman emperor used fake news to help him legitimise his war against Antony and Cleopatra, recasting it from a civil war into a defence of the Republic.
Propaganda is a tried and tested tool in the arsenal of political campaigning and international cyber-warfare. The term “fake news” has been popularised over the last two years, with a 365% increase in usage from 2016 to 2017. Collins Dictionary made it their Word of the Year in 2017. Donald Trump’s presidential campaign flourished on the back of conspiracy theories and fake news generated and shared across social media in 2016. Hungary’s Viktor Orbán utilised disinformation around illegal immigration to help him secure another term as Prime Minister this year. In Myanmar, fake news was used to stoke tensions and incite violence against religious minorities. All three relied heavily on Facebook as a vehicle for spreading their message, ideas, and beliefs.
It’s not hard to see why. Every second, five new Facebook profiles are created, adding to the two billion already in existence. Every minute, 510,000 comments are posted, 293,000 statuses are updated, and 136,000 photos are uploaded. Every day, 4.75 billion pieces of content – news stories, articles, viral videos – are shared across the platform. The amount of raw data Facebook needs to process and regulate (filtering posts that break the law) each 24-hour cycle is vast. On top of this, Facebook personalises each newsfeed, weighing over 1,000 variables per post to make it as engaging as possible for the individual user. Eventually, around 300 posts are selected and placed in a user’s Newsfeed each day.
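To make the scale of that personalisation step concrete, here is a deliberately simplified sketch of the general idea: score each candidate post as a weighted sum over engagement signals, then keep the top N for the feed. The feature names, weights, and feed size below are invented for illustration; Facebook's actual ranking system is proprietary and far more complex.

```python
# Toy illustration only – NOT Facebook's actual algorithm.
# Each candidate post carries engagement features; we score them with
# a weighted sum and keep the highest-scoring posts for the feed.

def rank_feed(candidates, weights, feed_size=300):
    """Score each candidate post and return the top `feed_size` posts."""
    def score(post):
        # Weighted sum over whatever features this post has.
        return sum(weights.get(name, 0.0) * value
                   for name, value in post["features"].items())
    return sorted(candidates, key=score, reverse=True)[:feed_size]

# Hypothetical candidate posts with made-up feature values.
posts = [
    {"id": "a", "features": {"friend_affinity": 0.9, "recency": 0.2}},
    {"id": "b", "features": {"friend_affinity": 0.1, "recency": 0.8}},
    {"id": "c", "features": {"friend_affinity": 0.5, "recency": 0.5}},
]
weights = {"friend_affinity": 2.0, "recency": 1.0}

feed = rank_feed(posts, weights, feed_size=2)
print([p["id"] for p in feed])  # scores: a=2.0, c=1.5, b=1.0 → ['a', 'c']
```

In a real system each of the thousand-plus variables (who posted, when, past interactions, content type, and so on) would feed a learned model rather than a hand-set weight table, but the shape of the problem – score everything, surface a few hundred – is the same.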
Figure 3 Most popular fake election stories in the United States in 2016, by Facebook engagement (in thousands)
It took Trump’s Presidential campaign and allegations of Russian interference, along with incoming legislation in Europe threatening fines for extremist content, to jolt Facebook into action. The increasingly partisan nature of the American political landscape proved a fertile breeding ground for fake news articles to be shared, commented on and spread far and wide. More than two-thirds of American adults use Facebook to get at least some of their news, compared to a quarter of Brits. Furthermore, as Facebook tailored its algorithms to keep users on the site, it pushed and recommended related articles, regardless of the authenticity of the site it was taking them from. Conspiracy websites like Infowars, which continues to preach that the Sandy Hook massacre was staged, began to feature prominently in Newsfeeds after breaking events took place.
Figure 4 An example of a fake news story shared by far-right site Your News Wire
Clearly, Facebook had a problem. However, tackling the spread of fake news on the site raised ethical questions for management. Long accused of an anti-conservative leaning, the leadership team struggled with the ethics of determining what exactly counted as fake news, and whether it was their job to limit the freedom of expression of their users. Whilst ‘ranking’ news sites has been floated as the latest possible answer, the company seems to want to use this only as a last resort.
Facebook initially decided to tackle the spread of fake news articles across the site by introducing ‘Disputed Flags’. The idea was simple: when a user saw an article on their feed, it would have a small flag next to it. A red flag would indicate an article that had been disputed by at least two independent fact-checkers, telling the user that the article in question was likely to be fake news. In theory, this would stop users from being fooled by fake news sites posing as genuine established media companies. Unfortunately, Facebook failed to account for human nature. Red flags next to articles actually made users more likely to click on the link, and in some cases reinforced their pre-conceived beliefs.
Facebook next toyed with the idea of minimising attachments from untrustworthy sites in the newsfeed. Links shared by users from ‘untrustworthy’ sites would not include image previews, in theory making them less visually attractive and therefore less likely to be clicked on by someone’s Facebook friends. The jury is still out as to how successful this has been.
At the beginning of this month, Facebook entered the most recent chapter in its war against the spread of fake news: it eliminated the sidebar of trending articles. A long-contested feature, the sidebar accounted for only 1.5% of click-throughs to publishers but often contained inflammatory or inaccurate news reports. Past examples included strange headlines such as “DEEZ NUTS” and “LADY GAGA – Photos appear to show singer falling to ground while getting in her convertible”. In the States, conservatives launched a campaign against the trending sidebar, accusing it of a deep-seated liberal bias, a claim Facebook was forced to refute. In the United Kingdom, Nigel Farage accused the company of censoring conservative viewpoints, citing a drop in engagement on his page. In a statement, Facebook’s Head of News Products Alex Hardiman attributed the removal of the trending section to a change in consumer habits, saying: “We’ve seen that the way people consume news on Facebook is changing to be primarily on mobile and increasingly through news video. So we’re exploring new ways to help people stay informed about timely, breaking news that matters to them, while making sure the news they see on Facebook is from trustworthy and quality sources”. The company also released a cutesy video explaining how it would use new algorithms to try to predict which news stories a user wants to read, based on a variety of factors.
Perhaps the greatest hurdle when it comes to the spread and dissemination of fake news across the site is its complicit userbase. Facebook’s fastest-growing audience is the over-55s. Since 2012, this group has grown by 46%, while younger generations either leave the site or refuse to sign up in the first place. Unfortunately, older users tend to be the least internet-literate. Unlike younger, more internet-savvy generations, they haven’t grown up with instant access to information from across the world at their fingertips. A Reddit community called r/oldpeoplefacebook, which shares screenshots of older people’s internet mishaps, has a dedicated following of over half a million people.
Meet Daniel’s mum Lorna…
Facebook can continue to roll out measures aimed at reducing the spread of fake news across the site. It can minimise links and bump articles down the order to manipulate newsfeeds. But at the end of the day, the user base decides what content is or isn’t shared. Ultimately, as multiple studies have highlighted, users are more likely to share, interact with and react to stories that engage them emotionally. 62% of users will click on a news article shared by a friend on Facebook. Depressingly, a recent study found that 59% of people share stories without ever clicking beyond the headline. And if those stories happen to be poorly sourced or ill-judged fake news? So be it.
So what does this mean from a communications perspective? Primarily, two things. The first is that Facebook is now, more than ever, a pay-to-play platform for brands and companies. Long gone are the days of organic reach: even clickbait headlines are unlikely to be seen by current and potential followers unless you pay to promote your posts. The second is that Facebook is currently looking to prioritise human-to-human interaction. To this end, posts shared by other users will be far more likely to appear in their friends’ feeds than a post from a page they ‘liked’ two years ago. Every positive interaction with your page is therefore crucial – meaningful engagement and discussion that keeps people coming back will be rewarded in the newsfeed and keep surfacing near the top. Our advice? Quality over quantity is the key to success.