Published: Wed, January 24, 2018
Science | By Eileen Rhodes

Facebook admits it was too slow to stop spread of fake news

In response to mounting criticism, Facebook has enlisted third-party fact-checkers to help flag fake-news stories and plans to survey users on which news sources they trust.

In a series of blog posts by executives, the tech company critically reckoned with its impact on the global political process.

Samidh Chakrabarti, Facebook's civic-engagement product manager, expressed the company's regrets about the 2016 US election, during which, by Facebook's own count, Russian agents created 80,000 posts that reached around 126 million people over two years.

Facebook's global politics and government outreach director Katie Harbath also acknowledged the potential harm the platform could cause when left unchecked.

Chakrabarti also cites the fact that "two-thirds of United States adults consume at least some of their news on social media" as a positive, though given Facebook's general track record as a news source, that seems like a bit of a stretch.

Dr Aly also warned that it was not the role of social media companies to decide what the truth was, saying education was critical in empowering regular people to be discerning and analytical in their consumption of information online.

Chakrabarti pointed to Facebook's pledge last year to identify the backers of political advertisements, while also stressing the need to tread carefully, citing the example of rights activists who could be endangered if publicly identified on social media. Facebook CEO Mark Zuckerberg originally pledged to do more to protect "election integrity", but later defended his company on several occasions, arguing that the number of ads was minuscule relative to the overall digital ad market.

While personalization initially sounds like a positive effort, Sunstein noted, feeding people a single perspective has the potential to be extremely risky, even leading to extremist viewpoints and group polarization.

"It's abhorrent to us that a nation-state used our platform to wage a cyberwar meant to divide society", Chakrabarti wrote. "Unplanned, unanticipated encounters are central to democracy itself". The company also asked outside contributors to take on a broader, and perhaps more blunt, question: Is social media good or bad for democracy?

Facebook also talked openly about the challenge of working with government leaders who harass their own citizens - the subject of recent Bloomberg investigations. "At its worst, it allows people to spread misinformation and corrode democracy", it adds. "That's why we have a moral duty to understand how these technologies are being used and what can be done to make communities like Facebook as representative, civil and trustworthy as possible". Yet while social media may encourage group polarization, Sunstein wrote, "on balance, they are not merely good; they are terrific". Facebook's stated role is to ensure that the good outweighs the forces that can compromise healthy discourse.

Facebook, which is striving to rid its platform of fake news and echo chambers, has conceded that it cannot guarantee social media is not harmful to democracy.