Facebook sets up a ‘war room’ to battle misinformation ahead of elections: all you need to know

By Elton Gomes

Taking concrete steps to prevent the misuse of its platform during elections, Facebook has set up a ‘war room’ to curb the spread of potentially harmful content.

Facebook has faced serious criticism for being unable to control content on its platform. The social network has been questioned in particular for its failure to prevent Russia-linked accounts from spreading misinformation during the 2016 US presidential election.

By launching the ‘war room’ at its headquarters in Menlo Park, California, Facebook aims to fight election interference on its platform.

What is Facebook’s war room, and what does it aim to do?

Facebook’s war room is the nerve centre of the company’s fight against misinformation and against manipulation of the world’s largest social network by foreign actors attempting to rig elections in the United States and elsewhere.

The war room itself is simply a conference room. Its walls carry clocks showing the time in various regions of the US and Brazil, and the room is equipped with maps and TV screens showing CNN, Fox News, and Twitter, as well as other monitors displaying graphs of Facebook activity in real time.

“Our job is to detect … anyone trying to manipulate the public debate,” Nathaniel Gleicher told AFP. Gleicher heads Facebook’s cybersecurity policy and previously served as a policy director at the White House’s National Security Council.

“The War Room has over two dozen experts from across the company — including from our threat intelligence, data science, software engineering, research, community operations and legal teams,” Samidh Chakrabarti, Facebook’s director of product management for civic engagement, said in a statement, IANS reported. Chakrabarti added, “These employees represent and are supported by the more than 20,000 people working on safety and security across Facebook.”

Facebook said that its dashboards offer real-time monitoring of key election issues, such as efforts to prevent people from voting, spikes in spam, potential foreign interference, and reports of content that violates its policies.

The war room team also monitors news coverage and election-related activity across other social media channels and traditional media to anticipate what type of content could go viral.

If the team detects anything untoward, the issue is placed on a “situation board”. Data scientists then investigate the problem before handing it over to operations specialists, who determine how Facebook’s rules, including those against hate speech, false news, and spam, should be applied.

Facebook to delete posts spreading voter misinformation

In another crackdown on misinformation, Facebook recently announced that it will ban fake content that has been designed to suppress voter turnout.

Specifically, Facebook said it would remove posts that encourage “voter suppression”, which covers anything that might deter or prevent people from voting. Posts that imply others shouldn’t vote, or that attempt to feed voters incorrect information about the voting process in their region, fall into that category.

Jessica Leinwand, Facebook’s public policy manager, explained that basic misinformation about voting booths, voting dates, and the like is already against the site’s rules; Facebook has simply expanded those rules to cover more forms of misinformation.

The war room is Facebook’s best effort to weed out misinformation

The war room, which will see a spike in activity around the November 6 US midterm elections, appears to be the most concrete step Facebook has taken to restrict misinformation. Staffed with experts in computer science, cybersecurity, and law, the war room is eventually expected to operate 24/7.

The company said that by bringing these experts together in one room, each representing a larger team and coordinating the response jointly, it was able to address problems within a couple of hours rather than several days.

“We were all delighted to see how efficient we were able to be,” Chakrabarti told Bloomberg.

Elton Gomes is a staff writer at Qrius
