Facebook unveiled its plan Tuesday to combat voter misinformation and foreign interference ahead of the 2022 midterms, which includes prohibiting ads that question the integrity of elections or encourage people not to vote.
The company said it spent $5 billion on security last year and has added hundreds of employees whose sole focus will be safety and security on the platform.
The measures in place for this year’s election will be similar to the 2020 presidential election, Meta President of Global Affairs Nick Clegg said in a blog post. Meta is Facebook's parent company.
Clegg wrote that Facebook is constantly investing in online election security – not just as Election Day nears.
"With each major election around the world — including national elections this year in France and the Philippines — we incorporate the lessons we learn to help stay ahead of emerging threats," he wrote.
Facebook has been criticized by all sides in the past for its handling of elections – some accusing it of being too hands-off and others of censoring free speech.
Last fall, Facebook whistleblower Frances Haugen testified to Congress that the platform is harmful to children, divisive and undermines democracy.
Facebook founder and CEO Mark Zuckerberg has also testified before Congress more than once about election integrity and other issues on the platform, and the company has recently been criticized for failing to stop misinformation during Brazil’s elections.
Clegg said the company is using advanced security operations to fight influence campaigns and that it also has independent fact-checkers, transparency measures for advertising and other content, and new measures to keep poll workers safe.
"As we did in 2020, we have a dedicated team in place to combat election and voter interference while also helping people get reliable information about when and how to vote," he said.
He said the company has stopped dozens of groups that have tried to interfere in elections and removed millions of pieces of hateful content globally, along with more than 270 white supremacist groups.
"Of the content we removed, nearly 97% of it was found by our systems before someone reported it," he said. "We’re also investing in proactive threat detection and expanding our policies to help address coordinated harassment and threats of violence against election officials and poll workers."
He added that, just as in 2020, the company will ban new political ads in the last week before an election.
"To simplify the process from the last cycle, any edits related to creative, placement, targeting and optimization won’t be permitted," he said. "Our rationale for this restriction period remains the same as 2020: in the final days of an election, we recognize there may not be enough time to contest new claims made in ads. This restriction period will lift the day after the election and we have no plans to extend it."
He said Meta would, if necessary, apply "labels that connect people with reliable information" to content that discusses the integrity of an election, but noted that users complained the labels were overused in 2020, "so in the event that we do need to deploy them this time round our intention is to do so in a targeted and strategic way."