As voters cast ballots in person across the United States on Tuesday, Facebook promised to monitor misinformation and manipulation on election day in real time.
Like other social platforms, the company has pledged to curb misinformation during the election, including premature declarations of victory, in an effort to avoid a repeat of the manipulation seen in 2016.
“Our Election Operations Centre will continue monitoring a range of issues in real time — including reports of voter suppression content,” said a Facebook statement posted on Twitter.
“If we see attempts to suppress participation, intimidate voters, or organise to do so, the content will be removed.”
Facebook said its election center was also tracking other incidents, such as supporters of President Donald Trump surrounding a campaign bus for Democrat Joe Biden.
“We are monitoring closely and will remove content calling for coordinated harm or interference with anyone’s ability to vote,” Facebook said.
Facebook reiterated that it would place warning labels on any posts that claim victory prematurely.
“If a presidential candidate or party declares premature victory, we will add more specific information in the labels on candidate posts, add more specific information in the top-of-feed notifications and continue showing the latest results in our Voting Information Center,” the social giant said.
At the same time, Twitter added a warning label late Monday to a Trump tweet that spread misleading information.
The post claimed that slower vote counting in the battleground state of Pennsylvania could lead to “rampant and unrestricted cheating”.
“It will also induce violence in the streets. Something must be done!” he tweeted.
Both Facebook and Twitter have taken multiple steps to stop the flow of false and misleading election information, but have encountered glitches and loopholes in enforcing their policies.
On Monday, Twitter Inc outlined a plan for placing warning labels on tweets from US election candidates and campaigns that claim victory ahead of official results.
The move came as the social network braced for what it has called an unusual election due to a high number of mail-in ballots that may cause a delay in final results.
From election night through the inauguration, Twitter said it would apply warning labels such as “official sources called this election differently” or “official sources may not have called the race when this was tweeted”.
US-based accounts with more than 100,000 followers and significant engagement will also be considered for labeling, Twitter said.
Google-owned YouTube has also sought to limit the sharing of videos with election misinformation.
Social media companies are under pressure to combat election-related misinformation and to prepare for the possibility of violence or polling place intimidation around the November vote.