Facebook's doomsday scenario for the US presidential election is now becoming clear. Last-minute political ads spread disinformation in the days before the vote; then posts casting doubt on the election result go viral, reaching millions of people.
Hit fast forward on this scenario and it's not difficult to see Facebook being used to pour fuel on the fire, amplifying violent protests across America in the days immediately following the election.
The social network is hoping to avoid the threat through a series of new measures announced on Thursday. Taken together, they aim to reduce the potential problems that may crop up before votes are cast on November 3.
Facebook will ban all new political adverts during the final week of the election campaign in an effort to buy the social network time to check them before voting day.
It will also place notices on posts claiming early victories or suggesting that the result of the election is not legitimate. And any posts which spread misinformation about the voting process will be removed.
Facebook will also introduce a new limit preventing people from forwarding Messenger messages to many people at once, in a bid to slow the spread of misleading content.
"With our nation so divided and election results potentially taking days or even weeks to be finalized, there could be an increased risk of civil unrest across the country," Facebook chief executive Mark Zuckerberg wrote in a post announcing the changes.
Experts say Facebook's new steps will go some way toward reducing the possibility that the social network becomes a prime driver of unrest around the time of the election.
"The limiting of forwarding on Messenger could have a significant impact on the scale and speed at which disinformation travels through closed spaces," says Chloe Colliver, the head of digital policy and strategy at the Institute for Strategic Dialogue (ISD).
But there are concerns that Facebook's new steps don't go far enough. Justin Hendrix, the executive director of NYC Media Lab, cited the growth of the QAnon conspiracy theory on Facebook in an article in which he said Facebook's changes are "too little, too late."
"There is cause for great concern that news of these efforts are less of a reason to be optimistic about what is going to unfold, and more of a sign of the chaos to come," he wrote.
Damian Collins, the former chairman of the Digital, Culture, Media and Sport select committee, calls Facebook's new measures a "gesture" but wants the social network to do more.
"I think it misses the problem," he says. "The problem is not that there are political ads on Facebook, it's the way Facebook's targeting tools can be used and they allow microtargeting of political ads at voters in a way that other platforms don't."
The limits on political advertising have been welcomed by experts, but there's a risk that a focus on these paid posts ignores content that goes viral without any money spent promoting it, such as posts related to the QAnon conspiracy theory.
"The problems on Facebook aren't just down to political ads that people spend money on," Collins says. "As we consistently see, it's also networks of groups and what goes on inside these Facebook groups where we often don't know who's running them or what they're doing in them."
Zuckerberg acknowledged this issue in his post announcing the restrictions, specifically naming the QAnon conspiracy theory as a problem for the social network and promising an increase in enforcement against these groups in the coming months.
There are also concerns that Facebook's new policies only cover the period up to voting day. If Zuckerberg is so worried about candidates disputing the election result, why not continue to enforce these policies throughout November?
"The period following election day is rife for manipulation on the platform, as things stand," Colliver says. "Civic integrity policies need to be extended after the election to help protect from the very real risks of coordinated disinformation about the results and the potential for election-related violence."
Facebook's new actions are a step in the right direction, experts agree, but concerns remain that the company is doing too little to protect elections.
The same problem surfaced in June when Facebook announced stricter rules on political advertisements ahead of the US election.
For many critics of Facebook, the social network will need to make far more dramatic changes.
"They need a proper fix for the way in which misleading speech and hate speech can proliferate around political events on the platform," Collins says. "It should be a permanent policy change rather than just a gesture of stopping political ads a week before the election."