His account, according to the company, will remain suspended until 7 January 2023, and he will be reinstated only if the “risk to public safety has receded.” Even if the bombastic ex-president does return, he will face “a strict set of rapidly escalating sanctions” if he breaks further rules.
Mark Zuckerberg said at the time his dramatic action against the then-president was essential to prevent more political violence ahead of Joe Biden’s inauguration.
It was the most high-profile ban in the company’s history and a move that was quickly followed by other social media giants.
The ban was welcomed as long overdue by critics of the president and was blasted by conservatives as the ultimate example of Big Tech political “censorship.”
Facebook’s independent Oversight Board, a body that scrutinises high-profile decisions about its users, upheld the initial suspension of Mr Trump in May, but criticised the company for leaving the ban open-ended, which prompted Mr Zuckerberg to introduce fixed time periods for suspensions.
The board is an outside group created by Facebook to review the company’s most complex content decisions.
Facebook referred Mr Trump’s case to the board on 21 January, and though it normally has a 90-day period to reach its decision, on 16 April it announced an extension was necessary.
The board received 9,000 public comments before deciding to keep Mr Trump’s ban in place.
The Oversight Board chooses a panel of five of its 20 members to judge each case, with only a simple majority required.
The panel then takes its decision back to the full board, which can overrule it if a majority of members disagree with the panel’s decision.
According to the board’s bylaws, its decision on the ban, which requires a simple majority, is binding and cannot be challenged by the company.
Mr Trump will also not be able to appeal the board’s decision if it goes against him.
The board can also make recommendations and policy suggestions about the treatment of world leaders and politicians on the platform, but these are not binding and the company is not obliged to act on them.
A written version of the board’s decision will be posted to its website that will include a full explanation of what was considered, how the decision was reached and the public comments.
Facebook’s decision to ban the twice-impeached president was the culmination of six years of struggling to deal with the controversial businessman-turned-politician.
In 2015, when he was still only a candidate for the Republican presidential nomination, Mr Trump posted a video calling for a ban on Muslims entering the US.
Facebook refused to take it down and instead created a “newsworthiness” policy that allowed posts breaking its own guidelines to remain up because of public interest.
During 2020 Mr Trump used Facebook to post a string of misleading information about Covid-19 and attacked racial justice protesters as “thugs.”
With criticism of the company’s approach to Mr Trump deepening, Facebook announced in June 2020 that it would put labels on posts that violated hate speech and other policies, even if they were from politicians such as Mr Trump.
It also committed to removing any posts that incited violence or voter suppression, regardless of newsworthiness.
But it was Mr Trump’s response to his own supporters attacking the Capitol on 6 January, in an attempt to block the certification of Joe Biden’s election victory, that finally brought about his ban.
Five people, including a police officer, died in the MAGA riot, and Mr Trump eventually posted a video on Facebook and Instagram urging his supporters to leave the Capitol and go home.
But infamously Mr Trump also took the opportunity to tell them: “We love you, you’re very special.”
After the video was posted, Facebook immediately suspended Mr Trump for 24 hours, and Mr Zuckerberg announced on 7 January that the ban was indefinite.
“We believe the risks of allowing the President to continue to use our service during this period are simply too great,” he said.
And two weeks later the company announced that the Oversight Board would make the final decision in the case.
“Many argue private companies like Facebook shouldn’t be making these big decisions on their own. We agree,” wrote Nick Clegg, VP of Global Affairs, in a statement at the time.
“Every day, Facebook makes decisions about whether content is harmful, and these decisions are made according to Community Standards we have developed over many years.
“It would be better if these decisions were made according to frameworks agreed by democratically accountable lawmakers.
“But in the absence of such laws, there are decisions that we cannot duck.”