Facebook’s ‘Supreme Court’ tackles nudity, Nazi quotes, and Covid misinformation in first cases
Facebook’s Oversight Board, which was established to review the social media giant’s moderation decisions, has accepted its first cases.
“More than 20,000 cases were referred to the Oversight Board following the opening of user appeals in October 2020”, the board said in its announcement.
“As the Board cannot hear every appeal, we are prioritising cases that have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook’s policies.”
The six appeals, five of which were referred by users, focus on the company’s policies on hate speech, adult nudity, and dangerous individuals and organisations.
These include images of dead children posted alongside criticism of China for its treatment of Uyghur Muslims, an image posted on Instagram of female breasts intended to raise awareness of the signs of breast cancer, and a quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, used to criticise the Trump administration.
Each case will be assigned to a five-member panel that includes at least one person from the region the content came from. The board will deliberate on the case, and Facebook will act on its decision within 90 days.
The case that Facebook submitted to the board was a video criticising French health officials for not authorising hydroxychloroquine as a cure for the coronavirus, which was viewed 50,000 times and shared 1,000 times.
Facebook removed the video for violating its policy on violence and incitement, and referred it to the board as “an example of the challenges faced when addressing the risk of offline harm that can be caused by misinformation about the COVID-19 pandemic.”
The Oversight Board, which CEO Mark Zuckerberg has compared to a Supreme Court for the social media site, has the power to overrule decisions made by Facebook about content moderation, as well as influence new policy.
Board members serve terms of no longer than three years, and the current membership includes journalists, federal judges, law professors, and the former Prime Minister of Denmark, Helle Thorning-Schmidt.
The development of Facebook’s Oversight Board comes as the company has been repeatedly criticised for its moderation policies.
The company’s algorithm was found to be “actively recommending” Holocaust denial and fascism, according to research from the Institute for Strategic Dialogue (ISD), and misinformation from President Donald Trump became the most popular content on the site despite its attempts to steer users towards more reputable sources of information.
One former Facebook employee, Sophie Zhang, also said the company had ignored evidence that fake accounts on its platform were disrupting political events around the world.
“In the three years I’ve spent at Facebook, I’ve found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry, and caused international news on multiple occasions,” wrote Zhang. “I know that I have blood on my hands by now.”
Another Facebook engineer previously resigned, claiming that the company was “profiting off hate in the US and globally” because of inaction against violent hate groups and far-right militias using Facebook to recruit members.
“We don’t benefit from hate,” a Facebook spokesperson told The Independent in a statement at the time.