The independent body set up by Facebook to review decisions to remove content will eventually need to tackle the issue of how to address politicians such as Donald Trump posting disinformation online, the board’s sole Australian member has said.
The oversight board is a body set up by Facebook to review select cases where Facebook’s content moderators have removed a post, but the user believes it should not have been removed. The board makes binding decisions for Facebook, meaning a post that was removed could be restored if the board says it should be.
“The decision we make isn’t going to just dispose of that particular case in front of us, it’s [also] going to help us to require Facebook to change either its policy or its enforcement processes more generally,” Nic Suzor, the sole Australian member of the oversight board, told Guardian Australia.
This month, the 20-member board – made up of academics, lawyers, politicians and journalists from across the world – announced the first six cases it would review over the next 90 days.
One of the cases involves a two-year-old post of an alleged quote from Joseph Goebbels, the propaganda minister of Nazi Germany. The post, which spells out the need to appeal to emotions and instincts rather than intellect, and asserts the unimportance of truth, was removed by Facebook for violating its policy on dangerous individuals and organisations. The user who reshared the post has appealed on the grounds that Trump was, in their view, following a similar fascist model.
The board’s remit is limited to content that has been removed by Facebook. Suzor admitted this is likely to be problematic when it comes to addressing disinformation posted by politicians, such as US president Donald Trump posting false information about election fraud. Currently Facebook puts warning labels on this kind of post, rather than removing it.
“The only way that we can handle cases where Facebook has decided not to remove something is if Facebook refers it to us,” Suzor said.
“Clearly everyone’s watching this everywhere … We’re not blind to the fact that political disinformation and hate speech is an important issue. We’re not going to shy away from that.”
Another case the board will review involves a user in Brazil sharing a photo raising awareness about breast cancer, with images including visible female nipples, which Facebook removed for violating its policy on adult nudity.
A third case involves a user posting a video about an alleged scandal involving the French health regulator purportedly refusing authorisation for use of hydroxychloroquine and azithromycin against Covid-19, but authorising promotional material for remdesivir.
The post was removed for violating its policy on violence and incitement, with Facebook presenting the case to the oversight board as an example of the complexity around dealing with disinformation being shared during the Covid-19 pandemic.
Suzor said case selection will be key to the work the board does, and that requires not only looking for patterns in the kind of cases referred to the board, but also working with nongovernment organisations and experts around the world to identify issues in areas where the board might not have a particular insight.
“The case selection process really has to pay attention to issues that are coming up in different regions, in different languages, all of that, affecting different segments of populations,” he said.
“I think that’s something we’re still figuring out how to do, to be honest.”
Five members of the board select the six cases to be reviewed from around 20,000 applications. The remaining 15 members are then divided into three groups, and each group has two cases to review.
At the start of the next round, a new panel of five is selected, along with new groups.
Users will be able to explain their intentions behind the post as part of the appeal process, and Suzor said that is something he believed would be critical to reviewing content rules on Facebook.
“Intention matters, and that’s something that Facebook isn’t good at figuring out. They’re not good with context or intention.”
Another limitation of the oversight board is that Facebook will not refer cases if the content was removed in compliance with the law of the country in which the post was made – meaning more complex issues over free speech in parts of the world such as Vietnam will not be subject to review by the board.
Emily Bell, director of the Tow Center for Digital Journalism at Columbia Journalism School, wrote this month that if the board needed three months to decide whether a nipple should stay on the site, then the board’s hope of having an impact may be “doomed from the outset”. The board should be pushing the boundaries of its remit, she said, while Facebook also should recognise that content moderation is the core of its business.
Suzor said it was still early days and the board is still finding its feet.
“We’ve got to do much better to communicate with audiences around the world who are seriously affected by the way that social media platforms are governed,” he said.
“We’re not going to be perfect at that straight away. We’re not going to be perfect at making decisions and selecting cases straight away, or ever, but we try to be as transparent as we can be.
“This is an experiment, it’s never been tried before and we’re unlikely to find the best way to do everything the first time … [it’s going to] be a bit of a process to keep getting better if we’re going to ultimately be effective.”
Facebook’s announcement of the oversight board has come at a time when the focus has been on the company’s outsourcing of moderation to low-paid workers, who have to review some of the most extreme, often graphic content. Former moderators who have spoken out have said they have received little support from their employer or Facebook.
In May, Facebook agreed to pay $52m in a settlement with moderators who claimed the company did not do enough to protect them from the mental health impacts of the job.