Meta’s internal oversight board slammed the company’s policies that give VIP users – including celebrities, politicians and business partners – preferential treatment on Facebook and Instagram.
If you’re a regular user of either platform, your speech is subject to the tech giant’s frequently controversial content moderation policies. However, if your name is Donald Trump or Kim Kardashian or you simply have a very high follower count, you have more leeway to share and say things that violate the rules.
Known as cross-check, the internal program at Facebook and Instagram protects celebrities and other high-profile users from having their content automatically taken down by the company’s algorithms.
Disclosures from whistleblower Frances Haugen, who testified before Congress in detail about the program, appear to have informed the oversight board’s assessment. Haugen has said the firm chooses ‘profits over safety.’
‘The board is concerned about how Meta has prioritized business interests in content moderation,’ the report stated. The program, it said, ‘provided extra protection for the expression of certain users.’
When the oversight board began its inquiry into the cross-check program, Meta was performing an astounding 100 million enforcement attempts on content each day.
Therefore, even if the firm were able to make such decisions with 99 percent accuracy – an extraordinarily high standard – it would still make one million mistakes per day.
Among the board’s key findings, which are detailed in a 57-page report, is that content breaking Meta’s own rules will often be left up for more than five days when the user posting it is a VIP.
The company led by Mark Zuckerberg currently does not provide much transparency to the public around how cross-check works.
‘Currently, Meta does not inform users that they are on cross-check lists and does not publicly share its procedures for creating and auditing these lists,’ the board wrote in a summary of its work, which began in October 2021.
‘It is unclear, for example, whether entities that continuously post violating content are kept on cross-check lists based on their profile.’
The board recommends publicly marking the pages and accounts of all entities receiving list-based protections at Meta in the following categories: ‘all state actors and political candidates, all business partners, all media actors, and all other public figures included because of the commercial benefit to the company.’
The board also wrote that Meta’s cross-check systems operate with a ‘consistent backlog of cases.’
‘Meta told the Board that, on average, it can take more than five days to reach a decision on content from users on its cross-check lists,’ the oversight group noted. ‘This means that, because of cross-check, content identified as breaking Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm.’
In many instances, that delay has negative real-life consequences.
For example, in 2019, Brazilian soccer star Neymar posted a video showing nude photos of a woman who had accused him of sexual assault.
Due to the cross-check program, the post was left up for more than a day and received more than 100 million views before it was ultimately removed.
Haugen said the company had lied to the oversight board ‘repeatedly’ during the case of former President Donald Trump – who was ultimately suspended from Facebook – according to the Wall Street Journal.
In its report, the board asks why the athlete was not suspended and also notes that the incident only came to light as a result of Haugen’s disclosures.
In total, the board recommended 32 different actions and gave Meta 90 days to respond. However, since the board is advisory, the company is under no obligation to implement any of its suggestions.