
Facebook's own oversight board slams its special program for VIPs

Facebook parent Meta appears to be more concerned with avoiding "provoking" VIPs than balancing tricky questions of free speech and safety, its oversight board said.
Josh Edelson / AFP via Getty Images

A Facebook and Instagram program that gives celebrities, politicians and other high-profile users special treatment does more for parent company Meta's business interests than its stated purpose of protecting users' free expression rights, according to Meta's oversight board.

"Meta's cross-check program prioritizes influential and powerful users of commercial value to Meta and as structured does not meet Meta's human rights responsibilities and company values, with profound implications for users and global civil society," Thomas Hughes, oversight board director, said in a statement.

The board said Meta appeared to be more concerned with avoiding "provoking" VIPs and evading accusations of censorship than balancing tricky questions of free speech and safety.

It called for the overhaul of the "flawed" program in a report on Tuesday that included wide-ranging recommendations to bring the program in line with international principles and Meta's own stated values.

Meta said in a statement it would review and respond to the board's recommendations – which are not binding – within 90 days.

The report comes more than a year after Meta asked the board – a group of experts in law, human rights and journalism from around the world, which the company convened and funded through an independent trust – to review the program, known as "cross-check," which had been highlighted in a whistleblower's revelations in the Wall Street Journal.

Under the program, Meta maintains a list of users including politicians, celebrities, and business partners – such as advertisers, health organizations and news publishers – who are eligible for extra review if their posts are suspected of breaking the company's rules against violence, hate speech, misinformation, nudity and other subjects. In some cases, their posts are exempted from Meta's rules entirely.

The documents supplied to the Wall Street Journal revealed former President Donald Trump and his son Donald Trump Jr., Democratic Senator Elizabeth Warren of Massachusetts, conservative activist Candace Owens, Brazilian soccer star Neymar and even Meta CEO Mark Zuckerberg to be among those on the VIP list.

Board finds "cross-check" purports to protect the vulnerable, but in practice benefits the powerful

Meta says the program is intended to address a thorny problem.

Facebook and Instagram users create billions of posts every day, which means the company depends on a combination of human reviewers and automated systems to find and remove content that breaks its rules. But given that scale, inevitably some posts are erroneously found to violate policies – what the company refers to as "false positives."

Cross-check is intended to reduce false positives in cases where the risk and potential harm of an error is greatest.

"The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk for a mistake," Nick Clegg, Meta's vice president of global affairs, wrote last year when the company asked the board to weigh in. As examples, he cited activists raising awareness of violence, journalists reporting from conflict zones, and posts from "high-visibility" pages or profiles likely to be seen by a lot of people.

But the board found in its report that "while Meta characterizes cross-check as a program to protect vulnerable and important voices, it appears to be more directly structured and calibrated to satisfy business concerns."

For example, the board said based on its examination, a key rationale for how Meta reviews posts from users on the cross-check list "is to avoid provoking people" who might complain to senior executives or "create public controversy" for Meta.

"Correlating highest priority within cross-check to concerns about managing business relationships suggests that the consequences that Meta wishes to avoid are primarily business-related and not human rights-related," the report said.

In addition, the company appears to prioritize avoiding the "perception of censorship" over its other human rights responsibilities, the board said.

Fundamentally, the board concluded the cross-check program treats users unequally, despite Meta's statements that everyone has to follow the same rules.

"Cross-check grants certain users greater protection than others," the report said.

What's more, the board said, the program overrepresents users and content from the U.S. and Canada – markets where Meta earns the most revenue per user – even though the vast majority of Facebook's nearly 3 billion monthly users live elsewhere.

"Through the design of cross-check, users in lucrative markets with heightened risk of public relations implications for Meta enjoy greater entitlement to protection for their content and expression than those elsewhere," the board said.

That inequality is exacerbated by the lack of transparency about who is in the program. The board said Meta wouldn't share which profiles, pages and accounts – which the company calls "entities" – are included in cross-check, citing legal obligations to protect user privacy.

"The Board cannot fully assess the degree to which the company is meeting its human rights responsibilities under the program or the profile of the entities that are guaranteed enhanced review if it does not know how the program is being implemented and precisely who benefits from it," the report said.

The board noted that in the past year, Meta has expanded cross-check to review content that might be at high risk of being incorrectly removed, even if the people posting it are not included in the cross-check lists.

But it said the company's limited capacity to review content meant many of those posts never get the same level of additional review as those from high-profile users in the program.

56 million views before soccer star Neymar's rule-breaking post is taken down

The report also slammed Meta's policy of leaving potentially rule-breaking posts from high-profile users visible while they are being reviewed. On average, cross-check reviews take more than five days. In the longest case Meta shared with the board, it took seven months.

"This means that, because of cross-check, content identified as breaking Meta's rules is left up on Facebook and Instagram when it is most viral and could cause harm," the board wrote.

One notorious example the board cited was a video posted by Neymar in 2019 that included photos of a woman who had accused him of rape. The video was viewed 56 million times on Facebook and Instagram before being removed, the Wall Street Journal reported.

The board blamed inadequate resources for reviewing posts flagged under the cross-check program, as well as Meta's failure to act urgently on potential "high severity" violations.

"In the Neymar case, it is difficult to understand how non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any system of prioritization had been in place," the report said.

"Given the serious nature of the policy violation and the impact on the victim, this case highlights the need for Meta to adopt different approaches for content pending review and shorten review timelines," it said.

Furthermore, the board found Meta didn't apply its usual rules to Neymar after the incident.

"The company ultimately disclosed that the only consequence was content removal, and that the normal penalty would have been account disabling," the report said. It noted that in December 2021, Meta announced an exclusive streaming deal with the soccer star.

Board calls for more transparency

Finally, the board said, Meta does not track metrics that would show whether the additional layers of review provided by the cross-check program result in more accurate calls on whether a post should stay up or come down than the company's normal enforcement process.

"Meta is not considering whether the ultimate decisions are the correct decisions," the report said.

The board gave 32 recommendations for how Meta should revamp the program to address the flaws it found and live up to its billing as protecting human rights.

That includes making public the criteria for inclusion in the program, letting people apply to be part of it, and publicly labeling some accounts, including government officials, political candidates, and business partners.

The company should also remove accounts from cross-check if they repeatedly violate the rules, and put more resources into reviewing content.

"Meta has a responsibility to address its content moderation challenges in ways that benefit all users and not just a select few," the report said.

Meta said it has improved the cross-check system in the past year, including by developing "standardized" principles and governance criteria, limiting the number of employees who can add users to the program, and creating a process to review and remove users.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.