In 1st Big Test, Oversight Board Says Facebook, Not Trump, Is The Problem

May 7, 2021
Originally published on May 10, 2021 1:37 am

Facebook has almost 2 billion daily users, annual revenue that rivals some countries' gross domestic product, and even its own version of a Supreme Court: the Oversight Board, which the company created to review its toughest decisions on what people can post on its platforms.

This week, the board faced its biggest test to date when it ruled on whether Facebook should let former President Donald Trump back on its social network.

The board upheld the company's decision to remove Trump after the Jan. 6 insurrection at the U.S. Capitol — finding he had broken Facebook's rules about praising violence — but it criticized the indefinite suspension and kicked the case back to the company either to ban Trump permanently or set a time frame for when he can return.

Former Danish Prime Minister Helle Thorning-Schmidt, a board co-chair, even called the company "a bit lazy" for failing to set a specific penalty in the first place.

Facebook said it's now considering the ruling and will determine a "clear and proportionate" action.

The board's response in this case may have been more than Facebook was counting on when it set up the advisory body. But the decision — and the public response to it this week — reveals just how big a challenge Facebook's scale and power present to anyone who wants to hold the company to account.

"They can't invent penalties as they go along"

In many respects, the decision the board handed down is more about Facebook than it is about Trump.

The board zeroed in on something critics have said for a long time: The way Facebook enforces its rules can seem arbitrary. It's often unclear what rules are being applied and why.

When it came to Trump, the board said that an indefinite suspension appears nowhere in Facebook's rule book and violates principles of freedom of expression.

"What we are telling Facebook is that they can't invent penalties as they go along. They have to stick to their own rules," Thorning-Schmidt said in an interview with Axios.

She said that kind of arbitrary decision, made on the fly, has helped fuel claims that Facebook is biased.

"We will only get rid of this talk that Facebook is leaning towards certain political opinions when we get to a stage when all decisions on Facebook and Instagram are taken with transparency and clarity and where all users are judged by the same standard," she said.

Casting doubt on Facebook's "newsworthiness" policy

The board also pushed Facebook to be more transparent about how it treats political leaders and other high-profile accounts in a set of broader recommendations.

The board said the company should generally apply its rules equally, no matter whether the user is the president or an average citizen.

But it acknowledged that people with big audiences, such as politicians or celebrities, can cause outsize harm — and said Facebook should act more quickly when those users break the rules.

That's different from how Facebook — and Twitter for that matter — currently treats politicians and other public figures. Both companies have carve-outs from their rules in matters of public interest, and Facebook's CEO has said the company should err on the side of allowing more political speech. In practice, it appeared Trump was able to get away with posting things that might have gotten the average Facebook user banned.

The board said Facebook should do a better job explaining its "newsworthiness" policy and how it applies to "influential accounts." Under that policy, Facebook doesn't take down posts that break its rules if the company thinks they are "newsworthy and in the public interest." (Facebook said it never applied this policy to any of Trump's posts.)

The board said the opaqueness of the newsworthiness policy makes it seem like Facebook "may be unduly influenced by political or commercial considerations" — in other words, that it's dodging criticism from Republicans or looking out for the bottom line.

"The board's job is to make sure that Facebook is doing its job"

The board's criticism didn't stop at Facebook's imposing what it called a "vague, standardless penalty." It slammed the company for trying to outsource its final verdict on Trump.

"Facebook has a responsibility to its users and to its community and to the broader public to make its own decisions," Jamal Greene, another board co-chair and constitutional law professor at Columbia, said Thursday during an Aspen Institute event.

"The board's job is to make sure that Facebook is doing its job," he said.

Tensions over the scope of the board's role also surfaced in its revelation that Facebook declined to answer seven of the 46 questions the board asked about the Trump case.

The questions Facebook refused to answer included how its own design and algorithms might have amplified the reach of Trump's posts and contributed to the Capitol assault.

"The ones that the company refused to answer to are precisely related to what happened before Jan. 6," Julie Owono, an oversight board member and executive director of the digital rights group Internet Sans Frontières, said at the Aspen Institute event.

"Our decision says that you cannot make such an important decision, such a serious decision for freedom of expression, freedom of speech, without the adequate context."

"They're acting like they're bigger than government"

Critics have seized on these shortcomings — such as the board's inability to force Facebook to answer questions it doesn't want to, and its lack of any legal or enforcement authority — to make the case that the board is little more than a fig leaf for Facebook's lack of accountability.

For many people across the political spectrum, the decision this week confirmed whatever opinions they already held.

Lawmakers seized on the opportunity. House Minority Leader Kevin McCarthy, R-Calif., promised to "rein in big tech power over our speech" if Republicans regain control of the chamber.

Sen. Elizabeth Warren, D-Mass., said she was glad Trump would not return to Facebook but renewed her call to break up Silicon Valley giants. "I don't think that Facebook ought to have this kind of power," she told Cheddar News. "We need to break up these giant tech companies, and Facebook is one of them. They are crushing competition and in cases like Facebook, they're acting like they're bigger than government."

Rashad Robinson, president of the civil rights group Color Of Change, told NPR the board is a "distraction" from what needs to be done to force change at Facebook: congressional regulation of tech giants and their powerful leaders, such as Facebook CEO Mark Zuckerberg.

"The question will be, will our elected officials step up and stop allowing this unaccountable single billionaire person to have this type of outsized power in our democracy and our economy and our media?" he said.

But as unhappy as critics are with executives such as Zuckerberg and Twitter's Jack Dorsey making hard calls about online speech, there is resistance to the idea the government should get involved.

Oversight Board co-chair Thorning-Schmidt said she was concerned about autocratic governments stifling free expression online.

"This [Oversight Board] might not be the perfect solution, but it is much better than Facebook doing it themselves or a government taking these decisions," she told Axios. "It might not be a perfect setup, but I challenge anyone to come up with a setup that is better."

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

SCOTT SIMON, HOST:

Facebook has almost 2 billion daily users, more spending power than many countries, and now it even has its own version of a Supreme Court to pass judgment on the toughest decisions that it makes about what people can and cannot say on the platform. This week, that court ruled on the biggest question yet, whether Facebook should let former President Donald Trump back onto its platforms. NPR tech correspondent Shannon Bond has been following this story. We ought to note Facebook is among NPR's financial supporters. Shannon, thanks so much for being with us.

SHANNON BOND, BYLINE: Glad to be here, Scott.

SIMON: And help us understand what happened this week.

BOND: Well, as you'll remember, shortly after a pro-Trump mob stormed the Capitol on January 6, Facebook suspended Donald Trump from Facebook and Instagram. They said he was praising violence, and that broke its rules. The penalty was an indefinite suspension. And so what it did was it asked this new Oversight Board, this advisory panel that it's created to review big decisions, to decide two things, both if it was right to kick Trump off and if he should be let back on. And so then this week, we heard from the board.

SIMON: And they said?

BOND: Well, the board said the suspension was justified - right? - that Trump did break Facebook's rules. But it said an indefinite suspension is not something that's in Facebook's rulebook, and it goes against human rights principles. So the board says Facebook needs to make a decision. It needs to either ban Trump permanently or put a timeframe on the suspension when he will be allowed back on. Here is board co-chair Helle Thorning-Schmidt - she's a former prime minister of Denmark - in an interview with Axios.

(SOUNDBITE OF ARCHIVED RECORDING)

HELLE THORNING-SCHMIDT: And what we're telling Facebook is that they can't invent penalties as they go along. They have to stick to their own rules.

BOND: And what's more, the board slammed Facebook, actually, for trying to offload this decision to the board. It said that's just not its role.

SIMON: Shannon, does this decision wind up saying more about Facebook than Donald Trump?

BOND: Well, I think, you know, it's really interesting. Like, the board here is getting at a criticism that lots of people have about Facebook - right? - that its decisions seem arbitrary. It's not always clear what its rules are, why they're being applied, if they're being applied fairly. And the board members said, you know, in many cases it's that lack of clarity that's helped fuel these persistent claims we hear that Facebook is politically biased. So I think it is about Facebook. For its part, Facebook says it's going to review this ruling, and it's going to come back with what it calls a, quote, "clear and proportionate action." So we'll sort of wait to see just what it does with what the board has told it.

SIMON: Let's remind ourselves, Facebook created this board to hold itself accountable. Did the board's decision satisfy anyone?

BOND: Pretty much not. For people across the political spectrum, I think this decision seemed to confirm whatever they sort of thought about Facebook and the board beforehand. We've heard from Republicans, like House Minority Leader Kevin McCarthy, who said this is just more evidence of why Big Tech has too much power. You know, Republicans continually accuse these companies of censorship. He says this is why these companies need to be reined in with new laws. And then there are other critics who say, look; just what Facebook is doing with this oversight board, it's just a way of ducking responsibility and that what's really needed is some sort of much more independent accountability.

SIMON: But what would that look like? I mean, after all, it's Facebook's money.

BOND: Right. Well, so for one view of that, I spoke with Rashad Robinson. He heads the civil rights group Color of Change. Here's what he told me.

RASHAD ROBINSON: The question will be, will our elected officials step up and stop allowing this unaccountable single billionaire person to have this type of outsized power in our democracy, in our economy, in our media?

BOND: You know, so what he's saying there is, you know, it's really Congress that needs to step in here, regulate these tech giants, regulate the power of billionaires like Facebook CEO Mark Zuckerberg. You know, and new government regulations, that's something a lot of people are talking about. But, of course, that gets into areas that raise their own thorny questions. I mean, just how much should the government be involved in deciding what people can say online? That's - you know, that is uncomfortable territory for a lot of people.

Now, Helle Thorning-Schmidt of the oversight board, she said she doesn't think governments should be making these calls. She says, yes, people are also clearly unhappy about the companies making these calls. So at least from where she stands, the oversight board may just be the best option.

SIMON: NPR tech correspondent Shannon Bond, thanks so much.

BOND: Thanks for having me.

(SOUNDBITE OF MUSIC)

Transcript provided by NPR, Copyright NPR.