Board member explains decision to keep Trump off Facebook for now, and why he may be back

Four months after Facebook indefinitely suspended former President Donald Trump's account, the company's oversight board backed the initial decision to throw him off the platform at the time. But the board may have opened the door to allowing Trump back on this fall. John Samples, vice president of the libertarian Cato Institute, is a member of the board and explains the decision to Stephanie Sy.

  • Judy Woodruff:

    Now we return to Facebook and former President Trump.

    Stephanie Sy reports on the questions around his suspension from the social media platform.

  • Stephanie Sy:

    Judy, punted, kicked the can down the road, that's how some critics described the findings of a Facebook Oversight Board.

    Before we dive further into this, we will remind viewers of what this board is. It is an international group of about 20 individuals from different fields and disciplines. Facebook set aside $130 million last year for its operation, including salaries. But the company says it is independent.

    Today, that board said the suspension of former President Trump following January 6 was appropriate at the time, but that the ban shouldn't have been indefinite. It recommended that Facebook uphold the ban for six months, at which point Facebook would have to determine whether to let Mr. Trump back on.

    The board also said Facebook should set more explicit policies and limits for crisis situations, not arbitrary judgments. And it said the company should act more quickly when political leaders' posts can create harm.

    John Samples is a member of that board. He's a vice president of the libertarian Cato Institute. I spoke with him earlier.

    So, John, when Mark Zuckerberg formed this Oversight Committee, he called it the Supreme Court of Facebook, indicating that its decision would be a final verdict. But it doesn't seem the committee has delivered a final verdict when it comes to whether former President Trump's account should be banned completely, deleted from Facebook or reinstated.

    Why didn't you come to a final decision?

  • John Samples:

    Well, we are like the Supreme Court, in the sense that we are overseers of proper process.

    So, in this sense, we are being very hard here on Facebook. We're saying to them, you would like for us to make this decision, perhaps. Maybe they want us to. I don't know. But we're not going to do this.

    The proper role in — both in the American political system and throughout the world with regard to users of Facebook is for Facebook to make the decision about Mr. Trump.

  • Stephanie Sy:

    But don't you, as a committee, make decisions on suspensions and bans and deletions in other cases? Why not for Mr. Trump? Why put the onus back on Facebook, when your committee consists of legal scholars like yourself, journalists, human rights and civil rights advocates?

    Who better at Facebook then to make these tough decisions where they have to weigh free speech with harm, potential harm to society and democracy?

  • John Samples:

    Facebook, ultimately, in making this decision, is going to be accountable to its users and, indeed, accountable to the larger society, both in the United States and elsewhere, that — where it works.

    So, I think, really, while it seems like we're ducking the issue, perhaps, to some, it's exactly the opposite. We're trying to put the onus of responsibility exactly where it belongs and where people can respond to it in a sensible and really democratic way.

  • Stephanie Sy:

    But you did decide, in a way, to uphold the current ban on President Trump. You basically extended it or recommended the extension of it, right, by six months.

    That move, in and of itself, John Samples, is being criticized by a number of conservative political leaders. Now, you, in your other job, work at the Cato Institute, which often decries any limits on free speech.

    So, I wonder, how do you respond to those that argue that Facebook is censoring conservative speech?

  • John Samples:

Well, I would say, first of all, we did not reach the question of whether this was a biased decision or whether there's political bias against conservatives or any of those issues.

    They did not come up in this case. I would say that, on the issue of free speech, yes, Facebook is dedicated to that, and certainly the board, the people who are on it have a dedication to that.

    But both in the United States and throughout the world, and according to international norms, there are limits on freedom of speech. And some of those limits might be described as speech that does imminent harm. That's a crucial word there, imminent, that is, causes violence or other kinds of harms, and it's not speculative. It's happening right now.

    In this case, we were dealing with an incident in which there was a riot going on at a national capital during a constitutional method of selecting the president. Mr. Trump posted a couple of things. This is only about a couple of his posts on Facebook.

And Facebook thought that they praised people involved in that riot. And looking at the facts, we agreed that the rules had been violated and they were justified in suspending the account.

    But they applied a response that itself was not in their legal — the framework of rules. And, therefore, they have to go back and do the job right.

  • Stephanie Sy:

    I think that many in our country feel there is a problem of misinformation gaining a greater foothold on this society and this democracy because social media platforms and their algorithms are designed to make certain content spread like a virus.

    This is a problem that a lot of people will acknowledge is before us as a democracy. Do you think your committee made recommendations to address that problem?

  • John Samples:

    We did, indeed.

In the advisory, you will see that it is recommended that Facebook go back and look at its own performance during this period and see to what extent the way the platform is designed or other factors might have contributed to this outcome.

    I have to say, at the same time — and this is perhaps a more personal view, but maybe not, because free speech advocates are certainly on board — I'm always a little worried when the term misinformation comes up.

    Yes, it does exist and it can do harm, but, often, it's also true that that can be used in a way that would limit free speech. So I think we have got to be looking at that like hawks on both sides of it. And we certainly hope that Facebook will — and maybe it's already started — to look at its own methods and its own rules and what its — its model for how it might have contributed to the problems we see here.

  • Stephanie Sy:

    John Samples, member of the Facebook Oversight Board and vice president of the Cato Institute.

    John, thank you.

  • John Samples:

    Thanks for having me.
