
Meta drops fact-checking, critics fear misinformation spike
Clip: 1/7/2025 | 5m 44s
As Meta drops fact-checking, critics fear it could pave the way for a misinformation spike
Meta announced it's ending third-party fact-checking on its platforms, calling the decision a return to a “fundamental commitment to free expression.” CEO Mark Zuckerberg said the rules had become too restrictive and prone to over-enforcement. Geoff Bennett discussed the implications of this shift with Renee DiResta of the McCourt School of Public Policy at Georgetown University.
GEOFF BENNETT: Facebook and Instagram's parent company, Meta, announced today it's ending third-party fact-checking on its platforms, calling the decision a return to a -- quote -- "fundamental commitment to free expression."
Meta's fact-checking program was rolled out in the wake of the 2016 election.
CEO Mark Zuckerberg said today the rules had become too restrictive and prone to overenforcement.
MARK ZUCKERBERG, CEO, Meta: We built a lot of complex systems to moderate content.
But the problem with complex systems is, they make mistakes.
Even if they accidentally censor just 1 percent of posts, that's millions of people.
And we have reached a point where it's just too many mistakes and too much censorship.
The recent elections also feel like a cultural tipping point towards once again prioritizing speech.
So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms.
GEOFF BENNETT: To discuss the implications of this shift, we're joined now by Renee DiResta, associate research professor at the McCourt School of Public Policy at Georgetown University.
Thanks for being with us.
RENEE DIRESTA, Associate Research Professor, McCourt School of Public Policy, Georgetown University: Thanks for having me.
GEOFF BENNETT: So, let's set the stage.
Why did Meta initially put this fact-checking program into place?
And was it effective?
RENEE DIRESTA: Yes.
So, it was launched in December 2016 in response to widespread criticism of fake news that had gone viral quite a bit during the 2016 presidential campaign.
And the platform faced a lot of backlash in response to that.
So the fact-checking initiative was launched as part of Facebook's efforts to kind of restore its brand, restore trust.
It partnered with the third-party fact-checking organizations that were certified by the International Fact-Checking Network.
So, it went to existing organizations that were already quite reputable.
And it worked with them to add context.
They came up with a moderation framework called remove, reduce, inform.
Remove is when content is taken down.
Reduce is when it's reduced in distribution, it's not pushed out to as many people.
And the fact-checking piece was a really big part of inform, which tried to add a little bit more information to the stories that were going viral or articles that people were seeing in their news feed.
GEOFF BENNETT: Meta says it's moving to a community notes practice, similar to what we see now on Elon Musk's X, formerly Twitter.
What has the impact on that platform been?
Can community notes be as effective as fact-checking?
RENEE DIRESTA: It's a little bit mixed.
So it's very hard to know what the impact of community notes is on X.
It's hard for us on the outside as academics and things to see it because a lot of data access and transparency has been reduced.
I think that community notes is a great way to restore legitimacy to content moderation, but it doesn't necessarily actually do the job that fact-checking did in quite the same way, so better to have it as a complement.
And that's because it's often slow.
It addresses just a very small fraction of the content.
It really relies on people wanting to sit there and feeling like they should go and perform that almost, like, platform community service.
On X, oftentimes, that means that you will see it happen on highly contentious political content, where people feel some sort of emotional response.
They want to go correct the record about their guy, that kind of thing.
And so you see efforts to get community notes on that type of political content.
But on the flip side, it's platforms asking users to do work for them, and it is not necessarily going to catch all of the kind of topical coverage that a professional fact-checker might. A journalist fact-checker has more access -- the ability to call somebody up and ask them if something is true, the ability to send somebody into a conflict zone to see if something is real.
So it should be a complementary process, but because this has become so politicized, we're seeing it broached as a replacement, rather than a complement.
GEOFF BENNETT: Let's talk more about the political dimension here, because Zuckerberg in his video statement, as we saw, he framed this policy shift as a reaction to Republicans' November victory.
And we heard him say -- he called it a cultural tipping point towards once again prioritizing speech.
We know he's visited Mar-a-Lago, he's dined with president-elect Trump, he's donated to the Trump inaugural fund.
He just named Dana White, the CEO of UFC and a longtime Trump ally, to Meta's board.
And here's what the president-elect, Donald Trump, said today when he was asked about this shift by Meta.
DONALD TRUMP, Former President of the United States (R) and Current U.S. President-Elect: Honestly, I think they have come a long way, Meta, Facebook.
I think they have come a long way.
QUESTION: Do you think he's directly responding to the threats that you have made to him in the past?
DONALD TRUMP: Probably.
Yes, probably.
GEOFF BENNETT: So what are the downstream implications of the political motivation behind all this?
RENEE DIRESTA: It is probably in response to the threats -- and we call that jawboning, and we actually should see that as bad.
We should see it as bad when the left does it and we should see it as bad when the right does it.
We should not want to see platforms that are supposed to be providing a service of value to their users, right, that are supposed to be facilitating a speech environment that protects the user, that creates a good experience for the user.
That's what the platform should be doing.
Working the referees, trying to make the people deciding the calls advantage your team, that's what's actually happening here, right?
It is capitulation to ref-working.
And, you see, if Meta had come out and said, we are launching this fantastic new community notes initiative, that would have been absolutely great.
And that would have been just a routine feature set policy shift from a large social media platform that does that constantly.
But it was the tone of the communication.
It was the specific language used in it that was very transparently saying, we are doing this in response to a shift in the political winds.
And I just don't think that we should want to see our social media platforms quite so buffeted by political winds.
GEOFF BENNETT: Renee DiResta, thanks for your insights.
We appreciate it.
RENEE DIRESTA: Thank you.