GZERO WORLD with Ian Bremmer
In Wikipedia We Trust?
12/12/2025 | 26m 46s | Video has Closed Captions
Wikipedia’s co-founder on trust, bias, and protecting open dialogue in a toxic digital age.
Wikipedia helped shape the internet, but can it survive today’s culture wars, misinformation, and Big Tech dominance? Ian Bremmer talks with co-founder Jimmy Wales about trust, neutrality, the Gaza page controversy, and AI’s influence.
GZERO WORLD with Ian Bremmer is a local public television program presented by THIRTEEN PBS
- The internet, like, we think of it today as being really toxic, and it is.
Like, there's so much toxicity.
- It is incredibly toxic.
- So much toxicity.
- And it wasn't when you started Wikipedia.
- Well, was it or wasn't it?
(upbeat music) - Hello, and welcome to "GZERO World."
I'm Ian Bremmer.
It's fair to say that Americans today do not agree on very much, but one thing we do seem to agree on is Wikipedia.
Widely ranked among the top 10 websites visited worldwide each year, the crowdsourced online encyclopedia is still our go-to reference for all curiosities, great and small.
But trust is fragile, and recently, there has been a growing backlash, especially on the right, against alleged bias on the site, which erupted into national view when an article on, I quote, the "Gaza genocide" appeared earlier this year.
Can Wikipedia maintain user trust in our divided culture?
And how will it adapt to the ways that artificial intelligence is transforming information?
Wikipedia co-founder Jimmy Wales joins us on the show.
Don't worry, I've also got your Puppet Regime.
(explosion booms) - Those Gringos are not messing around.
I think it's time for Plan B.
- Hello.
- [Nicolas] Senor Presidente.
- Maduro.
- But first, a word from the folks who help us keep the lights on.
- [Announcer] Funding for "GZERO World" is provided by our lead sponsor, Prologis.
- [Announcer] Every day, all over the world, Prologis helps businesses of all sizes lower their carbon footprint (bright music) and scale their supply chains with a portfolio of logistics real estate and an end-to-end solutions platform addressing the critical initiatives of global logistics today.
Learn more at prologis.com.
- [Announcer] And by: Cox Enterprises is proud to support GZERO.
Cox is working to create an impact in areas like sustainable agriculture, clean tech, healthcare, and more.
Cox, a family of businesses.
Additional funding provided by Carnegie Corporation of New York, Koo and Patricia Yuen, committed to bridging cultural differences in our communities, and.
(bright music) (air whooshes) (bright music) - Who killed JFK?
Don't worry, I'm not going to go there, but if I was going to go there, back in 2005, a four-year-old website called Wikipedia could have pointed me to a surprising suspect.
Not Lee Harvey Oswald, not the CIA, but a seasoned Tennessee reporter named John Seigenthaler.
On May 26th, 2005, an anonymous Wikipedia editor created the following five-sentence biographical entry about Seigenthaler.
It stated, "John Seigenthaler Sr.
was the assistant to Attorney General Robert Kennedy in the early 1960s.
For a brief time, he was thought to have been directly involved in the Kennedy assassinations of both John and his brother, Bobby.
Nothing was ever proven."
Now, when Seigenthaler, who had been a close friend to the Kennedy family, discovered the entry, he was appalled.
- I know I was never a suspect in either of those assassinations.
I mean, I was Bobby Kennedy's pallbearer.
I lived with him six months and wrote a book with him.
I chaired the John F. Kennedy Profile in Courage Awards every year at the Kennedy Library.
I was their friend.
- Seigenthaler soon appealed directly to Wikipedia co-founder Jimmy Wales, who was able to remove the falsehood, but was unable to reveal the author's identity beyond a random IP address.
Seigenthaler wrote a scathing USA Today column about the "volunteer vandals with poison-pen intellects populating Wikipedia."
Shortly thereafter, a Nashville-based delivery driver wrote Seigenthaler a handwritten letter apologizing for what he called a prank.
And soon, everyone else moved on.
But you can read all about the incident on, you guessed it, Seigenthaler's Wikipedia page.
20 years later, Wikipedia's central paradox hasn't changed.
And we caution that Wikipedia is not a solid source, don't cite it in your essay, but we trust it all the same.
We turn to the site to learn about everything from "Dungeons & Dragons" to photosynthesis to pets with fraudulent degrees.
Chester Ludlow, MBA, may have padded his resume, but he's still a good boy.
Wikipedia is something we all agree on at a time when we don't agree on much of anything.
Conservatives watch Fox.
Liberals listen to NPR.
The days of Walter Cronkite and Edward R. Murrow are over.
And yet, at a time when trust in the United States media has hit a record low of 28%, Wikipedia remains one of the most visited websites in the country, not to mention the world.
And that is because Wikipedia's biggest liability is also its chief asset.
No one owns it.
The site's decentralized nature relies on the wisdom of crowds and on the belief that for every jerk out there lying about who killed JFK, there are five virtuous nerds waiting in their parents' basements to correct the record.
But trust gained over 20 years and trillions of site visits can be lost in an instant.
And today, Wikipedia, or Wokepedia, as Elon Musk has been calling it, is facing its strongest headwinds yet.
Can a platform built on openness and consensus survive an age defined by outrage and division?
Here to talk about all that and more, Wikipedia co-founder Jimmy Wales, who says he doesn't run Wikipedia, Wikipedia runs him.
He's out with a new book called "The Seven Rules of Trust," and he joins me now.
Jimmy Wales, thanks for joining us on GZERO Media.
- Thanks for having me.
It's good to be here.
- So, Jimmy, your new book is about trust, and at a time when there seems to be such a deficit of trust in leaders of all sorts, not just political, how do you think Wikipedia maintains that sense of commitment, engagement, and belief among the community?
- Yeah.
Yeah.
I mean, this is exactly what got me to write the book, just thinking about, gosh, we've got this enormous crisis of trust in the world right now.
Yeah, Wikipedia's gone from being kind of a joke in the early days to one of the few things people trust.
And obviously, it's not perfect, and we have to keep maintaining our principles and do the things for trust, but I wanted to reflect on, "Okay, what did we do?
What are the things that organizations need to do, that people need to do to build trust?"
And, you know, a lot of it's kind of obvious and maybe a little bit surprising in other cases.
You know, having a clear, good purpose, transparency.
And so, you know, it's so important, and I think we have to get back to a culture of trust.
- If you started Wikipedia today in this environment, would it survive?
- You know, I think so.
I think so.
But it's a fascinating question, and it's actually hard to get your head around because if you think about a world today without Wikipedia, the internet would be quite different.
I mean, Wikipedia is really part of the infrastructure of how everything works.
I mean, who knows what would be there instead and all of that.
But, you know, I think so.
And actually, one of the things, you know, the internet, like, we think of it today as being really toxic, and it is.
Like, there's so much toxicity.
- It is incredibly toxic.
- So much toxicity.
- And it wasn't when you started Wikipedia.
- Well, was it or wasn't it?
- Less so.
- I mean, I think this is the interesting thing.
So before the worldwide web, we had something called Usenet, which was like a giant message board, unmoderated and, in principle, almost unmoderateable because it was on a distributed system across lots of different mainframes and things like that, and it was notoriously toxic.
So I always say, you know, like, "Turns out we don't need algorithms to teach us to be mean to each other.
We can do it all on our own."
And so there was that, but there was also, there was this great moment of a feeling of great optimism, like, "Wow, the internet," you know?
- Creativity.
- Creativity.
- Communication.
- Innovation and all of that.
- Yeah.
- But also, I would say there was a moment of risk where, if you remember, you know, AOL was really huge, and there were services like Prodigy, CompuServe.
All complete walled gardens.
And if one of those, and AOL probably was the dominant candidate for that, had become the dominant platform, it might have become a monopoly, and the internet, the open internet might not have ever quite made it.
And then it would be really hard to start something new because you would have to sort of pay fees to be on the platform and things like that.
And so fortunately, we ducked that.
And so in many ways, starting something like Wikipedia for, you know, the next Jimmy Wales, the next young person out there, is actually probably easier than ever in a sense.
Now, starting versus succeeding, obviously, is a more complicated question, but.
- But also gaining trust.
- Gaining trust, yeah.
- Also being able to be engaged with a broader audience as opposed to being seen as supporting one group of people or the other.
I mean, is that doable in this environment?
- Well, it's hard, and it's actually- - Is it harder?
Is it harder?
- I think it is harder.
It feels to me harder than it has in the past because there's so much, the culture wars, the divisiveness, you know, all of that seems deeper than it was.
And, you know, how do you really judge that?
I think we all have a sense that that's probably true.
And, you know, even for Wikipedia, like, we get challenged on this.
You know, Elon Musk calls us Wokepedia and things like this.
- [Ian] And he's starting Grokipedia.
- [Jimmy] Grokipedia.
- Is Wikipedia suitably trusted to your standards today?
- Mostly.
Part of the way I look at Wikipedia is it's enormous, and so when we talk about certain divisive issues, divisive topics, that is actually a very small percentage of the work.
You know, I think most people have used Wikipedia in all kinds of obscure ways.
It's kind of the awesome thing about Wikipedia.
Almost anything you can think of, you can go there and see, "Who are these amazing people who wrote all this great stuff?"
On the sort of more difficult topics, I think we do a great job in some, and I think we could do better in others.
I mean, I was at a dinner many years ago now in Moscow and sitting with the editor of a major magazine, and he said, "Oh, I can make Wikipedia say whatever I want.
I just pay $100 each to a few Wikipedians, and I can change it."
I was like, "Let's talk about it.
Let's think that through," because those Wikipedians would have to be known and trusted people in the community.
They would start writing strange things.
The other community members would go like, "What are you doing?
Do you have a source?"
All of that.
It wouldn't last.
You wouldn't get very far, and in fact, it would cause a huge stir.
- Because it's a completely decentralized model.
- Completely decentralized model and a very open community.
I'm like, right now, if somebody started flooding Wikipedia with a thousand entries in one night, even pre-AI, they would be blocked.
They would be stopped very quickly, like, "What the hell are you doing?"
Like, "Stop.
Slow down.
Let's have a conversation.
What's going on?"
And so flooding Wikipedia is much harder than, say, social media where you can create thousands of, I mean, this is absolutely what they do, create thousands of accounts and send some tweets and things like that.
And at least so far, you know, AI agents aren't really able to mimic a human being for very long or for very much.
You know, you can pretty quickly tell.
And in fact, if you've got some experience, you know, it's pretty obvious, AI writing.
Not always, but, you know, it's kind of obvious.
- Where are the places that you do see a structural political problem, and what is it?
- Yeah, I mean, well, lately I've been raising concerns about our coverage of Israel/Gaza.
- Gaza.
Yeah, I've seen that.
- Simply because I think that we, at the moment, and there's a big discussion taking place about this, we say things in the voice of Wikipedia far more than we should, and I think that voice of Wikipedia, we really should have a very, very high bar for that.
It's one thing, we should absolutely accurately report and no holds barred on, you know, people saying this is a genocide, people saying it isn't, people saying whatever.
But for us to take a side would require, in my view, near unanimity within the community.
It's such an important thing.
- What's an example of a near-unanimity thing where Wikipedia is rightfully having a voice?
- Yeah, I mean, it's actually interesting.
I was looking for one the other day, because I'm writing some stuff up for the community, some thoughts about this, and I thought, "Oh, well, how about flat Earth?"
You know, clearly, we must say that the Earth is not flat, but actually, we say something like, you know, "There's a scientific consensus that the Earth is not flat," which is slightly different- - So it's not really the Wikipedia voice.
It's not.
Not quite.
So you're saying that on Gaza, you've taken the Wikipedia voice?
- Yeah, yeah.
- Why have you done that?
How has that happened?
- There's a lot of reasons, and I'm still just digging into it.
So, you know, I think there's some structural, I mean, now we're going to get super wonky about Wikipedia, but it has to do with the way the discussion, we do something called an RFC, which is like a debate discussion, not a vote exactly.
The way it was closed- - A request for comments?
- Request for comment, yeah.
The concept of Wikipedia is you want to have consensus around pretty much every edit, and that's kind of the basis for Wikipedia, but what does consensus really mean?
And that gets complicated.
So let's say you've got a binary choice like, "Which photo of the Eiffel Tower?
We're going to have this one or that one."
So somebody may start an RFC, and then people start giving their opinions.
And in that, you know, eventually, just to stop an edit war, somebody will close the discussion and say, "Okay, we're going to go with this."
And probably for something like that, it's not massively important one way or the other.
- You don't know the French very well, Jimmy.
- (chuckles) Yeah, well, 60% would be fine, and the 40% will go, "Yeah, okay, fine.
You know, most people seem to prefer the other one, so I'm going to let it go."
So that's enough for consensus.
- Hard to see that on Gaza.
- Hard to see that on Gaza.
But one of the techniques and one of the really valuable things that we do, and actually, there's a great example of this in our article about abortion, Wikipedia can't say, "Abortion is a sin," and it can't say, "Abortion is a fundamental right of women."
But what it can say is, "The Catholic Church's position on abortion is such and such, and the Pope has said this, and critics respond to that," and da, da, da, da.
It starts to sound like a Wikipedia entry.
- Because you're trying to be a resource.
- And that's exactly what it says.
It's really wonderfully written, and you say, like, "This is balanced.
This is..." Because we don't try to answer the question or take a side.
We just describe the debate.
And so, you know, where that gets tricky is like, "Okay, well, where do we draw the line and say, like, 'Well, actually, there isn't any actual real debate?'"
Consensus doesn't necessarily have to mean unanimity, but it should be pretty close, particularly if it's a really important issue.
And so that's the conversation we're having, is like, "Can we re-look at how consensus decisions are being made when you're closing an RFC and things like that?"
So, yeah, you're on the cutting edge of Wikipedia, like, discussions that are happening right now.
- So what is the Wikipedia voice on Gaza right now?
- I mean, right now, it says that what's going on in Gaza is an ongoing genocide.
- [Ian] Which is clearly one side, one specific side- - Clearly one side.
- [Ian] Of the discussion.
- I've been searching.
There are no major news services anywhere, Reuters, the BBC, nobody is in their own voice saying this is a genocide, and they do say that about other things.
And so people of good faith have not been able to participate.
So these things are going to change, I'm confident, but it's important to know my role in Wikipedia is that I'm not the editor-in-chief.
I can't make these things happen.
But I can coach people, remind people, convene people.
And I think there's a lot of people in the community who are like, "Yeah, through a series of circumstances, we've gotten to a place that isn't great, and we really need to look at a lot of the processes and figure out how did we get here," because it's not living up to what Wikipedia should be doing.
- Now, how do you deal with the other side of the issue, which is most of the editors are male.
I assume most of the editors are white.
When people think about Wikipedia and they see the globe or something that's globe-shaped on the homepage, you would think that that would represent eight billion people.
Yet, the reality of the experience is there's a Western bias.
- Yeah, so we do think it's a problem, and we think it's a problem not because we have some woke agenda or affirmative action.
You know, that's not it at all.
It's like if it impacts the quality of the content, then it's problematic.
And so there was a study done, it's been many years ago now, of authors who've won major literary prizes, so novelists who've won big prizes, and if you look at the articles about men, they are longer than the articles about women.
Now, longer- - Shocking.
- [Jimmy] Yeah, it doesn't mean better quality.
- I would be stunned if that wasn't the case.
- Yeah, yeah, and- - I'd be stunned.
- It is.
And it isn't because, and I think this is an important nuance here, it isn't because the male Wikipedians think, "Oh, novel by a woman, that's obviously not interesting and not important."
It's more like people write about what they know, and that's a problem, and so this is why we need, like, "Okay, well, this is really important.
We need people who are reading those novels.
We need more people."
And who are those people?
Well, maybe in some cases, it's going to be, "Well, that author has a majority female audience, and therefore, we need more women editing Wikipedia."
So it is something that we really do think is a problem.
- Now, the DEI approach would be, "Okay.
We just need a standard that says that these female authors are going to have, you know, sort of longer bios," which clearly is not what you want to do.
- No, no.
What we really want to do is just say, "Let's bring in more people."
One of the great fallacies of sort of Elon's approach to moderation, which is to have almost none at all, on X is the idea of like, "Oh, we should encourage free speech for everyone and that."
- Yeah, but if you've therefore created a culture where people who have a slightly different view from the majority of your users get yelled at and abused and shouted down, you know, certain types of people, basically your quiet, thoughtful, kind people- - Won't engage.
- Will be just like, "Yeah, I'm out."
It's just toxic because it's just like a horrible environment, and that doesn't work for anybody.
- Now, here you are marshaling, you know, millions of people that are involved in the site and all of these entries, most of which are not politically controversial at all.
At what point is there going to be a Wikipedia LLM that is trained on all of this stuff that allows a student or anyone on the net to query it, engage with it, and it becomes a much less biased... - Yeah, well, I mean, I would say we already have that in one sense because all of the LLMs are trained on Wikipedia data.
It's fundamental, and so there's a lot of- - Indeed.
You go to Grok, and it will use Wikipedia sources, among other things.
- Wikipedia sources, among other things.
- Yes.
- Yeah, exactly.
But I do think there is an interesting piece there that is a little bit different, which is to say our search experience, so if you go to Wikipedia and you go to our search box and you type, "Why do ducks fly south for winter?"
Basically, the search engine has no idea what you're talking about.
It's keyword-based- - Right, you want an engaged- - And you probably will find ducks in winter, you know?
Whereas with a large language model where you could actually ask a question like that and it would respond not with its own made up stuff, but respond with exact quotes from the relevant Wikipedia entries and tell you, "Oh, the answer to that is in the article on bird migration-" - Yeah, that'd be incredible, right?
- It'd be great, and so I do think that we will be moving in that direction- - Are you already working on that?
- Yeah, yeah, yeah.
So we have a machine learning group who are doing all kinds of experiments and things like this.
We're very ideological about open source, and so we're probably not going to partner with Google or OpenAI or anybody like that to sort of replace our search engine with the proprietary thing they've built for us.
That's just not our style.
We really value our independence, and so the open source is a big part of that.
And so that means we'll have to develop something ourselves, and so on and so forth.
So it'll take some time.
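[Editor's note: As a rough illustration of the search experience Wales describes, here is a minimal sketch in Python. It is not the Wikimedia Foundation's implementation; only the public MediaWiki API calls (keyword search and plain-text extracts) are real, and the final quote-and-cite step is a hypothetical placeholder where an open-source model would go.]

```python
# Sketch of "ask Wikipedia a question": keyword retrieval today, plus the
# retrieval step an open model could answer from with exact quotes.
# Only the MediaWiki API calls are real; answer_with_quotes() is a
# hypothetical placeholder, not anything Wikimedia has shipped.
import requests

API = "https://en.wikipedia.org/w/api.php"

def search_titles(question, limit=3):
    """Keyword search, roughly what the current search box does."""
    params = {
        "action": "query",
        "list": "search",
        "srsearch": question,
        "srlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params, timeout=10).json()
    return [hit["title"] for hit in data["query"]["search"]]

def fetch_extract(title):
    """Fetch the plain-text lead section of an article to quote from."""
    params = {
        "action": "query",
        "prop": "extracts",
        "exintro": True,
        "explaintext": True,
        "titles": title,
        "format": "json",
    }
    pages = requests.get(API, params=params, timeout=10).json()["query"]["pages"]
    return next(iter(pages.values())).get("extract", "")

def answer_with_quotes(question, passages):
    """Hypothetical step: an open model would compose an answer here,
    quoting the retrieved passages and citing the source articles."""
    raise NotImplementedError("placeholder for an open-source model")

if __name__ == "__main__":
    question = "Why do ducks fly south for winter?"
    titles = search_titles(question)            # might surface "Bird migration"
    passages = {t: fetch_extract(t) for t in titles}
    for t in titles:
        print(t, "->", passages[t][:120], "...")
```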
- But this is the key point, right, to close, is that, I mean, at one point, the internet felt a lot more decentralized, and today, the internet feels a lot more centralized, the platforms and their algorithms, and Wikipedia is absolutely contrary to that.
Are there a couple other things happening, like open source, for example, where you say, "You know, the pendulum might be starting to swing back towards decentralization?"
- Yeah, I do.
I mean, one of the things I really love is Signal, the messaging app because it's run by a nonprofit, just like Wikipedia.
They're super ideological about, you know, like, their business model is not going to take them down a path that leads them to wanting to break encryption or whatever.
I think that's fantastic.
- And even cabinet secretaries use it, apparently.
- Yeah, yeah, exactly, exactly.
And, I mean, that's hilarious.
And then, oh, yeah, another thing.
I think something that's really interesting that's going on in the world of AI is the open models.
So the open-source software, the open models, are basically a half-step behind, several months behind the cutting-edge proprietary models, and I think that's super interesting and actually really important, because it does mean that we are going to have a much greater diversity of what's going on in that space.
And certainly, it poses an interesting challenge for regulators because you can tell, like, if it's only five big companies that can do it, you can regulate five big companies.
You just tell them what to do, and they're going to have to comply and all that.
You can't regulate 250,000 open-source developers running machines, you know, on their own dime and so forth.
- Because it's going to be decentralized.
- It's going to be decentralized.
I just think that's an interesting space where, you know, the question is, "Are we going to have a handful of big companies who are absolutely running our lives, or can we do that on our local computer?"
That's something where I do think there is this decentralized versus centralized thing going on.
- And that tension persists.
- And it persists, yeah.
- Jimmy Wales, thanks so much for joining.
- Thanks.
Good to be here.
(bright music) - And now to Puppet Regime, where US President Trump engages in last-minute diplomacy with Venezuelan President Nicolas Maduro ahead of a potential invasion.
♪ Imagine all those peoples ♪ ♪ Fleeing or sitting in jail ♪ (explosion booms) Those gringos are not messing around.
I think it's time for Plan B.
- Hello.
- [Nicolas] Senor Presidente.
- Maduro.
- (speaks in foreign language) Let's make a deal to stop this crazy war.
- I'm listening.
- You like to deport people.
- [Donald] I do.
- And I need to import people.
- I can hear Stephen Miller's tail wagging already.
- Send me the Democrats.
- The what?
(Nicolas speaks in foreign language) - Deport them (speaks in foreign language) to Venezuela.
- High crime and rigged elections?
They're going to love it there.
- It's a paradise.
- Mamdani, I can't live without now, but I'll send you AOC.
- We can use their help to rebrand for (speaks in foreign language).
- Would you even take the super annoying ones like Newscum?
(Nicolas speaks in foreign language) - Gavin and I both have beautiful hair, and each have lost millions of our people to Texas and Florida.
- What about Pelosi?
- Yes, her net worth alone will support the Venezuelan economy for years.
- All right, and I assume you're taking Jeffries and crying Chuck Schumer?
- Those are the two useless ones that do nothing, yes?
- Absolutely nothing.
- Deal.
- You know, I can't believe you're willing to import a potentially serious political opposition.
(Nicolas and Donald laugh) - (speaks in foreign language) You almost killed me there.
- You never know.
I still might.
♪ Puppet Regime ♪ - That's our show this week.
Come back next week.
And if you like what you've seen or even if you don't, but you want to edit my Wikipedia page, why don't you first check us out at gzeromedia.com?
(upbeat music) (upbeat music continues) (upbeat music continues) (bright music) - [Announcer] Funding for "GZERO World" is provided by our lead sponsor, Prologis.
- [Announcer] Every day, all over the world, Prologis helps businesses of all sizes lower their carbon footprint (bright music) and scale their supply chains with a portfolio of logistics real estate and an end-to-end solutions platform addressing the critical initiatives of global logistics today.
Learn more at prologis.com.
- [Announcer] And by: Cox Enterprises is proud to support GZERO.
Cox is investing in the future, working to create an impact in advanced recycling and in emerging technology companies that will help shape tomorrow.
Cox, a family of businesses.
Additional funding provided by Carnegie Corporation of New York, Koo and Patricia Yuen, committed to bridging cultural differences in our communities, and.
(bright music) (bright music)
