The Open Mind
Wikipedia's Survival and Resilience
11/11/2024 | 28m 19s
Wikimedia leader Rebecca MacKinnon discusses the future of the encyclopedia platform.
HEFFNER: I am Alexander Heffner, your host on The Open Mind.
I'm delighted to welcome Rebecca MacKinnon to our broadcast today.
She is Vice President for Global Advocacy at the Wikimedia Foundation.
As our loyal viewers and listeners will know, we've hosted every single one of the foundation's executive directors, starting with the inaugural executive director, Sue Gardner, through the current one, Maryana Iskander, whom we hosted most recently in 2022.
MACKINNON: Well, thank you very much for having me on, and I'm sorry it's taken so long.
HEFFNER: No worries.
You're doing important work that requires a lot of attentiveness in relationships with governments all over the world.
I thought I'd start just by asking you, we last hosted your executive director in 2022, now we are in 2024.
Can you just give us an update about what, if anything, has changed?
MACKINNON: A lot is happening in the world, as you know, and my role really relates to how both the foundation and our broader volunteer communities and affiliate organizations meet the challenges and opportunities that the world is dealing us.
And one huge development that has happened since 2022 is ChatGPT, and all large language models, including ChatGPT, depend very heavily on Wikipedia content.
Wikipedia is a major source for these large language models.
So the global information environment is evolving very quickly.
And so there are lots of discussions and debates, both internally within the foundation about how we keep up technologically with fast-paced developments, and also in the community about what this means for Wikipedia as people keep editing it; so, for instance, the prisoner exchange between the US and Russia already has a Wikipedia page, right?
So are people around the world going to get the information that's on that page from the page itself, or are they going to obtain that same information through their queries to ChatGPT or some large language model being used by Google or Microsoft or whoever the individual is interfacing with?
Right?
So this is both a challenge and an opportunity in a lot of different ways, right?
It means that the work that Wikipedia editors are doing is more important than ever and is having a greater reach than ever.
It also means that the need to make sure our communities are robust and diverse, with strong internal governance processes and robust defenses against malicious actors who try to put disinformation on any given Wikipedia page or threaten volunteers, is more important for the world, and for people's ability to know the truth, than ever before.
That goes to the quality of any given Wikipedia page, because, as you know, Wikipedia is edited by volunteers.
I don't know if you have a Wikipedia account or if you've edited Wikipedia, but many people watching and listening to us today may have.
You can create an account and start editing, and the more experience you gain editing in a credible, reliable way, the more authority you get as an editor; the editors vote to give special privileges to people with the greatest authority and credibility.
And of course, they're setting rules, right?
They're setting rules for what constitutes reliable sources, what is allowed on a given page.
And the rules for medical topics are of course, different than the rules for, say, celebrity pages, in terms of what constitutes a reliable source.
But the need to support that system of governance so that it's robust is stronger than ever, right?
So the pressure on the foundation to do that well, and to help the communities do that well, is more important, I think, for the world than ever.
It's also a fact that we know from independent research that the quality of a page depends on not only the quantity of people who are editing a particular topic, but also the diversity of people editing a topic, right?
Because a page is more resilient against anybody trying to manipulate it or to impose a point of view when there's a diversity of people, viewpoints, and backgrounds editing that page.
So the work of the foundation to support the communities across the projects, not just Wikipedia but Wikimedia Commons, Wikidata, and the other projects that support Wikipedia content, and to ensure that people from all backgrounds and parts of the world feel safe and welcome not only accessing Wikipedia but editing it, is more important than ever for the quality of the information that everybody around the world increasingly depends on and that large language models are using and sharing.
And so that brings us to a couple of real fundamentals that are also policy fundamentals.
One is the absolute importance of security and privacy: protecting the privacy and security of our editors is paramount, as is ensuring that governments passing laws or engaging in national security activities are not exposing our editor community, along with people all over the internet, to unaccountable surveillance.
We believe encryption is very important.
The ability to edit Wikipedia with a pseudonym, and not be doxxed by people who don't like the truth you happen to be putting on a page, is incredibly important.
That is a risk we find in all sorts of countries, and we have a human rights team, a trust and safety team, and a security team that are all helping editors, and of course freedom of expression protections by governments matter as well, right?
HEFFNER: How long have you been at the foundation, Rebecca?
MACKINNON: Coming up on three years.
HEFFNER: So since you've been at the helm, where has there been progress in terms of opening up more access, and where has there been any regression?
MACKINNON: That is a good question.
Just to be clear, I'm vice president for Global Advocacy.
I'm not at the helm of the whole thing.
And again, we are the foundation that supports the volunteer-run projects, of which the volunteers are at the helm.
HEFFNER: Just to clarify what at the helm might mean.
At the helm as the ambassador, right, on behalf of the volunteers, editors.
MACKINNON: Absolutely.
Yeah.
HEFFNER: Engaging with these countries where there could be- MACKINNON: Yes, exactly.
Serving our global constituency.
That is a really good question in terms of what has opened up.
To be honest, there hasn't been a dramatic opening up anywhere since I joined the foundation.
There have been some victories in terms of policy but there are a number of concerning trends that relate to concerns about the spread of what some people would call digital dictatorship, networked authoritarianism around the world.
In terms of being blocked, we've been blocked in China for a very long time.
We were blocked in Turkey for a while, and then that got lifted, thanks to a court case.
We were blocked briefly in Pakistan.
But in terms of censorship and freedom, it's less about blocking and not blocking, or open and not open.
It's more about levels of control and how free people are, whether it's human rights groups and human rights defenders sharing information, or anti-corruption researchers sharing facts online, or independent scientists sharing scientific research online, regardless of whether it comports with their government's dominant interests or other corporate interests; whether academics can openly share their research online; and whether Wikipedians feel safe sharing facts online.
So that's kind of one piece.
And to what extent is a government engaging in what, in the field of digital rights studies, is often called internet controls.
So in many countries you see some censorship.
So for example, Russia.
A lot of websites have been blocked.
A lot of news organizations have been blocked.
Wikipedia has not been blocked, but governments have a lot of other tools in the toolkit beyond just blocking.
A big tool is disinformation.
So one thing to look at, and Freedom House and their Freedom on the Net index tracks this as well when they're looking at levels of internet freedom.
It's not just overt censorship or how many demands is a government making on internet platforms to take down content or who are they blocking, but it's to what extent is that government engaging in disinformation campaigns to not only inject narratives that are in their interest that may not be factual, but also what are they doing to discredit those who are providing sources of independent information.
So it’s spreading the stories they want to tell and also spreading stories to discredit the facts, right?
And we are seeing that growing in all kinds of countries.
Sometimes you see it in countries that are relatively free, where a particular political party or political actor, or sometimes private sector actors, will engage in this behavior.
There’s also again the surveillance piece.
So again: how safe do the people who are challenging those powerful entities, the ones who prefer a particular story to be told, feel contradicting those stories and sharing information online?
And that is a constant up and down in lots of places.
I would say I think there's a struggle going on in the United States about what online freedom means and doesn't mean, and what facts are and what they aren't.
There are struggles all across the democratic world around that.
There has been plenty of public reporting about parallel struggles in India, in relation to the government's relationship with promoting particular narratives.
But in Europe, there are a lot of struggles going on.
The European Union, I should say, is one of those positive stories: a couple of years ago it passed the Digital Services Act, which is meant to hold internet platforms accountable to some basic human rights standards.
And while the act isn't a hundred percent perfect, it's compatible with human rights standards; it seeks to protect information integrity and to give people ways to identify and report problems in the internet ecosystem and for solutions to be found.
So we are seeing some positive efforts as well, but it's complicated.
And again, we're engaging in robust advocacy everywhere to protect freedom of expression online.
HEFFNER: Where are you hopeful that you will advance in normalizing free expression, the democratization of knowledge in the coming year?
Does the foundation look at particular targets of where that might be viable based on the seeds of democracy being planted or the seeds of free expression being planted?
MACKINNON: Well, we're not in the regime change business. We're not Human Rights Watch.
What we are is really working to protect and advance the rights of our global editor community.
So it's less about targeting this government or that regime and saying, you know, we're trying to see a different regime; that's not what we're doing.
But what we are doing is identifying laws, everywhere.
So for example, there have been a couple of laws in the United States, in Texas and Florida, that would open up Wikipedia volunteers to lawsuits, which would be a disincentive to people editing Wikipedia in the United States.
And so, you know, we filed a brief with the Supreme Court in the case NetChoice v. Paxton.
And, you know, while the results of that case basically got sent back to the states, we're hopeful that things will come out in the right way.
So like, even in the United States- HEFFNER: Just to clarify, Rebecca, you're saying that Wikipedia contributors or editors are potentially susceptible to some kind of liability?
MACKINNON: So in the United States, right now, there's a law known as Section 230 of the Communications Decency Act that shields platforms from liability for what their users post online on their platforms.
That has been under challenge from people in both major parties in the United States.
But in addition to that, there are laws that were passed in Texas and Florida that challenge not just the First Amendment rights of the platforms to moderate content according to their rules, but would basically challenge the ability of others as well.
So, under the Texas law in particular, if a platform removes content that someone construes as political speech, then not just the platform but, in the case of platforms where volunteers are involved with removing content, those who remove the content could be open to lawsuits.
So obviously, here's the problem for Wikipedia, right?
We're supposed to be an encyclopedia.
It's not a place for political opinion.
So if somebody posts on the article about Donald Trump, their opinion, for or against Donald Trump, and it's not a neutral, well-sourced, fact-based set of sentences, it will be taken down.
Under the law in Texas, that removal, again, it depends on how the courts end up interpreting it, but based on the words in that law, could potentially open up Wikipedia editors to being sued for having "censored" political speech, when the whole purpose of Wikipedia is not political speech in the first place.
Right?
So I guess the point is it's complicated.
There are a lot of people involved, and this is why we don't look at protecting our users on a country-by-country basis.
There are threats that come up to our communities, to our editor communities everywhere.
In Portugal, there was a rich guy who didn't like what was posted about him on Wikipedia and engaged in a lawsuit against us that we're still fighting.
And again, those types of lawsuits, known as SLAPP suits, are used against media organizations quite a lot.
In jurisdictions where those lawsuits succeed, there's a chilling effect on publishing information, even factual information, that people who can afford to launch lawsuits don't like.
We are having to defend our editor communities in a lot of contexts that go beyond just the standard dictatorship-or-democracy kind of context.
HEFFNER: What are you doing to safeguard not just Wikipedia as it stands right this moment, but the entire body of the history of this encyclopedia?
It is more editions than Britannica.
It's not by the year, it's by the second.
When we talk about securing the internet, I could think of nothing more important to secure than Wikipedia, and I hope you have as many servers and cloud protectors as Apple and Alphabet and do reassure us that you do.
MACKINNON: Well, we're a nonprofit.
We rely on donations for our budget to hire staff; the Wikimedia Foundation is about 700 people.
That's a lot smaller than those commercial platforms.
Most of our staff are engineers and software developers, including a lot of people who are keeping the sites up and running.
But, yeah, our resources depend on the ability of everyone to support us.
HEFFNER: But you do have security apparatus?
MACKINNON: We have a security team.
We have a trust and safety team.
We have a human rights team.
Yes, we absolutely do have a site reliability engineering team keeping the sites up and running.
We have all those types of roles that the big commercial platforms have, but we have to do it more efficiently, with fewer resources than they have.
Most of our budget comes from those average $15 donations, from people clicking on banners.
So, if you find it valuable, please contribute.
To come to your other point.
What are we doing to protect people?
And this is where the advocacy job and our public diplomacy come in: we get asked by people at the State Department, at the European Union, by lots of governments, what should we be doing on AI regulation that will most help Wikimedia and Wikipedia to thrive?
I say it's not just about regulating AI, it's about protecting the human beings who produce the content, without which large language models and ChatGPT are going to become a bunch of regurgitated, recycled, hallucinatory crap.
Right?
And disinformation. If you don't want generative AI to be that, it's not just about regulating AI.
It's about making sure that your laws and regulations protect the privacy and freedom of expression of human beings, and not just Wikipedians, but independent researchers, people doing open data projects, anti-corruption activists, et cetera.
If people cannot get that information online safely, we will not have trustworthy, factual information online and the whole thing will collapse.
HEFFNER: Rebecca, we're about out of time. I really appreciate your insight today. Rebecca MacKinnon, Vice President for Global Advocacy at the Wikimedia Foundation.
Thank you for your time.
MACKINNON: Thank you very much.
It's been great.
HEFFNER: Please visit The Open Mind website at thirteen.org/openmind to view this program online or to access over 1500 other interviews.
And do check us out on Twitter and Facebook @OpenMindTV for updates on future programming.
Continuing production of The Open Mind has been made possible by grants from the Alfred P. Sloan Foundation, Angelson Family Foundation, Robert and Kate Niehaus Foundation, Grateful American Foundation, Robert S. Kaplan Foundation, Draper Foundation, and Ploughshares Fund.

The Open Mind is a local public television program presented by THIRTEEN PBS