09.17.2025

He Invented the World Wide Web. Here’s What He Hopes for the Age of AI

In the age of social media, the online landscape is more challenging than ever for civil society. It’s a far cry from what the inventor of the World Wide Web, Tim Berners-Lee, intended to create. He speaks to Walter Isaacson about his new memoir, “This Is for Everyone.”


AMANPOUR: Just ahead of President Trump’s unprecedented second state visit, the U.S. and the U.K. unveiled a transatlantic technology pact, with top tech companies, including Google, Microsoft, and others, pledging some $42 billion in investments, all to develop Britain’s A.I. infrastructure. Meantime, in the age of social media, the online landscape is more challenging than ever for civil society, a far cry from what the inventor of the World Wide Web, Tim Berners-Lee, intended to create, as he lays out in his new memoir, “This Is for Everyone.” He joins Walter Isaacson to explain why he’s still optimistic about artificial intelligence.

 

WALTER ISAACSON: Thank you, Christiane. And Sir Tim Berners-Lee, welcome to the show.

 

TIM BERNERS-LEE: Thank you for having me. It is good to be here.

 

ISAACSON: You know, I teach the history of the digital revolution here at Tulane University, and my students are always surprised that the web didn’t just spring out of nothing, that a real person invented it. And that person is you; you did it 35 years ago. Tell me what it means to have invented the web.

 

BERNERS-LEE: Yeah, and they say they’re amazed that I’m an inventor and that I’m actually alive as well, so they can <laugh> put me in their projects. To understand inventing the web, you have to remember that the internet and the web are different. The internet is the underlying network, which was invented around 1969, and 20 years later that network was all over America, connecting different universities. But the programs you would use on the internet were pretty crummy, and you really had to be an expert to use it. There was no web. So when I was working at CERN, inventing the web meant taking this idea of links, of clicking on links, and combining it with the internet, so that you end up with the concept of a link that could go anywhere.

 

ISAACSON: You were talking about CERN, which is the particle accelerator in Geneva, where you worked, and where you invented it because you had to organize all the information there and make it collaborative. Tell me how that environment led to the web.

 

BERNERS-LEE: CERN’s a great place because people come from all over the world. They have this huge challenge: a huge particle physics accelerator, 27 kilometers long, in a tunnel under the local mountains. So it’s a huge project, with people coming from all over the place, bringing all kinds of different cultures, all kinds of different computers and so on. When I tried to help them put the whole thing together, I found that the state of the information was really tricky. Even though they had some documentation systems, what was crucial was actually inviting people to coffee. There was a coffee area at the intersection of various corridors, and if you stood there long enough, people would walk by, and you could pluck them out of the stream and say, hey, tell me about your stuff. How does it work?

 

ISAACSON: So you were born in 1955, the same year as Steve Jobs, the same year as Bill Gates, and all three of you invented something. You invented a way to navigate the internet, Bill Gates invented the whole software industry, and Steve Jobs invented the notion of an easy plug-and-play personal computer. They become billionaires. You don’t. You put it all in the public domain. Was there something in the environment that made you decide not to commercialize it the way they did?

 

BERNERS-LEE: The point is, the web is a protocol. It’s a standard. It’s not just a program, it’s not a product. When the web works, all of the computers have to speak the same language, and that’s the big ask. They all have to speak HTTP, they all have to speak HTML and so on. That is a huge ask; you can’t also ask for cents per click on top of it. I wanted the web to take over the world. I wanted it to be used by everybody, and if it was going to be used by everybody, it had to be free, or it wouldn’t have worked.

 

ISAACSON: We’ve seen some real results of this horrible, toxic environment you find online. Of course, we’ve gone through a really horrible week this past week, and you write in the book that the web moved away from what you wanted and became a place for a culture of grievance, of hostile activism. You even say people are “harassing people online and threatening or even committing violence” on the social media part of the web. Explain how that happened.

 

BERNERS-LEE: Well, I think part of it is the way the social media systems are designed, the way they write the code inside things like Facebook. They do it in such a way as to keep you on the platform, and that means they make the systems addictive.

And so, yeah, I used to be on Instagram when it ran differently, when I could catch up with my friends and family on it. But then they changed it. Deliberately, the people who built Instagram made it so that you stay on the platform, and they do that by feeding you things which make you feel emotional. And that emotion isn’t always love; it’s often hatred.

 

ISAACSON: Now you’re talking about the algorithms, which amplify things. And you’re saying that the companies amplify things that engage you, which basically means what enrages you. Is there any way to do ethical algorithm design, to make a social media that would unite us more than divide us?

 

BERNERS-LEE: Absolutely. Instead of optimizing for people being angry, you optimize for people being co-creative, constructive. In the book, there’s a two-page map of all the things on the internet, and most of them are good. Remember, there are things like Wikipedia, which are wonderful. The problem is that people get stuck on the addictive things because they’re addictive.
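As a rough illustration of what optimizing for constructive rather than angry engagement could mean in code, here is a minimal ranking sketch in Python. The classifiers, weights, and field names (predicted_engagement, toxicity, constructiveness) are hypothetical stand-ins, not any platform’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # expected clicks/comments, from some model
    toxicity: float              # 0..1, from a hypothetical classifier
    constructiveness: float      # 0..1, from a hypothetical classifier

def engagement_score(p: Post) -> float:
    # The "addictive" objective: rank purely by expected engagement,
    # which in practice tends to reward outrage.
    return p.predicted_engagement

def constructive_score(p: Post, toxicity_penalty: float = 2.0) -> float:
    # An alternative objective: still value engagement, but reward
    # constructive posts and actively penalize toxic ones.
    return (p.predicted_engagement * (0.5 + p.constructiveness)
            - toxicity_penalty * p.toxicity)

def rank_feed(posts: list[Post], scorer) -> list[Post]:
    # Same feed machinery, different objective: swap the scorer to change what rises.
    return sorted(posts, key=scorer, reverse=True)
```

The point of the sketch is only that the objective function is a design choice: the same ranking machinery can optimize for constructiveness as easily as for outrage.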

 

ISAACSON: Well, what if a company can make more money by doing these addictive things, as indeed is true, as you say, of Instagram or X or Facebook and all of them? Should the government, or somebody, try to say you should not do these types of algorithms?

 

BERNERS-LEE: Well, there are places that don’t. Pinterest, for example, is a social media place –

 

ISAACSON: Yeah. But Pinterest has been left behind by all the ones that are more enraging and engaging.

 

BERNERS-LEE: Well, for the ones that are actually deliberately addictive, yes, you could legislate. That –

 

ISAACSON: But didn’t the way you invent the web make it very hard for it to be controlled by governments and central authority?

 

BERNERS-LEE: Yes. But you can still make things illegal, and there are lots of things which are illegal. I mean, fraud is illegal on the web; it was illegal before the web, and it is illegal on the web. There are lots of things where we decided they should be illegal. And some people have suggested that, just as certain drugs are too addictive to be allowed, so you make them illegal, you could do the same thing with certain algorithms.

 

ISAACSON: Well, let me paraphrase what Cassius said to Brutus, which is that maybe the fault is not in the algorithms but in ourselves: people on this decentralized internet are posting what they want to say and looking at what they want to be enraged by, and this is just part of human nature that’s being amplified by the web.

 

BERNERS-LEE: To a certain extent that’s true. But when you see something online that makes you angry, you could blame the person who posted it. But if that post was circulated into a million people’s feeds, then you could also blame the algorithm. So there are both sides.

 

ISAACSON: So in other words, you would try to restrict a company not from allowing posts, which it should do under free speech, but from amplifying them through the algorithm?

 

BERNERS-LEE: Yep.

 

ISAACSON: Do you think government or somebody should regulate what some people would call disinformation and other people would just say is people’s free speech?

 

BERNERS-LEE: In general, I think it’s a very tricky slope. But there are some things which are illegal. Inciting violence, for example, is illegal. So there are things where society has decided they should be illegal, on the web or off.

 

ISAACSON: You talk about the toxic environment of social media all built on the web architecture that you invented. Do you think artificial intelligence, the advent of AI, gives us a chance to hit a reset button on the toxic nature of what’s online?

 

BERNERS-LEE: Yes, I do. Because we’re rethinking everything, AI gives us a chance, as you say, to hit a reset button. When we figure out how AI works, we can insist on, for example, an AI that works for me. I wrote about that a few years ago, and now we’ve made one. Think about who an AI works for: Siri doesn’t work for me, Siri works for Apple, and Alexa works for Amazon. So when Alexa makes a choice, when I say I want to buy some running shoes, Alexa will go through the process of selling me the running shoes just like Amazon does, by operating some sort of market, but Amazon is the one that’s going to gain the most benefit from that transaction. I don’t want an AI that works for Amazon or Apple; I want one that works for me. I wrote that up, and at my company, Inrupt Labs, we’ve actually demonstrated that you can do that. You can make an AI which you trust, which works for you, and because you trust it, you give it access to all your personal data. When you do that, it’s very much more effective at answering questions, at working in your interest.
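To make the contrast concrete, here is a toy sketch of an assistant that answers from its owner’s own data rather than from a marketplace’s incentives. All of the names, fields, and the catalog are invented for illustration; this is not Inrupt’s actual implementation.

```python
# A toy "works for me" assistant: it recommends from the owner's own data
# (size, budget, history) rather than from whichever listing pays a
# marketplace the most. Everything here is hypothetical.

personal_data = {
    "shoe_size": 10.5,
    "past_purchases": ["trail runners (2023)", "road shoes (2024)"],
    "budget_per_item": 120,
}

def recommend_running_shoes(catalog: list[dict], data: dict) -> list[dict]:
    """Pick shoes by the owner's size and budget, cheapest first,
    ignoring whether a listing is sponsored."""
    fits = [item for item in catalog
            if item["size"] == data["shoe_size"]
            and item["price"] <= data["budget_per_item"]]
    return sorted(fits, key=lambda item: item["price"])

catalog = [
    {"name": "Sprint 3",  "size": 10.5, "price": 95,  "sponsored": True},
    {"name": "Trail X",   "size": 10.5, "price": 140, "sponsored": False},
    {"name": "Road Lite", "size": 10.5, "price": 80,  "sponsored": False},
]

print(recommend_running_shoes(catalog, personal_data))
# Road Lite first, then Sprint 3; Trail X is over budget.
```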

 

ISAACSON: When you invented that thing and wrote about it a few years ago, you called them agents, and that’s what they’re being called now: an AI agent that’s going to be my agent, that’ll book my holiday or buy my running shoes if indeed I needed them. How would you make it different from the agents that we’re getting now from our AI systems?

 

BERNERS-LEE: When you set the agent up, you train it to be helpful to the individual. Particularly when it makes a commercial decision, like what car to buy or what holiday or vacation to go on, you make sure that it is working as your agent. The way the agent works is that you have all your data in a Solid data wallet. Solid is the protocol, which I also describe in the book; it is a way of allowing you to have data that you control. When you start an application, it asks you where you want to store the data, and it gives you complete control over that data. So it’s a very different world from the current one.
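For readers unfamiliar with Solid, the core mechanic is that application data lives at a URL the user chooses (their pod) and is read and written with ordinary HTTP. The sketch below assumes a hypothetical pod URL and omits the authentication and access-control setup that real Solid access requires, so it is an outline of the idea rather than a working client.

```python
# Minimal sketch of the Solid idea: the app stores data wherever the user
# says their pod is, using plain HTTP. The pod URL is hypothetical, and
# real deployments need Solid-OIDC authentication, omitted here.
import requests

POD_RESOURCE = "https://alice.example-pod.org/private/shopping-prefs.ttl"

def save_preferences(turtle_doc: str) -> None:
    # Write the document to the user-chosen location in the pod.
    resp = requests.put(
        POD_RESOURCE,
        data=turtle_doc.encode("utf-8"),
        headers={"Content-Type": "text/turtle"},
    )
    resp.raise_for_status()

def load_preferences() -> str:
    # Read it back; the user, not the app vendor, controls who else can.
    resp = requests.get(POD_RESOURCE, headers={"Accept": "text/turtle"})
    resp.raise_for_status()
    return resp.text

# Example usage (would require a real pod and authentication):
# save_preferences('@prefix ex: <https://example.org/vocab#> . <#me> ex:shoeSize "10.5" .')
# print(load_preferences())
```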

 

ISAACSON: Well, in the current world, instead of having this Solid data wallet in which all my data is in my control, we have a web with cookies, where Amazon knows what I’ve done, Apple knows what I’ve done, and they can use that information to market things to us. Isn’t that a good business model, though, one that makes the companies able to thrive?

 

BERNERS-LEE: Yes and no. Targeted advertising is typically more efficient than other advertising. But the whole advertising-based business model on the web, I think, is under threat: people are not using search engines so much, they’re starting to use AI instead. So that’s a bit of a threat to the advertising-based model.

 

ISAACSON: The advertising model on the web exists partly because of the way the web was originally designed by you and others, which doesn’t easily allow small payments, secure transactions, ID. Would we be better off with a web in which I could pay small amounts for things, rather than having it be advertising-based?

 

BERNERS-LEE: We’ve thought that since the beginning; the W3C very early on had a micropayments initiative. So yes, we have to be able to find ways of getting revenue to creatives. Copyright law hasn’t been doing very well. But micropayments as you describe them, being able to set your browser so that it will pay very small amounts of money to the places you read, having that as one of the options for readers who want to pay like that, or who want to patronize the writers, the authors, that way, then sure.
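As a purely hypothetical sketch of the reader-side idea (a browser-level budget that accrues tiny payments to the sites you actually read and settles them later), here is what the bookkeeping could look like. None of this corresponds to the W3C micropayments work or to any real payment API; the class and its numbers are invented.

```python
# Hypothetical reader-side micropayment budget: accrue a tiny amount per
# page visit, capped by a monthly budget, and settle the totals later.
from collections import defaultdict

class MicropaymentWallet:
    def __init__(self, monthly_budget: float = 5.00, per_page: float = 0.002):
        self.monthly_budget = monthly_budget
        self.per_page = per_page
        self.owed = defaultdict(float)   # site -> amount accrued this month

    def visit(self, site: str) -> None:
        # Accrue a micro-amount unless the monthly budget is exhausted.
        if sum(self.owed.values()) + self.per_page <= self.monthly_budget:
            self.owed[site] += self.per_page

    def settle(self) -> dict:
        # At month end, hand the accrued totals to some payment provider.
        payouts, self.owed = dict(self.owed), defaultdict(float)
        return payouts

wallet = MicropaymentWallet()
for _ in range(50):
    wallet.visit("example-news.org")
wallet.visit("indie-blog.net")
print(wallet.settle())   # roughly {'example-news.org': 0.10, 'indie-blog.net': 0.002}
```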

 

ISAACSON: So on the web, people can put up material, they can pirate material, and AI makes that even more so, because AI trains on all of that data. Is there a way for people who create content, who have intellectual property, to copyright it and make that work on a system based on the web?

 

BERNERS-LEE: If you write something on the web, by default it’s copyrighted. So then the question is what happens when an AI uses that. There are a number of problems, but I think in principle what should happen is that if an AI uses your data, which is copyrighted, then you should get some revenue back somehow or another.

 

ISAACSON: And you say somehow or another, but doesn’t there need to be some protocols that would permit that?

 

BERNERS-LEE: Some people think that when an LLM spouts a poem in the style of a given person, for example, you should be able to poke the LLM and find out, okay, which sources did you actually use to make that result? If you can poke an LLM and ask it which sources it used, then you can imagine that when the answer comes from the LLM, the protocol would go back and recompense those sources. But this idea of being able to find out which sources were responsible for giving the answer is, I think, still at the research stage. So you can’t do it yet; maybe it’ll come in the future.
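To show the shape of that flow (and only the shape, since, as he says, reliable source attribution for LLM outputs is still research), here is a hypothetical sketch. The attribute_sources function is a stand-in for whatever attribution method might eventually exist, and the domains and weights are invented.

```python
# Hypothetical "ask which sources shaped the answer, then recompense them"
# flow. attribute_sources is a placeholder for a future attribution method.

def attribute_sources(answer: str) -> dict[str, float]:
    # Pretend we can report which copyrighted sources influenced this
    # answer, with weights summing to 1.0. Entirely invented values.
    return {
        "poet-alice.example": 0.6,
        "anthology.example": 0.3,
        "blog-bob.example": 0.1,
    }

def distribute_royalty(answer: str, fee: float) -> dict[str, float]:
    # Split the fee charged for this answer across the attributed sources.
    weights = attribute_sources(answer)
    return {source: round(fee * w, 4) for source, w in weights.items()}

print(distribute_royalty("a poem in the style of Alice", fee=0.05))
# {'poet-alice.example': 0.03, 'anthology.example': 0.015, 'blog-bob.example': 0.005}
```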

 

ISAACSON: You talk about taking the web back. What are you and your teams developing, in terms of protocols or ideas, that would make the web a little bit less toxic and more useful to ordinary people?

 

BERNERS-LEE: We are pushing digital sovereignty. Originally, when the web was young, anybody could start their own website, and that gave people power: you are sovereign as an individual. That’s digital sovereignty, and we’re pushing it. We’re building new systems where it’s like having your own website: you have a data wallet, or a pod, a personal online data store, one thing that you completely control, and we’re building apps that work with that. We’re building apps which are collaborative, which are a center for collaborative communication, things which give you back your sense of purpose, of being a peer to other people on the web, and give you back an ability to collaborate with them.

ISAACSON: Sir Tim Berners-Lee, thank you for joining us.

 

BERNERS-LEE: Thank you for having me.
