
The “Ensh*ttification” of the Internet and How to Fix It
Clip: 10/24/2025 | 17m 50s
Cory Doctorow discusses his new book "Ensh*ttification" and what's gone wrong with the Internet.
What with the rise in AI-generated ads, pop-ups and reams of information, author Cory Doctorow says, "The internet is getting worse, fast." He joins the show to discuss his latest book examining where it all went wrong and how we can fix it.
Next, to the online world, which can be infuriating, isolating, and downright dangerous with the rise of AI-generated ads, pop-ups, and reams of information. Author Cory Doctorow says the internet is getting worse, fast.
He's joining Hari Sreenivasan to discuss his latest book, examining what's gone wrong and how we can fix it.
Christiane, thanks.
Cory Doctorow, you've got a new book out called "Ensh*ttification: Why Everything Suddenly Got Worse and What to Do About It."
This is now a word that has been in kind of common slang for a couple of years.
Some dictionaries are picking up on it too.
First for our audience, what does it mean?
Well, thank you.
For most of my adult life now, I've worked for a nonprofit called the Electronic Frontier Foundation that does digital rights work.
I've spent most of my life coming up with different metaphors and similes and framing devices for this.
Ensh*ttification is the latest one, and it's done really well.
It's a way of talking about how platforms go bad, but also about why platforms go bad.
So it describes this pattern of platform decay.
First, platforms are good to their end users.
They find a way to lock those end users in and once it's hard for them to leave, they make things worse for them in order to make things better for business customers who also get lured into the platform.
Once they're locked in, the platform withdraws all of the value from those sellers as well and eventually it's just a pile of ****.
Eventually the platform is fully decayed: end-stage ensh*ttification.
But the more interesting thing is the questions it raises and the answers it proposes for why it's happening now.
Walk us through examples that people would be familiar with.
I mean, you spend quite a bit of time on Facebook, and there's about 3 billion people on the planet that know what that is like.
So, walk us through how this process has played itself out on something we're familiar with.
Yeah, Facebook started with a very attractive proposition.
They went to people who were using MySpace, which was the big social media platform of the day, and they said, "We will never, ever spy on you."
And so people piled into the platform, they identified the people who mattered to them, they got a feed consisting solely of the things they asked to see, but they also locked themselves in.
They locked themselves in through something economists call the collective action problem, which you may know as the problem of getting the six people in your group chat to agree on what board game you're gonna play this week and what movie you're gonna see, only when it's a couple of hundred people on Facebook and some of you are there because that's where the people with your rare disease, or where you meet with the people who live in the country you left behind, or how you find your customers, or your audience, or just plan the Little League carpool.
It can be really hard to go.
And so once Facebook knows that you can't leave anymore, they start phase two of ensh*ttification: making things worse for you to make things better for business customers.
So they go to the advertisers and they say, "Hey, do you remember when we told these suckers that we weren't going to spy on them?"
Obviously, that's a lie.
We spy on them every hour that God sends.
We have these incredibly detailed non-consensual dossiers on them.
And if you give us remarkably small sums of money, we will target ads to them with incredible fidelity.
They go to the publishers and they say, "You remember when we told these people we were only going to show them things they asked to see?"
Obviously, that's a lie too.
We will cram stuff into people's eyeballs.
All you need to do is put excerpts from your own website on Facebook with a link back to your own website so people can click on it and we'll just show it to people who never asked to see it.
So they get locked in; they become dependent on us.
We know a lot about monopoly in our daily lives.
We think a lot about what happens when there's just a few sellers.
But it's actually just as bad when there's just a few buyers.
So you get this monopsony lock-in, right, where Facebook has control over its sellers, and it makes things worse for them.
We see ad prices going through the roof.
We see ad targeting fidelity going through the floor.
We see ad fraud exploding.
Publishers had to put more and more of their content on Facebook.
You had to put so much that there was no reason to visit your website.
And of course, no one was going to because if you put a link to your own website on Facebook, they wouldn't show it to anyone because maybe that link was a malicious link.
And so we end up with this kind of deadlock where we are holding each other hostage.
The businesses are held hostage by us.
The amount of content in our feed that we want to see has dwindled to a kind of undetectable homeopathic residue.
And the void has been filled with things that people are being ripped off to show us.
And in that equilibrium where all the value has been taken by Mark Zuckerberg and his shareholders and executives, we are all one hair's breadth away from leaving.
So I wonder, one of the things that people are going to hear you describe this and say, why doesn't the market just fix this?
Isn't there a better mousetrap somewhere else?
Won't people just vote with their feet?
Yeah, so you know, we used to have mechanisms that punished companies for being bad to us.
And one of them was competition, right.
And so, you know, there was a time when people were exodusing from Facebook at speed and running to a new startup called Instagram.
You may recall that Facebook then bought Instagram for a billion dollars at a time when it only had 12 employees.
And what's really interesting about that is that as little as we were enforcing antitrust law in those days, there was one thing that we still said was illegal.
And that was to buy a company in order to reduce competition.
And Facebook's CEO Mark Zuckerberg sent an email to his CFO where he defended buying Instagram for a billion dollars, even though it only had 12 employees.
And he said people prefer Instagram to Facebook.
They don't come back.
If we buy Instagram, we can recapture those users.
That's as much of a confession of guilt as you could ask for.
And yet the Obama DOJ waved that merger through, just like all of the G.W. Bush and Clinton DOJs.
Everyone since Reagan has waved through pretty much every merger, except for four extraordinary years under Biden.
We haven't had a new privacy law since 1988.
The last privacy law America got out of Congress for consumers is the Video Privacy Protection Act of 1988.
It's a law that makes it illegal for video store clerks to disclose your VHS rentals.
That's the only technological threat you can expect to be protected against since the 1980s, since Die Hard was in theaters.
So Facebook can spy on you in all these ghastly ways and do bad things to you.
All of the mechanisms that used to punish companies for being bad to you, new technology and interoperability, a strong workforce that cared about users, competition, regulation, all of those things were systematically dismantled.
And when you take away the forces that punish people for harming you to help themselves, well, you should expect that the people who are in a position to harm you to help themselves are going to go ahead and do it.
You write about how the erosion of our antitrust frameworks and laws has created this kind of technofeudalism.
Describe what that is and who benefits.
This is a term I got from my friend and colleague Yanis Varoufakis, who used to be the finance minister of Greece.
And Yanis wrote this book, Technofeudalism, where he tries to draw an important distinction between profits and rents.
So profits are money that you get for making something that people like.
Rents is money that you get for owning something that people need to make things.
And so rents, you know, it's passive income.
It's owning a factor of production.
If you're Amazon and you own the platform, then everyone who wants to sell to the American public has to pay you rents of 45 to 51 cents out of every dollar they bring in.
About 90% of affluent households in America have Prime; they've prepaid for shipping a year in advance.
There's no reason to shop anywhere else.
And indeed, if they find what they're looking for on Amazon, they don't shop anywhere else.
And so if you want to sell anything to Americans, you have to be on Amazon, and you have to give Amazon 51 cents out of every dollar you bring in.
Apple and Google take 30 cents out of every dollar that we spend in an app.
So if you're supporting a performer on Patreon, or if you are donating to a news agency, or if you're buying music or ebooks or audiobooks or movies, 30 cents out of every dollar is being captured by these tech platforms.
51 cents out of every advertising dollar is being captured by Google and Meta.
And so that's money that publishers and advertisers don't get to keep.
So these rents are the characteristic of feudalism, right?
The difference between feudalism and capitalism, the thing that changed to turn feudalism into capitalism, was the transformation of an economy based on rents to an economy based on profits.
Now, both of these have their problems.
Neither of them were particularly great for workers.
But capitalism was much more productive than feudalism.
I want to give you an opportunity to examine the business model that Uber has ushered in, because there's so many other companies that are Uberifying whatever their vertical or whatever their market niche is.
I mean, because we talked about Facebook, we talked about Google, and it seems like a slightly different type of business, but it kind of follows the rules that you lay out in this book of how these platforms decay.
Yeah, so, you know, Uber lost 31 billion dollars over 13 years, mostly Saudi royal money.
They got it from a venture capital fund called SoftBank, the same people who gave us WeWork and who are now backing OpenAI.
So the Saudis subsidized 40, 41 cents out of every dollar of all of our taxi rides for more than a decade.
All the other cab companies go under.
We have a lost decade in transit.
And as a result, when Uber starts to raise prices, and they've more than doubled them now, and when they start to cut wages, and they've more than halved them now, we're often without any other alternatives.
They are able to make a lot of money from this.
They're redeeming those discounts they offered in the early days.
Uber has really digital, high tech ways of changing the wages and prices that they pay and offer.
So this is something called algorithmic wage discrimination; the term comes from a legal scholar called Veena Dubal.
What Uber does is periodically offer drivers a slightly lowball offer, a little less per mile, a little less per minute.
If the driver takes the bait, then the offer goes down again a little while later, and it goes down again a little while after that.
The idea here is, in the manner of a boiling frog, to get that driver to abandon all the things that used to let them be picky about which Uber rides they would take.
And as a result, you have the steady erosion of these wages.
Now that's something that's spread to other fields where you have contractors: Uber misclassifies its employees as contractors and can get away with paying them different wages for the same work.
That's also true in fields like nursing, where hospitals preferentially hire nurses as contractors, not as staff; it's how they do union avoidance.
And it used to be that if you were hiring a contract nurse for the day, you'd do it with a staffing agency, and that would be someone local.
These days, there's four giant apps, each of which bills itself as Uber for nursing.
And because we haven't had a new privacy law since 1988, these apps, before they offer nurses a shift, can go to a data broker and find out how much credit card debt the nurse has.
And the more credit card debt that nurse is carrying, the lower the wage they are offered.
They are imputing financial desperation and charging them a premium as a result.
This uberization is spreading to other labor markets, and it is connected to a lack of competition, a lack of regulation, the unique characteristics of digital, and the fact that IP law stops nurses from twiddling back, from changing the way this stuff works.
If you're a nurse or an Uber driver or someone else being paid with an app, then in the absence of the IP laws that stop you from buying an app that intervenes, you could get an app that says things like, "Okay, all the nurses in this town are going to refuse shifts below a certain wage," or "All the drivers in this town are going to refuse rides below a certain wage," and it could calculate in an instant how much you're being paid by the mile and by the minute, which is something that can be really hard to do if you're a person and you've got 10 seconds to decide whether you're going to take a ride.
There are so many ways that people could push back with technology.
They've all been taken off the table.
And so what we have is infinite flexibility and technology to exploit and harm you and zero flexibility and technology to defend yourself from exploitation and harm.
So let's pivot a bit: what does your vision of a good internet look like?
What are the kinds of interventions that we would need to do to build that?
I mean, because I think a lot of times the onus gets thrust back upon us.
Oh, well, you know, if you just vote with your dollars, if you just change your behavior, whatever.
As you described, look, a lot of times we don't really have that much of a choice.
But on the policy front, are there things that we can do to try to regain some of this control?
I believe in systemic changes.
And you asked about which policies would make a difference.
Well, you know, privacy law would go a long way.
There are lots of different people who are angry about the privacy situation in America.
And if we could get them all to start pulling in the same direction, boy, could we ever make a difference, right?
And the answer to this is a federal privacy law with a private right of action.
And we are long overdue for it.
That would be a very big one.
And I think it's a relatively easy lift.
The other stuff's a little harder.
One thing that I go into some detail on in the book is how to think about a policy that is administrable, right?
So we can imagine lots of things we don't want companies to do, but figuring out whether they're following those rules is really hard.
If we say, okay, we have to stop platforms from allowing harassment and hate speech.
Well, you have to agree on what hate speech is, you have to investigate something that someone has called hate speech and see whether it is hate speech, you have to decide whether the company did what it could to stop it.
This is like a multi year process for something that happens 100 times a minute on a platform like Facebook.
And so it's just not a great answer.
Meanwhile, if you ask yourself, why do people on Facebook who are bombarded with hate speech and harassment stay on Facebook?
Well, the answer is that they don't want to leave their friends, right?
And so those people stay.
So why don't we make it easier for them to leave?
You know, Mark Zuckerberg gave you a scraper that would let you leave MySpace but still see the messages waiting for you on MySpace.
We could reverse engineer apps to do that.
We could use scrapers to do it.
And we could also mandate, you know, through policy that firms do it.
So we could make it so that if you left Twitter or Facebook and went to Blue Sky or Mastodon, that you could see the things people were posting for you on the platform you used to belong to, and they could see the things that you posted in reply.
And that would mean that if the company didn't treat you well, you could leave.
That is how you vote with your feet and vote with your wallet.
But in order to do it, we need policy that makes it possible.
As to how you get involved in policy, I mentioned a few times that I work for this digital rights nonprofit called the Electronic Frontier Foundation.
And there are grassroots groups that work on everything from abortion privacy and limiting the use of digital tools in ICE raids, to limiting facial recognition, to demanding that publicly procured tools and software can be independently repaired and audited.
And there's a lot of room to do work even under this federal administration at the state and local level.
Yeah.
You know, I did want to ask, look, a company like Uber or Google, they'll hear what you have to say maybe, and they say, look, this is the free market.
We have built a product that's successful enough that people, of their own free will, have weighed the costs and the benefits, and they're coming to us and doing business with us because we provide them a service that's worth it.
What's wrong with that?
Well, I think if that were the case, it would be great, but that's not the case.
I mean, if you want to choose someone else's app store for the iPhone, you have to commit a felony punishable by a five-year prison sentence and a $500,000 fine.
If these guys want to be creatures of the free market, then they should stop using state intervention to prevent new market entry and to prevent end users from getting a better deal from installing privacy blockers in their apps and so on.
You know, the rules against reverse engineering have been enormously beneficial to these companies that make proprietary platforms like our cell phones.
You know, if you're a web user, chances are you've installed an ad blocker.
51% of web users have installed an ad blocker.
It's the biggest consumer boycott in human history.
No one's ever installed an ad blocker for an app, because reverse engineering the proprietary platform that the app comes along with is a felony under Section 1201 of the Digital Millennium Copyright Act of 1998, and it carries a sentence of a $500,000 fine and a five-year prison sentence for a first offense.
And so if they want to be creatures of the free market, well then let them give up the power to invoke the state to prevent people from deciding how their own property works.
I'm not the world's biggest advocate for markets as the best way to organize everything, but the one thing that people who believe in markets should believe in is that private property is sacrosanct.
And when you buy your phone, the fact that it would make Tim Cook sad if you use someone else's store is his problem, not your problem.
The book is called "Ensh*ttification: Why Everything Suddenly Got Worse and What to Do About It."
Author Cory Doctorow, thanks so much for joining us.
Thank you, Hari.
It was a real pleasure to be on.
