04.28.2025

Sextortion, Suicide, Drug Dealing: How Social Media Is Harming Our Teens

Susan Glasser, a staff writer for The New Yorker, on Trump’s first 100 days. Ambassador Dennis Ross on his most recent book “Statecraft 2.0.” Abortion law expert Mary Ziegler on her new book “Personhood.” Filmmaker Perri Peltz and lawyer Matthew Bergman on their new documentary “Can’t Look Away.”


BIANNA GOLODRYGA, ANCHOR: Now, with every new digital advancement comes a new way to impact the lives of children. The latest: Meta’s A.I. digital companions. Founder Mark Zuckerberg says it’s the future of social media. But now a new documentary, “Can’t Look Away,” exposes the real-life consequences of these kinds of technologies. Filmmaker Perri Peltz and lawyer Matthew Bergman join Hari Sreenivasan to explain.

(BEGIN VIDEOTAPE)

HARI SREENIVASAN, CORRESPONDENT: Bianna, thanks. Perri Peltz, Matthew Bergman, thank you both for joining us. Perri, let me start with you. This is a documentary called “Can’t Look Away.” You’re the director of this film. Why’d you want to make it?

PERRI PELTZ, DIRECTOR, “CAN’T LOOK AWAY”: It’s a great question, because when the topic was first proposed to us, I think, like so many people, you think you sort of know that social media is bad for kids. What we learned through the discovery process, before we started filming, is that it’s more than that. These are social media platforms, big tech platforms, that know what they are doing, and they are designing these platforms to keep our kids on as much as possible. And in the process, they are really exploiting our kids, and they’re doing it with intent so that our kids stay on them. We felt that was a documentary that needed to be made.

(BEGIN VIDEO CLIP)

UNIDENTIFIED FEMALE: How many more children have to die because Snapchat chooses its profits over safety?

UNIDENTIFIED MALE: I found my son’s lifeless body due to fentanyl poisoning. They have the best distribution system in the world, and nobody has stopped them.

UNIDENTIFIED FEMALE: Our children are the casualties. We need to take back the power from these companies.

UNIDENTIFIED MALE: They know the levels of addiction. They know the levels of suicide. They’re not showing our kids what they want to see. They’re showing them what they can’t look away from.

(END VIDEO CLIP)

SREENIVASAN: Matt, you started the Social Media Victims Law Center. What’s the mission of your organization?

MATTHEW BERGMAN, FOUNDER, SOCIAL MEDIA VICTIMS LAW CENTER: The mission is to hold social media companies legally accountable for the carnage that their platforms are inflicting on young people, not just in the United States, but throughout the world. They have designed platforms that they know are addictive to young people. They have prioritized engagement over safety. And the children and the families that you see in the documentary are the victims of these deliberate design decisions. And we aim to hold them accountable and get them to change their behavior and become more responsible corporate citizens.

SREENIVASAN: Perri, while Matt might work with thousands of families, you really focus in on a few to tell the kinds of stories about the types of harm that are happening. One of the stories you tell is about a young boy named Jordan DeMay. Can you summarize a little of that story for our audience?

PELTZ: Yes, absolutely. Jordan DeMay was a high school junior in Michigan, the captain of the football team, the homecoming king. He had dropped his girlfriend off at home one evening, came home, and was on Instagram, sort of swiping through, when he was DM-ed by what appeared to be a young girl, probably 9th or 10th grade. Eventually, they were sharing explicit photos. And very soon thereafter, what appeared to be this young girl said to him in a DM, you’ve got three hours to produce a thousand dollars. And he panicked, full on, didn’t understand what was going on. I’ll jump through the story, but eventually what happened is he came back with $300, and she threatened to send that picture out to the football team, to the coach, to the parents, to the girlfriend. And he was dead an hour later. He found a gun and he killed himself. It turns out it wasn’t a young girl; it was two men who had created this profile. And this is what’s happening, unfortunately, on Instagram. Meta is aware of this, aware of what’s happening, and there’s just not enough being done to prevent it. He was dead within hours.

SREENIVASAN: So, right after this story about Jordan DeMay became public, Meta removed 70,000 accounts linked to sextortionists. What does that say to you, Perri?

PELTZ: That it’s widespread. And it really gets back, Hari, I think, to a point that we’re trying to make: the big tech companies, these social media platforms, would like you to believe that these are bad kids, that these are edge cases, that these are parents who aren’t keeping a watchful eye on their kids. We are here to say that is not the case. This is another page out of the opioid playbook, out of the tobacco playbook. This is happening widespread, and these companies are aware of it. It is nothing short of a public health crisis.

SREENIVASAN: Matt, given that you have seen these patterns from so many different families, have you been able to figure out how it is that social media companies are serving up this content? What is preferenced, so to speak, in the algorithm that decides what teens see, whether to allow direct messages, which individuals to suggest they follow, et cetera?

BERGMAN: Well, they do that because their profit model is based on maximizing engagement. The more time a child spends online, the more advertising they can put in front of their face. So, they focus on showing kids not what they want to see but what they can’t look away from. That’s why the documentary has that title. And from a standpoint of neuropsychology, content that is psychologically disturbing is more attractive than content that is benign. The other thing they do is take advantage of the social psychology of young people, adolescents, who crave the adulation of their friends. And so, through gamification, through the like feature or through Snap streaks, they create a situation where kids measure their self-esteem based on how many likes they get on a posting, what their Snap score is, or how long their Snap streaks are. And again, this is not an accident or a coincidence; this is a deliberate design decision that takes advantage of the undeveloped nature of the adolescent brain and the immature nature of the adolescent psyche. And as a consequence, kids are in the midst of the worst psychological crisis that we’ve seen in many years.

SREENIVASAN: In response to the film, a spokesperson for Meta, the company behind Facebook, Instagram, and WhatsApp, said: We know parents are worried about their teens having unsafe or inappropriate experiences online, and that’s why we are significantly changing the Instagram experience for tens of millions of teens with new teen accounts. These accounts provide teens with built-in protections to automatically limit who’s contacting them and the content they’re seeing. And teens under 16 need a parent’s permission to change those settings. We’re also giving parents more oversight over their teen’s use of Instagram, with ways to see who their teens are chatting with and block them from using the app for more than 15 minutes a day or for certain periods of time, like during school or at night. Matt, does that work?

BERGMAN: It’s a start. And I will say the only reason they made these changes is because we’ve been going after them for two and a half years in courts of law. This really is evidence to me of why the civil justice system is essential. I’d say it’s a step in the right direction, but it’s still a baby step. As long as kids are able to self-report their age, it doesn’t do any good. Meta and the other platforms have the technology to estimate a user’s age regardless of the self-reported age, and yet they are unwilling to use it. So, to some extent, they’re turning a blind eye; it’s a fig leaf. To another extent, these are some significant changes. I don’t want to diminish anything that makes these platforms safer; even a small step is a step in the right direction.

SREENIVASAN: Matt, the other critique that the social media companies come back with is, look, we are not legally liable. There’s a law on the books, the 26 words that kind of define the internet, called Section 230. It’s been on the books since the mid-’90s, right? And they’ve been shielded. For our audience, what does Section 230 say, and where do you think the line should be drawn?

BERGMAN: Well, Section 230 was drafted in 1996, when Netscape was the largest platform, social media didn’t exist, Google didn’t exist, Mark Zuckerberg was in junior high school, and the internet was in its infancy. What it essentially does is shield companies from liability for third-party content posted on their platforms, which, back then, were bulletin boards. Unfortunately, it had been broadly interpreted, until we started doing our work, to shield social media companies from any kind of liability. In other words, every other company in America has a duty of reasonable care, and our position is that social media companies should have the same duty that any other company has. Basically, follow the Golden Rule. So, what we have been doing is focusing on the design of the platform, not the third-party content, and showing that these products are dangerous by design. And thus far, these arguments have permitted us to move forward, commence discovery, and get these cases ready for trial. And look, we’re pro-business, we’re pro-capitalist; we just want social media companies to follow the same rules that every other company does. If an auto manufacturer could save $50 million a year by putting bad brakes on its cars, it’s not going to do that, even if it’s not a socially responsible company, because it’s going to be subjected to liability. We want social media companies to make the same calculus: we could have an endless-scrolling mechanism, and yes, we would make more money, but we would addict more kids and incur liability, so we had better not do that. We just want that same feedback loop that every other responsible company has to apply to social media.

SREENIVASAN: Matt, you’ve worked on asbestos litigation before, and you’ve said that this is similar. Explain that.

BERGMAN: Yeah. I think it’s similar to asbestos because it’s a ubiquitous product. It’s similar in that you have a public health crisis: just as two generations of workers developed illness from being exposed to asbestos, you have an entire cohort of young people suffering adverse mental and physical effects. You also had, in the case of asbestos, outrageous corporate misconduct, where companies knew that their products were going to hurt people and designed them anyway. Here, as you’ve seen from the Facebook papers and some of the other revelations, these companies know that their platforms are harming kids and deliberately decide to keep making them. And finally, you have some creative legal enterprise, some creative legal innovations, that have provided a pathway for recovery. So, we think it’s very similar to asbestos. The only thing I would add is that, having spent 25 years suing asbestos companies and now three years suing social media companies, the corporate misconduct of the social media companies makes the asbestos companies look like a bunch of choirboys.

SREENIVASAN: Perri, one of the characters in your film is a woman named Amy Neville, whose son Alexander died of an overdose after taking a pill he allegedly bought on Snapchat. Can you tell me a little bit about her and what her mission is now?

PELTZ: You know, as you know so well, our stories can only be brought to life by the people who have experienced these terrible harms and are willing and brave enough to share them. And Amy Neville is certainly one of those people. In short, her son Alexander bought a pill online, it was laced with fentanyl, and he overdosed. Many people have said when they watch the film, well, kids shouldn’t buy drugs online. And we all hope that our kids aren’t going to buy drugs online. But they are also kids, and kids experiment. And the question that I always like to raise is, do they deserve to die because they made a mistake and bought drugs online?

(BEGIN VIDEO CLIP)

AMY NEVILLE: We had all these safeguards in place. And yet, we still ended up here. Some things I want to point out: you know, we think kids are partying and taking drugs, and that’s where these things are happening. But Alex died right down the hallway from me. You say goodnight and you think your kids are safe, and it’s just not the times that we’re living in anymore. We give our kids these smartphones, we let them have these apps, and that is the equivalent of dropping them off in the worst neighborhood in our area and saying, good luck tonight. I’ll see you later.

(END VIDEO CLIP)

PELTZ: And what Amy is trying to do, in my opinion so bravely, is really point out that this is not the fault of these kids, and that parents need to be aware of what is happening on these platforms: drug dealers are coming out and seeking kids in a way that we are not used to. It gets back to the title of the film, “Can’t Look Away.” It’s not that they’re necessarily asking for this; it’s what is being served up to them.

SREENIVASAN: You know, the global head of platform safety at Snap had, in part, this to say: We’re deeply committed to the fight against the fentanyl epidemic, and our hearts go out to the families who’ve suffered unimaginable losses. We have invested in dedicated safety teams and advanced technology to detect and remove illicit drug-related content, work extensively with law enforcement to help bring dealers to justice, and continue to raise awareness and evolve our services to help keep our community safe. Criminals have no place on Snapchat. Matt, you are suing Snapchat over a fentanyl overdose from drugs purchased on the platform, and a judge ruled that the case can move towards discovery. What does that mean, and why is it important?

BERGMAN: Well, it’s important because the statement you just read was false. Snapchat has known about the fentanyl epidemic arising from use of its platform for over five years and has done little, if anything, about it. Snapchat has thwarted law enforcement. Snapchat has failed to crack down on drug dealers that it knows are selling fentanyl on its platform, to the point where one of our clients reported a drug dealer who had sold her son a fatal dose of fentanyl-contaminated drugs. Eight months later, that same drug dealer sold another child fentanyl-contaminated drugs, and that child also died. So, what the court ruling allows us to do is our work as lawyers and investigators: to show that Snapchat has allowed its platform to be used as an open-air drug market and has done little, if anything, meaningful to curtail that.

SREENIVASAN: Perri, what do you want parents to know, and to do, about this?

PELTZ: That’s a really difficult question, and I wish there were a better answer to it. But really, what we see is that parents need to be aware of what’s happening so that they can have more engaged conversations with their children and make sure their children understand that if something goes wrong, they can come to their parents. So, if it’s a sextortion case, the embarrassment doesn’t matter; they need to come talk to their parents or a trusted adult. Conversation is really, really important. And additionally, there should be as much control over devices as you can possibly have. There’s a big push right now, Wait Until 8th, and some are trying to delay it even more than that, and to get schools to really control and keep these devices out of the school system. There’s not a tremendous amount that parents can do other than have conversations and control their kids’ use of these devices. But the best thing we can say is it has to be regulated. That’s ultimately where the answer lies.

SREENIVASAN: Matt Bergman, founder of the Social Media Victims Law Center, and director Perri Peltz. The documentary is called “Can’t Look Away.” Thanks so much for joining us.

BERGMAN: Thank you.

PELTZ: Thank you very much.
