Connections with Evan Dawson
Is suing social media about to become more common?
4/10/2026 | 52m 17s | Video has Closed Captions
Verdict vs Meta & YouTube may reshape laws on addictive social media.
A recent verdict holding Meta and YouTube liable for a child’s addiction could mark a legal shift. It signals growing willingness to treat platform design as harmful, like tobacco or opioids, potentially opening the door to stricter regulation and more lawsuits over user harm.
Connections with Evan Dawson is a local public television program presented by WXXI
>> From WXXI News.
This is Connections.
I'm Evan Dawson.
>> Our connection this hour was made in California with a girl named Kaylee.
When she started to use social media, Kaylee was six years old.
She said she turned to apps like Instagram and YouTube throughout her childhood as creative outlets, and to try to escape bullying at school.
She used filters on Instagram, for example, to hide her insecurities.
14 years after she began using the platforms at the age of 20, Kaylee sued four of their parent companies.
According to The New York Times, she sued Snap, YouTube, TikTok and Meta, which owns Facebook and Instagram, and whose platforms, she says, caused her body dysmorphia and even thoughts of suicide.
Now, that was two years ago.
Just over two weeks ago, Kaylee won her case in a landmark decision: jurors ruled that YouTube and Meta harmed Kaylee by creating addictive features that led to her mental health issues.
And now the companies must pay several millions of dollars in damages.
Kaylee's case was one of thousands filed against social media companies, and it was a major win for families concerned about the products' effects on kids.
During the trial, parents of kids whose deaths were related to the use of social media gathered outside the courthouse, waiting for the verdict.
Both YouTube and Meta say they disagree with the decision and plan to appeal or review other legal options.
But now, with a spotlight on the issue, could this be social media's big tobacco moment?
Does it create a precedent for more lawsuits where social media companies can be deemed liable for personal injury?
Will it pave the way for regulation, or even banning the use of certain platforms for children like in Australia, which just barred kids under the age of 16 from using social media?
And perhaps one of the biggest questions will the ruling force social media companies to change what they are doing?
Our guest this hour will have a lot to say, and he's going to help us understand the legal sort of layers to this.
Scott Malouf is an attorney who has worked in this section of the law, and it's nice to see you back here.
Thanks for being here.
>> Thanks for having me back.
>> I want to say a couple of things at the start, and I want to tell the audience, I really want to hear from the audience this hour.
Do you feel addicted to social media?
Do you worry about your kids being addicted to social media?
And if the answer is yes, do you feel that the responsibility is shared with parents and tech companies who want to addict us?
Do you think it's all on the tech companies?
Did you cheer this result, this verdict against YouTube and against Meta?
Or do you think differently?
My bias is, at first I was cheering full-throatedly, Scott, because in a moment I'm going to play a soundbite that sort of encapsulates a lot of how I feel about this.
However, I'm close to a free speech absolutist, and I understand that while it feels to me like social media is the cigarettes of our age, we know it's bad for us.
We use it anyway.
It's not a perfect parallel because you get into speech issues that don't exist with big tobacco.
So when you hear social media is the cigarettes of our age, what do you think?
>> Um, first of all, Evan, I'll just put a disclaimer in that my views are my own.
They're not those of my employer, of anybody I know, and definitely not of my family.
So.
>> Okay.
No problem at all.
>> Um, it's really interesting you say that about the big tobacco moment, because one of the things that I thought about was when the tobacco settlements happened, I think that was somewhere in 97, 98.
It really affected smokers.
It affected maybe secondarily family members of smokers, all of us seeing more and more bars and restaurants get rid of smoking.
The difference here is that this impacts every single one of us.
And going back to your speech point, if I am a social media company, or if I'm even any kind of internet provider, I am looking at this and saying: is every design choice I am making going to be evaluated by a jury, by a judge, by a state legislator?
And do I even want to be in that business anymore?
That's the real question.
The difference was cigarettes were a select group, and yes, it impacted the general public.
But this affects every single person using the internet.
And we, the users, are not really involved in this discussion at a very high level, the way the jurors are, the way the judges are.
>> A two, three, $4 million verdict is not a huge number for these massive companies, but these are not the first lawsuits.
And there could be more.
Right?
>> Right.
There are many, many lawsuits.
So if we just break it down at a very high level: one, Kaylee's lawsuit was part of a group of lawsuits in California state court.
And I believe there are about 10,000 plaintiffs in those cases.
I'm ballparking.
And then in California, there is also federal litigation, which involves many hundreds of people, as well as states involved in that litigation.
And then finally, there are state attorneys general, like New Mexico, whose decision came very close in time to the California decision, that have also brought cases.
I think there are about 41 state attorney general cases out there.
So the answer is yes, there's a lot.
And add on regulation, both by state legislatures, by the feds and internationally.
You mentioned Australia as a good example.
>> So I want to hear from the audience this hour.
You can email the program, as always.
The email address is Connections at wxxi.org.
Did you cheer this verdict?
Do you feel addicted?
Are you worried about your kids getting addicted?
And do you think this will help or do you oppose the verdict?
Or do you think, look, people use social media at their own risk and their own pleasure, and that's up to them?
And then it's also up to the parents to regulate kids' usage.
Again, it's Connections at wxxi.org.
You can call the program toll-free at 844-295-TALK.
It's 844-295-8255.
263-WXXI if you call from Rochester: 263-9994.
And yes, we're streaming live on YouTube.
You can join the chat there, which feels a little weird this hour.
I want to listen to a piece of sound that celebrated this verdict, and it comes from someone who has probably done more than any human on the planet to change how we are regulating our phone use in schools, social media, et cetera.
It's the social psychologist Jonathan Haidt.
He's the author of The Anxious Generation, among other pieces of work on this subject.
He's the one who's been working with a lot of governors to get the bell-to-bell cell phone ban that New York State now has.
And this was his reaction, talking to CNN's Fareed Zakaria after the verdicts.
>> I've had two conversations with Mark Zuckerberg, 1 in 2019, 1 in 2020.
In both of them, I said, Mark, there's a lot of emerging evidence of harm.
And he argued with me and he said, no, it's just correlational.
Well, now we know, because thousands of parents are suing Meta because their kids are dead or severely damaged.
And so much information has come out in the lawsuits.
My team at Anxious Generation gathered reports of 31 studies conducted within Meta, because they want to understand children and adolescents: what makes them keep attention, what makes them stay engaged, is the word.
And now we know that beginning in 2018, they themselves found correlational evidence that the kids who are on a lot are in bad shape.
They ran their own experiment where they had people get off for a week and their mental health improved.
They did deep research into brain development.
So my point is, if you go to Meta's internal research.org, you'll find 31 studies that we organized.
And these are all internal meta studies.
So they knew well that they were addicting kids.
The program was designed to be addictive.
Originally, Instagram was designed to be addictive.
They know that they're addicting kids.
They reward their engineers for increasing engagement.
But some of them actually literally use the word addiction.
And so we know that they did this on purpose.
And now finally, for the first time in all of their history, they are facing a jury and they are going to be held accountable.
There are, I believe, thousands of kids dead who would not be dead if they had not used Meta's products.
And for the first time, some of those parents are going to get justice.
>> That's Jonathan Haidt there.
First of all, what stands out to you again?
The reason I chose this clip is my immediate reaction was almost word for word what you just heard.
I've changed a little in reading David French and some other people that I'm going to be quoting coming up here, and I don't know where I sit on this.
I suspect a lot of people are torn, but I can understand the compelling emotion in what Haidt is saying there.
What did you hear?
>> You know, I heard the same thing.
It's the emotion.
And I think all of us feel that, the issue being, as you say: these are very large companies, and they don't seem to be operating in the best interest of their users.
But yet we turn around and we need to use them every day.
As you said, we're streaming on YouTube.
I mean, I think that's the conundrum that we all face and that's the challenge and why maybe the jurors decided things the way they did.
They felt like, hey, I'm stuck with this.
I was wondering, the jurors spent weeks, you know, listening to evidence that these platforms are addictive, that they're not well designed.
How long did they stay off their phones once they left the courthouse and the case was done?
>> Oh, that's kind of a an interesting and perhaps dark and telling question.
Um, Haidt says there is enough evidence based on studies; he said, quote, we know this harms children.
So he said it isn't debatable anymore whether kids are being harmed.
Is it that cut and dried to you?
>> I don't think that's fair.
And I think the challenge here is both sides are very maximalist.
You know, you heard Haidt paraphrase something that he'd heard Mark Zuckerberg say to him: we just think that's causational.
>> And correlational.
>> Or correlational.
Thank you.
And then the other side saying, no, no, no, we absolutely know this.
And I think the answer is this is much, much too complex to just point one way or the other.
You know, as we all know, every child is different.
Every person is different, every usage pattern is different.
And it's not easy to just say it's A or it's B. The other thing is, for about the last 5 or 10 years, we have blamed social media for many, many things.
And that happens.
And few people stand up and say, wait a minute, is that really fair?
Like we talk about political divisions and we go, oh, that's social media, is it?
You know, we have to be a little more thoughtful in what we pin on social media and what maybe is something else.
>> Okay, we're going to keep working through this point.
Let's have Alex in Rochester jump on the phone with us first to kind of weigh in on this.
Hello, Alex.
Go ahead.
>> Hey, Evan.
I love calling into the show.
I just wanted to weigh in.
Um, I am personally a recovering addict, and I have gone so far on doomscrolling and social media.
I mean, I grew up with the iPod touch and everybody getting it, and it was so cool.
You know, we're all on Snapchat and stuff.
I've been using it for years, to the point where I now have a flip phone.
I can't have a smartphone because I just sit there and I scroll and I lose time and hours on stuff that I could be furthering my life with.
So I mean, this is this is a real thing.
I know people who are addicted in this way and deal with this in a very, very difficult way.
So I just wanted to weigh in.
It's very real and it's very dangerous for a lot of people.
So thanks for taking my call.
>> Hey, can I ask you a quick question, Alex, if you don't mind, please.
So I number one, I appreciate how thoughtful you are being with your own time and your own sort of mental health by saying, for me, I got to go to a flip phone, like, I can't do it.
Um, what do you make of a jury saying it is so much of an addicting and intentionally addicting product that we're going to sanction companies.
We're not going to put the responsibility on the Alex's of the world or parents.
We're going to put the responsibility on the companies and they're going to pay.
Do you agree with that?
>> I mean, I think it's similar in the way when you talk about cigarettes in this way.
You know, I mean, they know exactly what they're doing.
These things are designed to get people to continue scrolling as much as they can, steal your attention and use it to their advantage.
So, I mean, I think they are incredibly liable.
I don't see how they wouldn't be, you know what I mean?
Of course, I take responsibility for my own time, and parents should not turn their children into iPad kids.
I know Covid made it hard, but I mean, of course they're liable.
How would they not be, is my question.
>> So, Alex, I very much appreciate the phone call.
And so what I'm going to do is I'm going to build on Alex's last point and kick it back to attorney Scott Malouf, who is with us, helping us sort through what this means.
There's an obvious breaking point between the cigarettes analogy and social media, because cigarettes are not speech or cannot be sort of positioned as speech.
But this is where Alex sees the parallel, and this is where I want you to address something here.
What Alex pointed out about cigarettes was: they knew what they were doing.
They knew it would be addictive and harmful.
They tried to hide that, and people got hurt by cigarettes.
Could you not say the exact same thing for social media?
>> You know, Evan, I think the problem is they is doing a lot of work there.
>> Well, the companies, the creators of the product.
>> And so if we look at Meta, if we look at, um, TikTok or others, there's definitely engineering.
In this trial, the Kaylee trial, there were internal documents that said, you know, we're seeing this is how this works.
And this has some properties that keep users on there.
Definitely those things.
But I think the challenge comes in is when we take a step back to your free speech absolutism, how far do we want to go with that?
If we think about the mom and pop company and they're trying to design something.
So think about local elections: you might have, like, a Nextdoor, and you want to talk about what's going on at the Gates school board. Should Nextdoor be treated the same as Facebook?
And I think that's the challenge is, is really trying to figure out where do we make that break.
We've got this one.
And these are the biggest companies.
I think Facebook cleared a $60 billion profit last year.
So in people's guts they feel that's fair.
And they have the information.
But if it were just a local company, is that really the standard we want them to build to when they're not nearly as profitable and as widespread?
>> Okay, Alex, thank you for the phone call.
Um, yeah, I'm going to keep wrestling throughout this hour with this, because, and I take Scott's point that the "they" is doing a lot of work here, but in general, they created something.
They created it to be addictive.
They knew that they did that for a reason.
They knew that it might have a deleterious effect on the people who use it.
They did it anyway, and people got hurt.
Like I think you could say it about cigarettes.
I think you could say it about social media.
I think we should all be on social media less.
Um, but then now let me flip a little bit before I read some more feedback here.
Here's the piece that kind of turned me in a different direction.
This is from David French, a really smart writer for the New York Times.
He he's a he's probably I would call him if you want to put him on the political spectrum, a conservative never-trumpers what David French is, and he's an attorney and he writes, don't cheer too hard for the Facebook verdicts.
He writes the following.
I am alarmed by the negative influence of smartphones and social media on children.
All of us should be.
I am also worried that in our zeal to protect children from those negative influences, we will violate the Constitution and undermine free speech.
And that's where things get tricky.
A social media site isn't a bottle of alcohol or a cigarette.
It's not delivering a drug.
It's delivering speech.
Sometimes that speech is silly and harmless.
Sometimes it is toxic and harmful.
Sometimes it's educational or inspiring.
But it's all speech.
And in America, speech traditionally can be blocked, censored or regulated only in the narrowest of circumstances.
Defamation, true threats, obscenity, child sex abuse material, direct incitements to violence.
Each of those forms of expression can be banned and punished because they are not encompassed within the freedom of speech, protected by the Constitution.
That's it.
That's David French's opening salvo.
And we're going to I'm going to read a little bit more specifically coming up here, but how do you like his framing there?
>> No, I think that's a strong framing.
And then remember, if you put yourself in the position of the companies, even the largest companies, they are dealing with such massive volumes of posts, videos, audio speech that they have to say to themselves, we have to have a system, it has to be organized and automated in a way so that we don't open ourselves up to liability or regulation.
And that is incredibly hard to do at a small scale and at the massive scale they're at, it's very difficult.
So the takeaway is, would those companies say we're just not going to be in the business of anything controversial?
If you have eating disorders and you want to discuss them.
Nope.
We're not going to talk about it.
If you want to discuss any kind of religious issue, we're just going to take that down because it's too tricky.
So then do you end up with speech that really doesn't matter that much and isn't that important?
You know, and I think that goes right to what David French is saying.
>> Yeah.
Um well, let me hit a point that I think is counterintuitive to what a lot of us think.
And it's about the algorithm because the word algorithm is pretty loaded.
Now, most laypeople who don't work in tech understand that algorithms are just designed to hook us, or they're designed to put something in front of us that they think that we want based on our own previous actions.
Fair?
Right?
Right.
Okay, here's what David French says about that.
He says even the algorithm is a form of constitutionally protected speech.
In a 2024 Supreme Court case called Moody versus Net Choice, Justice Elena Kagan wrote for the majority that expressive activity includes presenting a curated compilation of speech originally created by others.
The algorithm, Justice Kagan explained, was comparable to the layout of a newspaper where editors decide which stories to feature prominently, which stories belong on the back pages, and how to make the page attractive and readable so that more people will see the news, end quote.
>> I think her position is very sound when you think about it.
We all engage in design decisions, right?
So you choose who comes on the program.
If I have a bookstore, I choose what books are in the bookstore.
Those are design decisions.
And if someone were to say, you know, Evan, I heard Scott Malouf on your program, that was terrible advice.
It made me drive off the road, and they would want to sue you.
Right?
You know, you'd be like, you can't do that; that just in your gut feels wrong.
You know, you should have had a more calming voice on there.
You know, that would be a terrible claim.
Or I bought a book in your bookstore.
And I think that's what Kagan is trying to express.
Those design decisions are speech.
And when you add on the volume of posts that people are making, they can't do it with a human.
They have to do it algorithmically.
That's putting it all together.
>> The issue I have with the way Justice Kagan frames it, and far be it from me to go toe to toe with a Supreme Court justice, how audacious.
>> They'd be overmatched.
>> Yeah, right.
I want the caveat here: I'm not a constitutional lawyer.
And I understand that she is far smarter than me.
Having said that, here's the issue.
When I read that section on algorithms, she compares it to newspapers where editors lay out a page and they want the page to be appealing.
They are not making a newspaper for every individual reader.
They did not have the ability when newspapers were more widely in use, to say, this street gets this, this household gets that.
This individual gets the other.
They're doing their best to make it broadly appealing.
But that is not individually targeted in a way that can affect individual people.
So I don't know that it's exactly the same thing.
>> Yeah, no, it is definitely a weakness.
And I think if you look at a lot of the technology cases, especially those that have gone up to the Supreme Court, it's very difficult for judges to talk about algorithms and these things in metaphorical ways that make sense for case law but don't make sense, you know, day in and day out.
And the second part is, you know, asking yourself, how do the platforms operate?
If the platforms operate closer to a broadcast network and they're picking and choosing, even if they did it algorithmically with the posts that people are making.
So no longer you and I being friends and they say, Evan posted something, Scott, would you be interested in it?
They say, no, Scott here is like a wood chopping video.
And we know you're going to watch that.
That's closer to being like a television network.
And so maybe the liability should adhere.
And that's the real challenge in what we're seeing in these verdicts and these discussions.
It's a lot of gray area, and how do we do it?
And we're very concerned because it is speech.
It's not like regulating a product.
>> And let me also read what he says he thinks is the weakness in the Los Angeles case; again, there have been a couple of cases now, with Meta and YouTube found liable in social media cases.
And we're talking to attorney Scott Malouf about that.
And I'm taking your feedback throughout the hour, listeners, about how you feel about your own social media usage, maybe your kids' usage, and whether you cheered those verdicts against these social media companies, or if you think they're over the line.
French thinks the Los Angeles case was especially weak.
He says the plaintiff, who started using social media when she was six, didn't claim that she was harmed by unlawful speech.
She wasn't threatened or slandered, for example, but she claimed that social media companies made her addicted to lawful speech and that her compulsive consumption of this lawful speech caused body dysmorphia and triggered thoughts of self-harm.
That lawsuit is one of thousands of similar suits pending across the nation.
There is no question that the plaintiff in this case had a traumatic childhood, but there was a real dispute about whether social media was the principal cause of that trauma.
As The Associated Press reported.
For example, the plaintiff testified that her mother had abused her physically and psychologically.
The jury was asked whether the companies' negligence was a substantial factor in causing harm, not the factor, not the only factor, not the primary factor, just a substantial factor.
And French's conclusion is that that is a weak case, given all the other aspects of the landscape of this girl and now woman's life.
What do you think?
>> Yeah, it's a very difficult situation because you have the plaintiff who admittedly comes with a lot of difficulties.
And let's flip that around.
You're putting, I believe, 12 jurors here, and you're saying to the jurors: here is someone, and you feel for this person.
You hate to see her in this difficult state.
And here is a very large set of companies against her.
And what portion of her current harm is their responsibility?
It's a tough thing, I think, for jurors to to suss that out, especially after they've had to hear experts say, I definitively believe that she was harmed in this way.
And people on the other side say, no, it wasn't the case.
I think it's really, really hard.
And that's part of why we're saying it's not good to regulate speech on these platforms through juries and litigation.
But of the other two alternatives, legislation, you know, lends itself to heavy-handed government, right?
You could see a politician saying, oh, that that platform doesn't favor me.
Let's keep that down.
And then finally, self-regulation by the platforms, which as we started with, they don't feel like they're doing a very good job of saying, am I being helpful and protective of my users, especially the youngest ones?
>> Um, one other point before I grab Jeff in Rochester on the phone, I have an email from a listener who is saying, let's not lose sight of the fact that these are kids we're talking about.
And I want you to address this point.
As an adult, I can smoke a cigarette.
I can go buy a beer, and I can decide that social media is something I should do for 12 hours a day, no matter what it does to my brain, my relationships, my sleep, et cetera.
For children, we regulate differently.
We understand that a child picking up a pack of cigarettes at age 11 is a bad idea, obviously, for their health, but also the addiction factor and a listener saying that's the issue here.
It's not that social media is being made to be addictive to adults.
We have to do better as adults at educating ourselves and controlling ourselves.
Instead, it's targeted to kids, and kids get it, and they are in real trouble when they become addicted, because it's intended to be addictive and their brains are not fully formed like adults'.
And that's why this listener is saying they would not support a verdict against Meta and YouTube if this were adults, but they would support it to protect kids.
What do you think?
>> Yeah, I think the kid issue is really the heart of these cases. Go back to where we started.
You know, when you first heard these verdicts, in your gut you said, yeah, that's right.
Why did you think that?
Probably because you think about walking around and seeing tons of kids, even kids as young as 2 or 3, on a phone.
>> Kids on screens everywhere, kids who can't make eye contact.
>> And as a parent, absolutely.
And as a parent, you see your child on their phone and you ask yourself, am I doing a good job parenting?
So I think that the fact that it's children was a super strong element in favor of the plaintiffs.
And the companies didn't have a good answer, because of the fact that, you know, kids are now programmed, right?
As soon as your kid hits 10 or 11, get me a phone.
All the other kids have phones.
This is how I find out what's going on.
It's a it's a perfect storm for the companies, and I don't think they have a good answer for it because they themselves were found in New Mexico, for example, not regulating underage users, not looking to protect them.
And that's the real question.
That's why I think most people really feel that in their gut.
And then they look forward and they say, do we have people who are in their 20s and 30s who are not going to have good attention spans?
And these kids are on computers at schools?
So have we created a scenario where we're not helping them?
When we say, go read a book and they're like, you didn't give me any books.
You didn't give me any chance to get a book.
>> Yeah.
Lynn in Rochester says, while I'm not excusing Meta, and I think they should be held accountable to a point, why was a six-year-old using social media in the first place?
What about the parental responsibility?
>> Yeah, and again, that's why these cases are so difficult in this regulation is so difficult.
Some parents may say, I will give my child a phone at, you know, ten years old, and the kid doesn't have social media and only watches history videos. Whereas other parents may say, even at 16 or 17, I know what you're going to do: at three in the morning, you're going to get those notifications, and you and your friends are going to be sending messages. You know, Snap, for example: one of the big complaints against Snap has been the Snap streak.
And kids will have a streak where every single day they're posting.
And so it encourages kids.
Oh my gosh, it's two in the morning.
I got to make sure I got that done.
Or they go to summer camp and it's like, hey parents, can you keep my snap streak alive?
And I think in our guts, we go, that's not right.
That's not what we should be doing with our lives.
>> The daily streaks, we saw it with Pokémon Go in my family. Daily streaks are designed to keep you doing something every day, right?
>> But let's flip that around.
Let's say you're trying to get 10,000 steps a day, right?
And your app says, hey, Scott, it's, uh, you know, about 20 minutes to midnight, you've been watching the Bills, and you haven't gotten your steps. Don't I want to get that notification? Or if I'm waiting for a flight and Delta changes the flight, don't I want my phone to notify me that, hey, you maybe need to figure something out?
>> I don't know that that's the same thing here.
>> But they're notifications. If I am Delta, I'm saying, geez, do we face liability because these are coming in? Now, you're right, they're not the same as Snapchat.
But then you start to make the decision much, much harder.
>> All right.
Let me grab Jeff in Rochester who wants to jump in.
Hey, Jeff.
Go ahead.
>> Hi, Evan.
Thank you and your team for the great work that you do.
Um, my thought on this is that I feel for the children and parents thereof, but I'm really concerned about how we are getting a whole bunch of age verification laws popping up around the US and in other countries, and that causes a serious impact to even individuals' workflows.
Um, it disproportionately affects things like open source software where you don't have a centralized team like you would on Windows or Google.
Um, and I just, I really am concerned about the constitutional rights of, um, you know, American citizens being breached and, and how lawsuits like this are just adding fuel to the fire in creation of like, uh, California's AB 1043 and similar laws popping up in New York.
>> Uh, Jeff, thank you for the phone call.
Scott.
What do you make of that?
>> Yeah.
Jeff.
Thanks.
That's a great point.
So I know folks might have had a little tough time hearing Jeff.
The idea of age verification is we will look at the individual on the platform and determine their age.
So if you went to a bar, the bouncer would look at your ID, we're doing something very similar online.
The challenge is you can't just do that for under-16s.
You have to ID everyone.
When you ID everyone, you create a lot of risks.
So for example, am I putting up my, uh, my driver's license?
Is that being stored somewhere?
And if there's a data breach now my driver's license is out and the data breach wouldn't be the company I went to visit.
It's a third party authentication company.
Or is it using technology to look at my face and say, how old are you?
The real risk there is that adults are then being asked to talk about their online activity, verify what they're doing, and it creates a trail that maybe people don't want.
Again, maybe you are looking at content you don't want people to know you're looking at it.
Maybe as as Jeff said, he talked about GitHub and GitLab, which is you're looking at technology.
What if you're just looking for a job and you want to go on and look for jobs and all of a sudden, you know, some way, you know, people start to know you're looking.
So that's why it really impacts adults when we try to protect children, because of this age gating with your face or with your ID.
>> So what's going to happen in Australia?
They just banned social media for those under the age of 16.
>> Right.
And so right now the experiment is going on to say, is there a benefit, right?
Just like our bell-to-bell here in New York.
Are the benefits useful or are we seeing students, you know, get around those limitations and maybe do other things?
You know, maybe they're using their phones to constantly text each other and maybe that's creating harm just on a smaller scale.
>> Okay, so a lot of food for thought here.
I've got more feedback from, from listeners.
And when we come back, I also want to talk about something that that Scott touched on earlier in the conversation about these lawsuits against Facebook, Meta and YouTube, which is the design of the apps, the algorithms, uh, as David French wrote, the trial court in the California case tried to evade the First Amendment by claiming that the cases weren't about content, but they were about design.
Infinite scroll isn't speech.
They said it's a means of delivering speech, but it's not the same thing.
So let's talk about that.
We'll get more of your feedback with Scott Malouf on Connections.
Coming up in our second hour: New York, like many places, is dealing with an energy crisis.
Many households can't pay their utility bills on time.
It has been a devastating winter.
Now there's a war in Iran that has spiked energy costs even further.
So what can the state do?
Governor Hochul has indicated she wants to pull back a bit, but some in the entrepreneurial space say there are many things that could be done now, and they'll tell us what those are next.
>> Support for your public radio station comes from our members and from Mary Cariola Center, supporting residents to become active members of the community, from developing life skills to gaining independence.
Mary Cariola Center, transforming lives of people with disabilities.
More online at marycariola.org.
And Excellus BlueCross BlueShield, providing members with options for in-person and virtual care, creating ways to connect to care when and where it's needed.
Learn more at excellusbcbs.com.
>> This is Connections.
I'm Evan Dawson.
We're talking about social media and kids, but really all of us, and whether the social media companies that are creating products meant to keep us on their apps, meant to keep us scrolling or using, should be responsible if people's mental health suffers, if kids' mental health suffers, if kids die, if kids die by suicide.
So two juries recently said yes, and Meta and YouTube are appealing, I think.
Scott Malouf, is that correct?
They're appealing?
>> We don't know yet.
But the likelihood is they will probably appeal in both the California state litigation as well as the New Mexico litigation.
>> Okay.
And so, you know, not quite a done deal, but the jury decisions were I mean, certainly sent shockwaves throughout the industry because these aren't just isolated cases.
There could be thousands more.
Could this open the floodgates?
So there's a lot of questions about what this means.
And listeners have been sharing their thoughts about how you feel about social media companies, the impact on children.
I want to ask, Scott, what is the most compelling case for the results that we've seen?
What's the case that the jury has got it right?
>> Okay.
I think the most compelling case is the one that sits in in your gut.
What I mean is when in thinking about this and preparing for the show, I said, boy, there's a really strong First Amendment argument that design is, you know, part of speech, as you said with Justice Kagan, there's a really strong argument that section 230 says that these design elements are to be protected under that law.
There's a really strong argument that they're not the cause of individuals' harms.
It's a bigger picture.
Or if they are a cause, they're a tiny slice.
And I kind of talked to my family about that, and the family members came back and said, but we're on our phones all the time.
We should not be.
And they said, you know, we see people who are on them a lot.
And I think the strongest case is just what the plaintiff's attorneys did.
They made it very emotive, very understandable to say an environment was created for susceptible people to constantly scroll.
So the plaintiff, Kaylee, there was one day she was on Instagram for 16 hours, 16 hours.
You know, that fact probably stuck with the jurors who said, you know, that's that's a lot of time to be staring at your phone.
Can your phone even last for 16 hours?
You know?
And so I think that's the strongest case.
And I think, again, it's the fact that all of us, if this were 20 years ago and we had to boot up a computer and it took five minutes, these cases wouldn't exist.
It's because the phones are in your pocket and they're instantly on.
The strongest case is the day in and day out.
It's it's not the technical side.
It's everything put together.
At the end of the day, something is wrong.
And I think a lot of the callers are saying that.
I'm not sure if this is what it is, but something is not right here.
>> Let me give you a gut response, to get away from kind of a legal parsing.
A gut response may be: where are the parents?
What kid should be scrolling 16 hours a day?
What six-year-old should be on Instagram, as in Kaylee's case?
I get that.
But my equally visceral gut response is you can idealize all you want.
There are vulnerable kids who are going to be hurt, and you can talk personal responsibility, which matters.
But there's no world where some kids are not going to get hurt, right?
And so what's the line of regulation?
What's the line of limitation?
I don't know.
>> And did parents sign up for this?
They might have bought their child a phone because they would say, you know, I want to know where you are.
I want to be able to contact you.
Yeah, you can, you can have divided families where communication is really important.
Pick up, drop off, things like that.
And then that opens the door to social media, to texting, to apps.
And so the parents would say, hey, wait a minute, you, Meta, you, YouTube, say parents, you've got to be really responsible.
I'm not your employee.
Why do I have to devote three hours a week to looking at my kid's phones, to seeing what their usage patterns are, to watch what videos they're watching?
Most parents will do these things, they voluntarily want to do these things, but is it right to say that's your responsibility, so that a model that makes a lot of money for the companies can stay in place?
I think that's the argument against that.
>> It's easy for me to be sort of like the guy standing on top of the mountain, like I do things right and everybody else does it wrong.
But I have had moments where I text somebody and I'm like, it's been a minute and a half, how are you not texting me back?
Yeah, why are you ignoring me?
What could you be doing that isn't immediately tied to my need for a response?
Instantly, as if I didn't grow up entirely without that.
And my 14-year-old sometimes watches, sometimes we watch, old shows like Seinfeld or Frasier or Cheers.
My 14 year old will marvel at like, well, how are people going to even get in touch with each other?
Like, well, maybe they're not.
Like, maybe they can't for the rest of the day here because they're in different places and they don't know or like it was a totally different world.
And it wasn't that long ago, but norms have changed.
So part of me wants to say, look, for the parents who want to get instant feedback on communicating, where is your child: live with it.
Get a dumb phone, you know, or don't get a smartphone, and live with that lack of immediate contact.
But I understand norms are changing.
I mean, it's easy for me to say all that, right?
>> And remember too, it's an infrastructure.
So for example, you know, I'm sure here you have two factor authentication.
You know, we start to have an expectation that you're going to have those kinds of apps on your phone to protect your online accounts.
It's very easy to talk about these things in isolation.
It's not easy to talk about them when you put the whole universe together.
Go back to my flight example: I think we'd all rather have a smartphone that has the app that lets me rebook the flight than just a flip phone that gets text messages from the airline.
>> Um, so before I jump into some more feedback here and more of the aspects of these cases with Scott, big picture, if, if there's an appeal and the verdicts hold, and then there's more.
Well, first of all, if the verdicts hold, are we going to see more cases?
>> So there actually already are more cases filed.
So Kaylee's case was part of what's called a bellwether.
The idea is there are, I believe, hundreds of cases filed under this California proceeding.
And what the goal is, is to say we're going to have three trials, and we're going to see how jurors respond to these arguments.
So right now the plaintiffs have done well.
They've gotten a verdict on liability.
They got a $3 million punitive verdict.
And there are a couple more trials scheduled if the defendants win in two of those, you know, the argument would then be, hey, look, you guys got lucky.
You got lucky on the first one, you know, and flip it around.
If the plaintiffs get two more verdicts, you know, they'd say, defendants, you really got to pay up because these arguments are really effective with jurors.
And the appeal question is a little bit different.
It's to go up to a California appellate court and say, is there the First Amendment defense?
As Justice Kagan was talking about, is there Section 230? You, the trial court judge, you totally whiffed it on those.
No, no, no disrespect meant to the judge.
And those arguments should have held.
And you shouldn't have even sent this to a jury.
That's really what the tech companies want is if you send every case to a jury, it's very expensive and you get unpredictable responses.
You really want a judge to shut these down.
And that had been done for years and years, this design stuff, although it's in the news now.
People have tried it before, and it has failed.
I think the Second Circuit here in New York ruled on it and said, no, we're not going to accept that.
>> So is there a world in which the social media companies feel compelled, maybe financially?
Um, and to avoid future culpability, to actually change their platforms and their algorithms and how they do things?
>> Oh, they're probably already considering that and not just the big companies, probably smaller companies.
Again, that's why the issue of design is so nettlesome.
You know, in the California litigation, not the state litigation but the federal litigation, um, one of the issues that came up was notices put on screens.
And where did you put the notice up?
And the judge went so deep in, she said, listen, um, those are those notices are protected by the First Amendment.
So let's jump forward a year.
You and I are sitting in a design meeting.
You know, you say, okay, I want to put up the really flashy notice in this size font.
And I come in as a lawyer and go, oh, Evan, boy, that that really gets my attention.
Are we going to face liability for that?
Oh, I don't like your red font.
Can we make it black?
Maybe.
You know, now it becomes one of those things.
That's the difficulty for the tech companies.
So they're probably already thinking about it.
What do we do?
And of course other people are probably saying, well, if you make that change, maybe our usability goes down, our appeal to advertisers goes down.
>> All right, let me get back to some feedback here.
Rick says the following.
Hold on Rick, I got to pull your email back up here.
Now my phone isn't working.
Isn't that beautiful?
It's beautiful.
Rick says, Evan, the profit motive is at the heart of the verdict that Meta faces, knowing that profit is the driver of the work being done by social media.
Free speech is not what Meta and social media are motivated by; it's profits they seek.
So I would remind everyone that free speech is only as good as what it is used for.
If we don't make sure that free speech is well used, we will eventually find that it will be lost as people decide it is not worth the risk.
So that requires careful regulation.
I hope we don't get to that point that we lose free speech, but I fear we could.
It's from Rick.
>> You know, I think that's tough.
I appreciate Rick's sentiments.
I think the question is, what counts as careful regulation?
Again, it's always the person doing the regulation who is going to say, I don't like that speech.
So we've seen a lot of protests recently.
You know, there may be people who say, I don't like those protests.
I think a number of years ago, 490 was shut down; it was a protest on 490.
A lot of people were inconvenienced.
But because we had a free speech culture, you know, the decision was made that goes forward and people have that ability to shut down a highway to send their message.
So the concern is whenever you give people tools to regulate speech, it's going to be your speech that gets shut down, and they're going to say, I'm doing it for some good, for the children, to protect discourse.
You know, we just really can't trust people to use that tool fairly.
>> I think you would say, Scott, and I don't want to speak for you, I'm intuiting, that the better response would be to have a really well educated, sort of media-literacy-educated society that understands the risks, especially for kids on these programs, and that we dramatically reduce the number of kids who are on these.
>> I think I think that would be helpful.
I think.
>> A cultural.
>> Change, cultural change, and I think a technological change too.
Everyone probably on their phone right now has app timers.
And maybe you say to yourself, I'm going to use the app timer.
You'll be shocked at how much time you will spend on something.
And, and the big one is alternatives to these platforms, alternatives to Facebook, alternatives to the large tech platforms.
So that if people want to talk about the school district and they want to talk about it effectively, say during a snowstorm, they can do that.
They don't have to go to Facebook.
So there's a lot of things we can do.
But these are super convenient, super widespread platforms.
>> Well, Karen and Brighton wants to contest the idea that social media is exactly like cigarettes.
She says there was no upside to cigarettes.
>> Yeah, it's a perfect example.
>> Although, she's saying, I think the implication is that there is an upside to social media if we use it.
>> Well, if you ask its defenders, the best case for the defenders is, look, we are we are helping a variety of communities find each other.
If you have a very rare disease, and there are only, say, 5 or 10,000 Americans who have it, you're not going to see each other physically in person.
You're not going to have physicians who know about that disease, but you're going to be able to find a Facebook group or another online group to talk to each other.
>> Well, let me get some of the comments from YouTube.
That feels weird to say.
The Sauer.
>> Your chagrin is is bleeding through.
>> Honestly, I think I should actually address this briefly.
I don't love the world, you know, getting a ton of their news on YouTube and Facebook and things like that.
And we resist a little bit.
Um, but the reason that the show is on YouTube is a lot of people are on YouTube and they're asking why we're not.
And like our idealized mindset is if there's going to be slop everywhere, why not also put some broccoli out there?
I don't, like, broccoli is not sexy.
Why would I ever describe the show as broccoli?
We're like sugar coated broccoli.
I'm not helping at all, am I?
Scott, this is really not helping.
>> I think everybody gets your sense.
>> Producer Megan Mack is about to come here and tell me stop!
>> Protein.
The show is protein.
>> We're trying to, like, be on platforms that we know can be harmful, and we're trying to put something that we think is helpful.
There is the short answer, but nobody knows the right answer for sure.
It's not easy.
Anyway, we're on YouTube and we hope you're watching every day, but instead of watching an endless scroll of, you know.
>> AI generated.
>> AI generated slop.
>> And again, some of why this is so difficult is we used to have gatekeepers.
We used to have newspapers, and the newspaper editors decided what was in those papers.
The reporters were professional, you know, and social media destroyed that business model.
And that's part of why we are all so frustrated, because we're saying, the quality of our debate, maybe everyone has a voice now, but it's been in a lot of ways debased.
And my example would be, you know, the Austrian nuns, we've all heard about the Austrian nuns who broke back into the convent, and they got back in.
That story started on social media.
>> Who are these nuns?
Why do I not know this story?
>> If you Google it, it's three nuns who were asked to leave their convent, and then they asked their former students to help break back in.
And now they're going to go meet with the Pope.
The story started on social media.
>> Am I, like, the only one?
Rob, do you know that story?
Okay.
All right.
Well, now I want to know that story.
But anyway, the parable of the Austrian nuns, um, let me try to squeeze in a little bit more feedback because we've got a lot that's come in.
John called in to say the common person doesn't have billions of dollars to spend to resist the billions of dollars that these social media companies are spending to get us to use them.
So he's essentially saying, good on you for getting these verdicts against them, because the power is with the social media companies.
They've got the money, they've got the politicians who don't regulate.
They've got all these things that are going in their favor.
What do we have?
So what do you make of his point there?
>> Yeah.
And, and I think that's probably why a lot of people don't feel very bent out of shape about the verdicts.
These are not very likable defendants.
You know, Mark Zuckerberg testified, and what some folks, you know, said is it wasn't very effective testimony.
I didn't see it, so that's secondhand.
Um, and it makes sense, right?
We'd say these are incredibly wealthy companies.
You know, it's very hard to feel that they have others in mind as their profits continue to grow.
But on the flip side, we have to say to ourselves, what's the infrastructure that comes out of this?
What's the thing that comes out?
Are we not going to have some of the benefits that we take for granted?
Are we going to lose some of the places we get to speak now, even if it's not these big platforms?
>> Well, let me get a kind of a different flavor of the conversation.
We've had a YouTube question for Scott: can we sue Fox News?
My parents are addicted and it has destroyed our family.
So I think that's a cheeky comment, but I want to take it seriously.
Yeah.
Um, does does the Meta and YouTube verdict open the door to suing cable news companies?
>> That's a really interesting question.
I think a plaintiff's attorneys ears probably pricked up as you said it.
So you may, uh, you may see some ads.
Um, you know, I think the question comes back to the delivery methodology, right?
So probably most Fox News viewers are watching it on TV.
They're turning it on.
I think all of us would say, wow, you made the choice to turn it on.
But let's let's be a plaintiff's lawyer for a moment.
What if it was all on the Fox News app, and you said, man, every time I went over to my parents' house, it was playing on their phones, it was playing on their iPad, it was on 24/7, and they constantly got notices?
You know, one could make that argument.
And I think that's what the internet defenders who talk about this are saying is you think it's just Facebook.
It's not.
And it's going to be all kinds of speech that are going to be burdened.
And Fox News may sit there and say, geez, let's stop sending push notifications.
>> Right?
And someone's going to say that NPR destroyed their family.
>> It all depends.
I mean.
>> So for the people celebrating like down with Fox News, like kind of the shoe will be on the other foot.
>> Eventually the shoe will be on the other foot.
And you will take a lot of technology that maybe we do want, and it will be snuffed out before it gets to grow.
>> Uh, and here is one, uh, from first initial P, he says, how about designing some phones that allow kids to message or call their parents, but not access social media?
Something like an updated version of a flip phone.
>> And I think there are apps like that.
There are watches and things that do that.
But no, it's a great comment.
You know, maybe the thing to do, we go back to what you said about parenting, and this is true, I think, for all of us, is maybe we say, what are tools or techniques that we can use to turn these things off?
You know what?
Get an old-school alarm clock; don't bring your phone to bed.
>> Well, uh, Tom writes to say, Evan, from a golf course.
Great show, gentlemen.
So Tom is out on the golf course, and he's got his headphones in.
He's listening to Connections.
And that is exactly what people should be doing, I think.
I think you should be outside enjoying a great spring day and still getting a little bit of brain food here.
That's a pretty good idea.
Yeah.
He appreciates Scott Malouf.
Take us home 45 seconds or so.
What are you looking for next here?
>> I think the big thing to do is watch the next set of trials that come out of the bellwether.
Those are the state court cases in California.
We did not really talk much about the New Mexico trial, that's a state attorney general case.
See what kind of verdicts they get, and are these decisions, are these things, found to be public nuisances?
And so the final thing is watch design changes, look at and follow technology blogs and say, what changes are the companies making?
Are they making good on some of these changes or are they doubling down?
>> Thank you for the expertise, and thank you also, I appreciate, Scott, that as an attorney, you show a lot of grace to people who are kind of driven by the emotion of this, because there is a lot of emotion, and there's a lot of legitimate emotion.
People feel like they're losing their kids.
I mean, it's really, really hard.
I mean, some kids have died.
And so I know it's emotional.
I appreciate the grace that you've shown in working us through this while also honoring that people have strong feelings about this.
>> And I appreciate all those feelings.
It's all about us working together.
>> Come back sometime.
>> I'd love to be back.
>> I cannot think of anybody in this town, really, anywhere, where you are going to hear an attorney with that kind of knowledge who is going to walk you through cases like this.
He's one of the best.
That's Scott Malouf.
More Connections coming up in just a moment.
>> This program is a production of WXXI Public Radio.
The views expressed do not necessarily represent those of this station.
Its staff, management or underwriters.
The broadcast is meant for the private use of our audience.
Any rebroadcast or use in another medium without expressed written consent of WXXI is strictly prohibited.
Connections with Evan Dawson is available as a podcast.
Just click on the Connections link at wxxinews.org.
