
Exploring Potential Impacts of the Kids Online Safety Act
Season 6 Episode 29 | 26m 46s | Video has Closed Captions
We look at how The Kids Online Safety Act may impact children on social media.
The Kids Online Safety Act is designed to ensure children aren’t presented harmful content when they are on social media. But some advocacy groups are concerned about censorship. We explore the different sides of this bill. And a local mother shares how a social media challenge forever changed her family and explains how the Kids Online Safety Act may have helped her daughter.
Nevada Week is a local public television program presented by Vegas PBS
-I still wake up, and you're just like, How did that happen?
(Amber Renee Dixon) A local mom says a social media challenge left her daughter in this condition, but that's not the worst of it.
The controversial bill she's calling on Congress to pass to prevent more tragedies like hers, that's this week on Nevada Week.
♪♪♪ Support for Nevada Week is provided by Senator William H. Hernstadt.
-Welcome to Nevada Week.
I'm Amber Renee Dixon.
The Kids Online Safety Act, which you'll hear viewpoints for and against ahead, seeks to hold social media companies accountable when they recommend content to children that promotes self-harm, suicide, eating disorders, substance abuse, and sexual exploitation.
Here in Las Vegas, Meghan Stuhmer believes the bill also known as KOSA could have prevented a deadly car accident that seriously injured her daughter.
It happened in June 2021 in South Carolina right before she and her daughter planned to move to Las Vegas.
I'd like to start by going back to June 17, 2021.
What happened on that day?
(Meghan Stuhmer) On Wednesday, June 16, that was the last day of my daughter's sophomore year of high school.
She had recently gotten her license.
We were getting ready to move in a few days.
She and two of her friends decided to take the speed challenge.
So after I went to bed, she snuck out with her two friends.
They got my Hyundai up to over 100 miles per hour.
The speed challenge was like 130 to 200 miles per hour.
They hit a tree, and her friend passed away instantly.
And then both girls in the driver's and passenger seat were severely hurt.
-Tell me about this speed challenge.
-So during the quarantine, we'd spent a lot of time with defensive driving, driving to the DMV--the DMV had a little obstacle course--and it gave us practice on the highway, on, you know, the dark country roads, things like that.
So they-- and my daughter had a smartphone.
So did I.
So we-- I didn't realize it, but they were tracking her location.
And with that, they knew that she was a new driver.
They knew she was a high school student who was working from home.
She wasn't going in person.
And they develop their algorithms.
Well, one of the algorithms that they chose to send to my daughter, starting when she was 15 and then 16, has to do with speeding.
And so when I was on her phone following the accident, I realized how many ads she got where it was like 20 seconds of crazy driving.
And the speedometer always would go up, and it was always 150, 180.
It would go over 200 miles per hour in certain scenes.
Or it would be like this little clip of a very young looking girl with like these little stars, looking pouty, and saying, "My face when my boyfriend says he's driving safe."
Then it would flip and it would say, "150 miles per hour."
I never saw anything less than 130 miles per hour.
That seems to be the starting point of the speed challenge or the speed algorithm.
And then it would be followed by an advertisement.
For instance, Tesla, "More affordable than ever.
200 miles per hour."
I didn't-- just constant normalization of speed.
And so they took the car out, they hit the tree, and it ended horrifically.
When I asked my daughter about it later, "Why did you do this"-- she'd gotten straight A's for the first time in her life and just never mentioned a fast car, never mentioned even a car that she liked.
It was so far out of my radar.
And her response was, "I was really scared, but I wanted to take my turn."
So the speed challenge is just the normalization and the incentivizing of driving 130 to 200 miles per hour that's being sent to kids.
-How did you find out about what happened?
-So after the phones were released back to the families, we all looked at the phones.
And I got onto the different social media accounts.
My daughter had Snapchat, TikTok, and Instagram, along with different gaming apps and a few different ones.
So she had been using them to chat specifically during when we were all quarantined.
It was like they were using the chat features within social media platforms more so than the texting feature.
So when you text, you're not getting all these ads.
But in order to read what she had been talking about with her friends, I was trying to figure out where did this influence, where did this idea to go 130 miles an hour, where did it come from?
And at the time, it never occurred to me that companies were pushing it.
I thought it was a peer influence.
But in order to go back and read all the months of chat, which never found any peer influence for it, in order to read that, I had to watch.
There's like 20 seconds of crazy in this reel and then the advertisements that came through.
And at one point, I realized I was like, just being on her phone for all these hours, I was like, I think it's normal to go 130 miles per hour with my adult brain, based on all the information they were sending her.
-Do you think you could have prevented this as a parent?
-I-- So we've talked about KOSA and having more regulation for social media.
I was not ignorant.
And I don't like the narrative that parents have to stop giving their kids screens and telling them-- and letting them raise their kids.
That's not what happened here at all.
Absolutely.
Had I understood that they were going to track my daughter's location and they were going to use that to determine her algorithm, I would absolutely have turned off her phone.
She could still have the smartphone.
I absolutely would have had her turn off the phone, though, whenever we left the house.
Like whenever we left the house.
Because I wouldn't have necessarily wanted her off of social media.
I hate to say that, but it's part of our problem now and part of the reason we need more regulation.
It's what kids use to connect.
So right now, she doesn't want to be on the different platforms, but it creates an isolation.
And so, yes, I could have prevented it had I been more informed, but I wasn't uneducated and I was not hands off.
-How would the Kids Online Safety Act have prevented this, in your opinion?
-One of the biggest things for me would have been transparency.
If there's accountability and they have to disclose a certain amount of information, then as a parent I would have understood a little bit more that they are tracking the location.
That would have been a huge thing for me if I had understood that, because what I did is I put a bull's eye on my daughter's back because I spent all this extra time with defensive driving.
It's like it kills me how many times I said, If you ever hurt somebody, if you ever hit somebody's dog or child ever runs out, you'll never forgive yourself.
We spent so much time on these defensive driving lessons and practicing, and really what I was doing was setting her up to be targeted.
-How is your daughter doing now?
-She's a fighter, but she's not doing-- I mean, I don't know how to describe the pain that-- she just, she's just doing good.
Like she's going to school, she's doing the best that she can do.
She's not-- she's had nerve pain every single day since 2021.
More, like recently, it's gotten worse.
So physically, there's only so much healing that will take place.
And then you look at mental and spiritual, and you just-- we've had a ton of support.
You know, I mean, family, friends.
The little girl who passed away, their family has just stood by us.
And so she's doing as good as she could be doing.
But one of the things that I've shared with a lot of people is I asked Carly one time, like, If you were to promote speeding, like a speeding advertisement the way that you received it, if you were to have your own advertisement, get the message out there so kids could see what speeding really looked like, what would you, how would that look?
And she said, I would show pictures of myself and the other little girl that got hurt.
She was like, But I wouldn't even mention that anyone had passed away, because no one would believe anything that bad could happen.
And I still wake up, and you're just like, how did that happen?
You know?
And how are they still doing it, and how are they still profiting from it?
-Meghan Stuhmer, thank you for your time.
-Thank you.
-The app that speed filter was on was Snapchat, and the maker of Snapchat, Snap, says the speed filter was removed from the platform in June 2021, the same month as the accident but not in relation to it.
Snap tells Nevada Week that the speed filter had a warning telling users not to Snap and drive and that the top driving speed at which a Snap could be shared was changed to 35 miles per hour.
Additionally in a statement, a Snap spokesperson said, quote, Protecting the privacy and safety of young people on Snapchat is a top priority, and we support thoughtful federal legislation that helps achieve this goal.
We offer extra protections for Snapchatters ages 13 to 17 and in app tools for parents, giving adult caregivers more visibility and understanding of their teen's online activities.
We also support many provisions being considered in bills like the Kids Online Safety Act, such as mandating appropriate default privacy settings, providing safeguard measures for teens, including additional privacy protection settings, and limiting the collection and storage of personal information.
End quote.
As for KOSA, the bill has bipartisan support with 46 cosponsors in the Senate.
While Nevada senators are not among them, Senator Jacky Rosen did vote to advance KOSA out of committee.
And Senator Catherine Cortez Masto says she's reviewing the legislation and will continue to work with her colleagues to protect children.
So let's hear from both sides of this issue now.
And for that, we bring in Laura Marquez-Garrett, an attorney with the Social Media Victims Law Center.
She supports KOSA and is also the attorney of Meghan Stuhmer, the mom we heard from.
Opposing KOSA, is Evan Greer.
She's the director of the digital rights advocacy group Fight for the Future.
Thank you so much for being here.
I want to start with you, Evan.
We have yet to hear from the opposing viewpoint.
And I want to know what is your biggest concern if KOSA were to pass?
(Evan Greer) I think the first thing I'll say in this conversation is that you're going to find a lot more agreement than disagreement.
We completely agree that big tech companies are doing real harm to children.
And we completely agree that something needs to be done about it.
I commend these parents who are speaking up, who are trying to transform their pain into meaningful action.
And it's up to our elected officials to do their jobs, which means responsible legislating.
And unfortunately, KOSA is a bill that is substantively opposed by effectively the entire human rights and civil liberties community in the US, because it's a bill that we believe will make kids less safe, rather than more safe, by cutting them off from access to life-saving online information, whether that's information about healthy sex ed information, whether that's information about how to avoid substance abuse, whether that's information about how to deal with bullying.
The way that we protect kids is not through censorship.
It's through cracking down on the predatory business practices of these companies, which is what we and many other organizations have been calling for, for years.
-Laura, do you have a response to that?
(Laura Marquez-Garrett) I do.
I think Evan is correct that there's going to be a lot of areas of agreement here.
We all want change.
We all want to protect children.
I think the thing about KOSA is it's not about content.
It's about products, services, and features.
So these are design and programming decisions these companies are making.
And the really tricky part is, without regulation, these companies operate in a black box; it's hard for us to sort of see how they are designing and programming on the back end.
But again, KOSA is not about content.
It's not about censorship.
It is about the programming decisions that they're making, the designs.
In fact, in early to mid-2023, a limitation was added to KOSA that I've heard referred to as sort of the "you search it, you see it," which is that this is not, KOSA is not going to cover what children are searching for.
It's covering what these companies are targeting at our kids through their program decisions through their designs.
-When we talk about social media companies being held liable, that would be in the form of lawsuits that state attorneys general could file.
It would be up to their discretion.
Evan, your thoughts on that?
-So first of all, I just want to clarify that it's actually not accurate that KOSA does not cover content.
It absolutely does in its current form.
And what we and the ACLU and organizations like Equality Nevada and dozens of other LGBTQ, civil liberties, and human rights groups have been trying to do is encourage the bill sponsors to enact the sorts of meaningful changes that would make this a bill like the one that Laura describes.
If KOSA was a bill that only covered design choices like autoplay, infinite scroll, the use of minors' personal data to power algorithmic recommendations, not only would we not oppose it, Fight for the Future would be cheerleading it and trying to get it passed.
The problem with KOSA is that the duty of care covers what content platforms can show to which users.
The way it does that and the reason that attorney general enforcement is so dangerous is because the way KOSA is currently written, it effectively says that an attorney general or the FTC can go after a platform if they believe that platform has algorithmically recommended content to younger users that causes a series of delineated harms, including anxiety and depression.
That sounds perfectly reasonable on its face, but what it means is an attorney general can go after platforms for any type of content that that AG is willing to claim causes anxiety and depression among youth.
And unfortunately, we have politically motivated attorneys general in this country that are already arguing things like encountering drag queen story hour videos makes kids depressed and anxious, like learning about the existence of transgender people makes kids depressed and anxious.
The attorney general of Nebraska, for example, cited medical studies showing that LGBTQ people disproportionately experience anxiety and depression in their executive order attempting to ban gender affirming care for adults in Nebraska.
So we know that attorneys general are already willing to claim that all kinds of important content is causation for these types of mental health outcomes.
And that's why we've been pushing to make sure the duty of care is written in a way that can't be weaponized.
And if the bill sponsors would adopt the changes that the ACLU and other organizations have been calling for, then it would prevent that sort of weaponization, while preserving the parts of the bill that we all agree on, like banning targeted advertising to kids, something that Fight for the Future has supported for years.
-Now, there is discussion, according to the sponsor of this bill, Democratic Senator Richard Blumenthal.
He recently told Politico that he is open to changing the enforcement mechanism of this bill, taking it out of the hands of state attorneys general, putting it in the Federal Trade Commission's hands.
Would that satisfy you, Evan?
-That would be an improvement, but unfortunately, we have to ask ourselves, how would the Trump administration have used this authority if they had had it when they were in power?
How could a future administration use this authority if they have it?
I'm not willing to gamble the lives of LGBTQ young people and the human rights, broadly speaking across the web, on the idea that we're always going to have someone thoughtful and progressive in the White House.
I think we need to pass legislation that we know will do more good than harm.
And that goes after the underlying business practices of these companies, rather than giving the government a hammer with which to go after speech that they don't like.
-Laura, any response to that?
-I do.
I'm responding to sort of all of these statements.
I mean, I think Evan and I respectfully have to agree to disagree on what KOSA provides.
And the point I would respond to is, these are not libraries.
These are not nonprofits.
KOSA, in Section 2, I believe it is, specifically covers these for-profit platforms that are targeting and exploiting our children.
The Section 3, Duty of Care, includes an explicit carve-out.
That is Section 3(b).
It is a limitation that states that nothing in subsection (a), and this is a quote, "... shall be construed to require a covered platform to prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content."
So the way KOSA works based on its plain language is that you could have a platform that never takes down or censors a single piece of content, and they are in compliance with KOSA.
And conversely, you could have a platform that takes down everything a conservative or liberal AG finds objectionable, whether it's suicide and self-harm, whether it's gun-related content, political content, LGBTQ, you could have a platform that takes that all down and is still in violation of KOSA because, again, it goes back to the programming, to the mechanisms, the tools that these companies are using, not the content third parties are posting.
Without that limitation in Section 3(b), all a company would have to do is say, you know, the child sought out suicide content.
So this is like, our children make poor decisions often.
They're children.
They, their brains are still developing.
KOSA is not about the poor decisions our children make.
It's about the decisions these platforms are currently making for our children.
And one last point on that.
In the work that I do with SMVLC working with thousands of families and children, seeing the back end data, you know, we see everyday examples of how these products are designed in a way that exploits and targets children, particularly LGBTQ, everything from children searching for uplifting content and motivational speeches and getting what I refer to as "go kill yourself" content on some of these platforms, you know, to children searching for gay pride and getting Westboro Baptist Church, "You're going to hell."
And that is a function not of what the users, what these children are searching for.
That is a function of the programming that these platforms are knowingly implementing to increase their own engagement.
These are not choices our children are making.
And that limitation in KOSA specifically calls out: If kids search for resources, they will get resources.
In addition to which, KOSA does not apply to nonprofits, libraries, those types of organizations.
-Evan, how much of this is subjective?
How much of the content that is out there in regards to mental health, which KOSA emphasizes, could be viewed by some as helpful and by others as harmful?
-I think this is an incredibly important thing for us to focus in on because there is no objective legal or even medical definition of what type of content is inherently helpful or harmful to kids.
A piece of content about police shootings might make some kids depressed and inspire other kids to take action, for example.
I also just want to respond directly about the carve-out that Laura was mentioning around search.
Unfortunately, it doesn't actually work that way.
If I search on a social media platform for, for example, why do I feel different from other boys, and that social media platform then returns to me results related to gender identity or finding support resources for people with gender dysphoria, they are not covered by that carve-out in KOSA, because they're algorithmically making an inference into what I was searching for.
An attorney general can easily argue that that's not what I was looking for, that they should have returned to me results about conversion therapy.
And so, unfortunately, as long as the duty of care implicates content, there will be ways for it to be used for censorship.
And there is no objective definition.
There are people in this country-- for example, Marsha Blackburn, one of the lead sponsors of KOSA, has said explicitly that she believes transgender content is harmful to kids.
She's also said that she believes that teaching about racism in schools is harmful to kids.
So as long as we have people in power who are going to argue that things that help kids are hurting them, we cannot give the government the authority to take down content that they don't like, just by claiming it's harmful to kids.
-I want to make sure we get this in.
Laura, and you can also address-- -Sorry, if I-- -Okay.
Quickly, please.
-Oh, yes.
Sorry.
Look, there's a lot of provisions in KOSA, and I strongly urge anyone who has questions about KOSA to read them.
Just responding quickly, you know, there's a provision in Section 4, and it's, I believe, 4(e)(3)(C) that says, again, Nothing shall be construed-- where is this?
Sorry.
"Nothing shall be construed to prevent using a personalized recommendation system to display content to a minor if the system only uses information on language spoken, geolocation, or age.
So that's yet another carve-out in KOSA that, that you would have to read all of KOSA and sort of look for these specific carve-outs that ultimately is going to protect in that scenario.
Because the goal there of KOSA is it's not that they cannot have recommendation systems; it's that when those are being aimed at children, there have to be certain safeguards.
There have to be default settings--and that's Section 4--that children can select that will help create a safer experience.
And then, of course, the one I just cited, 2, which says that the platform, as long as they are not collecting all these thousands of personal data points and targeting children one by one, they can still use recommendation algorithms with children.
They just cannot use those in a way that is inherently dangerous.
-Laura, in the interview with Meghan Stuhmer, your client, she talks about two different issues.
One is the alleged tracking of her daughter and the alleged use of that data to target her with content that promoted excessive speeding, and then the other is the speed filter itself, the speed challenge.
Would KOSA address both of those?
-KOSA is able to address many things that these platforms do so well.
Those are not specifically called out.
You have a number of provisions in KOSA, such as Section 5, which relates to disclosure.
So the platforms would have to tell us how they're using data they're collecting from kids.
So in Meghan Stuhmer's case, one of the things that would have done is provide very clear and plain disclosures; parents, and Meghan in particular, would have been informed as to how the platform was using the data.
Additionally, Section 6 of KOSA relates to transparency.
And there's multiple disclosure categories so that when a company like Snap, Meta, TikTok identifies issues of potential harm to children, they have to disclose those.
And that becomes a self-policing system, where companies are now motivated to not deliberately harm consumers.
When they identify these harmful features, they have a disclosure requirement.
These are all critical provisions of KOSA.
-We have about 30 seconds left.
I'm gonna give you the final word, Evan.
-Sure.
So here's what I will say.
Again, there's so much agreement on this call.
We need to hold these big tech companies accountable.
But to do that, we need to actually get legislation passed, which means that we can't be divided.
As long as we are trying to pass a bill that is deeply controversial and that is opposed by dozens of human rights and LGBTQ and civil liberties organizations, big tech is going to keep getting away with murder.
And so our plea to those that are supporting KOSA is, please, let's sit down.
Let's come together and craft legislation that will keep all kids safe, rather than making perhaps some kids safer, while putting other kids in danger.
If we can come together, there are so many things that we can do to address the underlying business practices of these companies and mitigate the harm that they're doing.
But as long as some of us insist on pushing a bill that is substantively opposed by a broad coalition, what's going to happen is nothing.
-Thank you both so much for taking the time to discuss this important, important issue.
A spokesperson from Senator Blumenthal's office says that they are putting the finishing touches on KOSA and that their intention is to get it to the Senate floor early this year.
That is all for this episode of Nevada Week.
For any of the resources discussed, please visit vegaspbs.org/nevadaweek.
And I'll see you next week on Nevada Week.
♪♪♪