
Kids Online Safety Act Explained
Clip: Season 6 Episode 29 | 15m 21s
We examine the pros and cons of the Kids Online Safety Act.
As for KOSA, the bill has bipartisan support with 46 cosponsors in the Senate.
While Nevada senators are not among them, Senator Jacky Rosen did vote to advance KOSA out of committee.
And Senator Catherine Cortez Masto says she's reviewing the legislation and will continue to work with her colleagues to protect children.
So let's hear from both sides of this issue now.
And for that, we bring in Laura Marquez-Garrett, an attorney with the Social Media Victims Law Center.
She supports KOSA and is also the attorney of Meghan Stuhmer, the mom we heard from.
Opposing KOSA, is Evan Greer.
She's the director of the digital rights advocacy group Fight for the Future.
Thank you so much for being here.
I want to start with you, Evan.
We have yet to hear from the opposing viewpoint.
And I want to know what is your biggest concern if KOSA were to pass?
(Evan Greer) I think the first thing I'll say in this conversation is that you're going to find a lot more agreement than disagreement.
We completely agree that big tech companies are doing real harm to children.
And we completely agree that something needs to be done about it.
I commend these parents who are speaking up, who are trying to transform their pain into meaningful action.
And it's up to our elected officials to do their jobs, which means responsible legislating.
And unfortunately, KOSA is a bill that is substantively opposed by effectively the entire human rights and civil liberties community in the US, because it's a bill that we believe will make kids less safe, rather than more safe, by cutting them off from access to life-saving online information, whether that's healthy sex ed information, information about how to avoid substance abuse, or information about how to deal with bullying.
The way that we protect kids is not through censorship.
It's through cracking down on the predatory business practices of these companies, which is what we and many other organizations have been calling for, for years.
-Laura, do you have a response to that?
(Laura Marquez-Garrett) I do.
I think Evan is correct that there's going to be a lot of areas of agreement here.
We all want change.
We all want to protect children.
I think the thing about KOSA is it's not about content.
It's about products, services, and features.
So these are design and programming decisions these companies are making.
And the really tricky part is that without regulation, these companies operate in a black box; it's hard for us to see how they are designing and programming on the back end.
But again, KOSA is not about content.
It's not about censorship.
It is about the programming decisions that they're making, the designs.
In fact, in early to mid-2023, a limitation was added to KOSA that I've heard referred to as the "you search it, you see it" provision, which means KOSA is not going to cover what children are searching for.
It's covering what these companies are targeting at our kids through their programming decisions and their designs.
-When we talk about social media companies being held liable, that would be in the form of lawsuits that state attorneys general could file.
It would be up to their discretion.
Evan, your thoughts on that?
-So first of all, I just want to clarify that it's actually not accurate that KOSA does not cover content.
It absolutely does in its current form.
And what we and the ACLU and organizations like Equality Nevada and dozens of other LGBTQ, civil liberties, and human rights groups have been trying to do is encourage the bill sponsors to enact the sorts of meaningful changes that would make this a bill like the one that Laura describes.
If KOSA was a bill that only covered design choices like autoplay, infinite scroll, the use of minors' personal data to power algorithmic recommendations, not only would we not oppose it, Fight for the Future would be cheerleading it and trying to get it passed.
The problem with KOSA is that the duty of care covers what content platforms can show to which users.
The way it does that and the reason that attorney general enforcement is so dangerous is because the way KOSA is currently written, it effectively says that an attorney general or the FTC can go after a platform if they believe that platform has algorithmically recommended content to younger users that causes a series of delineated harms, including anxiety and depression.
That sounds perfectly reasonable on its face, but what it means is an attorney general can go after platforms for any type of content that that AG is willing to claim causes anxiety and depression among youth.
And unfortunately, we have politically motivated attorneys general in this country that are already arguing things like encountering drag queen story hour videos makes kids depressed and anxious, like learning about the existence of transgender people makes kids depressed and anxious.
The attorney general of Nebraska, for example, cited medical studies showing that LGBTQ people disproportionately experience anxiety and depression in their executive order attempting to ban gender affirming care for adults in Nebraska.
So we know that attorneys general are already willing to claim that all kinds of important content is causation for these types of mental health outcomes.
And that's why we've been pushing to make sure the duty of care is written in a way that can't be weaponized like that.
And if the bill sponsors would adopt the changes that the ACLU and other organizations have been calling for, then it would prevent that sort of weaponization, while preserving the parts of the bill that we all agree on, like banning targeted advertising to kids, something that Fight for the Future has supported for years.
-Now, there is discussion of changes, according to the sponsor of this bill, Democratic Senator Richard Blumenthal.
He recently told Politico that he is open to changing the enforcement mechanism of this bill, taking it out of the hands of state attorneys general and putting it in the Federal Trade Commission's hands.
Would that satisfy you, Evan?
-That would be an improvement, but unfortunately, we have to ask ourselves, how would the Trump administration have used this authority if they had had it when they were in power?
How could a future administration use this authority if they have it?
I'm not willing to gamble the lives of LGBTQ young people and the human rights, broadly speaking across the web, on the idea that we're always going to have someone thoughtful and progressive in the White House.
I think we need to pass legislation that we know will do more good than harm.
And that goes after the underlying business practices of these companies, rather than giving the government a hammer with which to go after speech that they don't like.
-Laura, any response to that?
-I do.
I'm responding to sort of all of these statements.
I mean, I think Evan and I respectfully have to agree to disagree on what KOSA provides.
And the point I would respond to is, these are not libraries.
These are not nonprofits.
KOSA, in Section 2, I believe it is, specifically covers these for-profit platforms that are targeting and exploiting our children.
The Section 3 Duty of Care includes an explicit carve-out.
That is Section 3(b).
It is a limitation that states that nothing in subsection (a), and this is a quote, "... shall be construed to require a covered platform to prevent or preclude any minor from deliberately and independently searching for, or specifically requesting, content."
So the way KOSA works based on its plain language is that you could have a platform that never takes down or censors a single piece of content, and they are in compliance with KOSA.
And conversely, you could have a platform that takes down everything a conservative or liberal AG finds objectionable--whether it's suicide and self-harm content, gun-related content, political content, or LGBTQ content--and is still in violation of KOSA because, again, it goes back to the programming, to the mechanisms, the tools that these companies are using, not the content third parties are posting.
That is not possible with that limitation in Section 3(b): all a company would have to do is say, you know, the child sought out the suicide content.
So, look, our children often make poor decisions.
They're children.
Their brains are still developing.
KOSA is not about the poor decisions our children make.
It's about the decisions these platforms are currently making for our children.
And one last point on that.
In the work that I do with SMVLC, working with thousands of families and children and seeing the back-end data, we see everyday examples of how these products are designed in a way that exploits and targets children, particularly LGBTQ children--everything from children searching for uplifting content and motivational speeches and getting what I refer to as "go kill yourself" content on some of these platforms, to children searching for gay pride and getting Westboro Baptist Church "You're going to hell" content.
And that is a function not of what the users, what these children are searching for.
That is a function of the programming that these platforms are knowingly implementing to increase their own engagement.
These are not choices our children are making.
And that limitation in KOSA specifically calls out: If kids search for resources, they will get resources.
In addition to which, KOSA does not apply to nonprofits, libraries, those types of organizations.
-Evan, how much of this is subjective?
How much of the content that is out there in regards to mental health--because KOSA emphasizes mental health--could be viewed by some as helpful and by others as harmful?
-I think this is an incredibly important thing for us to focus in on because there is no objective legal or even medical definition of what type of content is inherently helpful or harmful to kids.
A piece of content about police shootings might make some kids depressed and inspire other kids to take action, for example.
I also just want to respond directly about the carve-out that Laura was mentioning around search.
Unfortunately, it doesn't actually work that way.
If I search on a social media platform for, for example, why do I feel different from other boys, and that social media platform then returns to me results related to gender identity or finding support resources for people with gender dysphoria, they are not covered by that carve-out in KOSA, because they're algorithmically making an inference into what I was searching for.
An attorney general can easily argue that that's not what I was looking for, that they should have returned to me results about conversion therapy.
And so, unfortunately, as long as the duty of care implicates content, there will be ways for it to be used for censorship.
And there is no objective definition.
There are people in this country-- for example, Marsha Blackburn, one of the lead sponsors of KOSA, has said explicitly that she believes transgender content is harmful to kids.
She's also said that she believes that teaching about racism in schools is harmful to kids.
So as long as we have people in power who are going to argue that things that help kids are hurting them, we cannot give the government the authority to take down content that they don't like, just by claiming it's harmful to kids.
-I want to make sure we get this in.
Laura, and you can also address--
-Sorry, if I--
-Okay.
Quickly, please.
-Oh, yes.
Sorry.
Look, there's a lot of provisions in KOSA, and I strongly urge anyone who has questions about KOSA to read them.
Just responding quickly, you know, there's a provision in Section 4, and it's, I believe, 4(e)(3)(C), that says, again, nothing shall be construed-- where is this?
Sorry.
"Nothing shall be construed to prevent using a personalized recommendation system to display content to a minor if the system only uses information on language spoken, geolocation, or age."
So that's yet another carve-out in KOSA--you would have to read all of KOSA and sort of look for these specific carve-outs--that ultimately is going to provide protection in that scenario.
Because the goal there of KOSA is it's not that they cannot have recommendation systems; it's that when those are being aimed at children, there have to be certain safeguards.
There have to be default settings--and that's Section 4--that children can select that will help create a safer experience.
And then, of course, there's the one I just cited, which says that the platform, as long as they are not collecting all these thousands of personal data points and targeting children one by one, can still use recommendation algorithms with children.
They just cannot use those in a way that is inherently dangerous.
-Laura, in the interview with Meghan Stuhmer, your client, she talks about two different issues.
One is the alleged tracking of her daughter and the alleged use of that data to target her with content that promoted excessive speeding, and then the other is the speed filter itself, the speed challenge.
Would KOSA address both of those?
-KOSA is able to address many of the things that these platforms do.
Those two are not specifically called out.
You have a number of provisions in KOSA, such as Section 5, which relates to disclosure.
So the platforms would have to tell us how they're using data they're collecting from kids.
So in Meghan Stuhmer's case, one of the things KOSA would have done was require very clear and plain disclosures; parents, and Meghan in particular, would have been informed as to how the platform was using the data.
Additionally, Section 6 of KOSA relates to transparency.
And there are multiple disclosure categories, so that when a company like Snap, Meta, or TikTok identifies issues of potential harm to children, they have to disclose those.
And that becomes a self-policing system, where companies are now motivated to not deliberately harm consumers.
When they identify these harmful features, they have a disclosure requirement.
These are all critical provisions of KOSA.
-We have about 30 seconds left.
I'm gonna give you the final word, Evan.
-Sure.
So here's what I will say.
Again, there's so much agreement on this call.
We need to hold these big tech companies accountable.
But to do that, we need to actually get legislation passed, which means that we can't be divided.
As long as we are trying to pass a bill that is deeply controversial and that is opposed by dozens of human rights and LGBTQ and civil liberties organizations, big tech is going to keep getting away with murder.
And so our plea to those that are supporting KOSA is, please, let's sit down.
Let's come together and craft legislation that will keep all kids safe, rather than making perhaps some kids safer, while putting other kids in danger.
If we can come together, there are so many things that we can do to address the underlying business practices of these companies and mitigate the harm that they're doing.
But as long as some of us insist on pushing a bill that is substantively opposed by a broad coalition, what's going to happen is nothing.
-Thank you both so much for taking the time to discuss this important, important issue.
Local mom shares social media warning
Clip: S6 Ep29 | 8m 49s | Meghan Stuhmer shares how a social media challenge changed her daughter's life forever.
Nevada Week is a local public television program presented by Vegas PBS