Indiana Lawmakers
AI and Data
Season 43 Episode 7 | 28m 46s | Video has Closed Captions
How are lawmakers addressing issues with data privacy and dangerous developments with AI?
Over the years, the Indiana General Assembly has passed numerous measures to advance technology and promote its use. This session, however, the focus has shifted to reining in technology that might be detrimental, if not downright dangerous. We explore what measures our state legislature is taking to protect Hoosiers from data breaches and dystopian developments in artificial intelligence.
Indiana Lawmakers is a local public television program presented by WFYI
- Over the years, the General Assembly has passed numerous measures to advance technology and promote its use.
This session, in contrast, the focus has shifted to reining in technology that might be detrimental, if not downright dangerous.
(majestic music) Lawmakers, for example, introduced bills to discourage the use of misleading AI-generated content in political campaigns, to erect guardrails around data gleaned from consumer genetic tests, and to ensure that no automated commercial vehicle takes to the road without a qualified human in the cab.
Moreover, recognizing that they don't know what they don't know, lawmakers seem poised to create a task force that will study how artificial intelligence is used and/or should be used by various state agencies.
That's a lot to process in a 10-week short session, never mind a 30-minute public affairs show.
But as always, we'll do our level best.
I'm Jon Schwantes, and I have to go set my VCR.
So hey, Alexa, you take it from here, (majestic music continues) - [Alexa] "Indiana Lawmakers," from the state House to your house.
- For our discussion about the nexus of public policy and emerging technology, we're going to forego GP chat or ChatGBT or whatever they call it, in favor of OG Chat, the old school kind in which the participants are real life humans, and the intelligence is astute, not artificial.
Joining me to talk about the state's efforts to capitalize on and not capitulate to the fourth industrial revolution are two forward-looking lawmakers who coincidentally both hail from Fort Wayne.
Third term Republican Senator Liz Brown, and first term Democratic Representative Kyle Miller.
Thank you, both.
And just to be clear, when I talk about astute intelligence, I'm referring to my guests.
(guests laughing) I make no claims on that regard myself.
Before we move forward in this interesting new world, let's take one step back.
Last session, one of the most significant bills and one of the quickest ones to reach the governor's desk was Senate Bill 5.
That was your bill, Senator Brown that dealt with data privacy.
Pretty significant bill.
Why was that a priority for the caucus and for you?
- Well, I think everybody's concerned, right, about who has their data and how they're using it.
And so that was actually the second year that we tried to move it through, and I think that was good.
We had a lot of conversations in the year while we waited to bring it back last year.
People need to understand it a little bit.
And also strike the right balance between businesses who need our data in order to perform and give us the services we expect.
And then also consumers who wanna make sure that they have some control over it.
- And in theory, still time to tweak it because even though it was enacted last session and signed last year, it does not take effect until January 1st, 2026.
The thinking there is what?
- The thinking was, only a few other states have done this.
I think we were- - I think we were seven.
- Yes, I was gonna say when we started two years ago, we would've been six, we were seven.
But to make sure, obviously, big, big companies that are global were already gonna be enacting the same parameters that we were requiring.
But the concern was, if you're an Indiana business and this is new, we wanna give you a year or two to ramp up because there are certain things in that bill in terms of doing assessments and being able to access that data quickly for consumers that we wanna make sure it wasn't too burdensome.
- Representative, you liked it enough to vote for it.
I went back and looked at the tally.
Some critics say, "Oh, it still puts too much of a burden on individual consumers," because rather than having, for instance, a single opt-out, they have to sort of go company by company and search through the internet and deal with, theoretically, I guess, hundreds if not thousands of companies.
Also doesn't deal with state agencies.
They're exempted from that.
Does it go far enough and would you like to see changes either before the enactment date, the effective date January 1st, 2026?
Or is that something that you think will be further molded down the line?
- Sure, well, I think much like a lot of what we do in the legislature, nothing's ever perfect.
We're all trying to do the best that we possibly can.
Senate Bill 5 was an excellent bill, an excellent start on some of these discussions.
And so, I applaud Senator Brown for bringing it, for authoring it, for working it through that process.
It's definitely needed.
And then as we go forward, we'll see what changes need to be made and what we can do as technology continues to change and as consumers continue to give us feedback on the process.
But I think for now, we are well on our way and one of the front runners in the country on this topic.
And so I think we're in a good spot and we'll see how we move forward.
- Do you find it interesting that states are in front of the federal government on this?
I know the federal government has tried, although it's had difficulty doing much of anything in recent years; they've had proposals but have not had any significant change at the congressional level. And yet states, you mentioned we were number seven, and now I think it's probably several dozen that have addressed various aspects of this.
What does it say that states are taking on this issue?
Is that where this should, the regulation for these things should reside?
- Maybe not. When I first introduced it, there was concern that we would be creating a patchwork because California had a very different model.
This was originally modeled on Virginia's law and other states that have enacted it, like Utah and Colorado.
- The first goal was the European Union model.
- Yeah, which would've been- - Then you decided to go American, right?
- That would've been extremely different.
But that's because you talk to all the constituencies and say how's this gonna fit for Indiana?
But I think that probably the fact that we sort of conformed and try to find a common middle ground was helpful.
But sure, the federal government should be stepping in.
They've had a bill on their desk for years and years, but I think that's why it had bipartisan support, because we realize they're not gonna be doing anything anytime soon.
And our constituents have significant concerns, and that's not gonna go away with the type of technology we're using.
- Are these essentially placeholders in a way?
Presumably, the federal government will have to deal with this at some point, if for no other reason than the enforcement mechanism seems better suited to the long arm of the federal government as opposed to individual states, which would try to police this.
- I think that we obviously would probably prefer that the federal government makes some moves on this, but until they do, states can be sort of laboratories for this issue.
And there may be something that our congressional delegation looks at from Indiana, takes some pieces of Senate Bill 5 and some others, and some pieces from California and other states, and kind of puts together a comprehensive federal piece of legislation.
So I think until they make moves, I think that it's good that we're all having this discussion state by state to see what works here, what works in other states, and come to some solution.
- And you're probably willing to let this serve as a model for the federal government, right?
You don't have a- - Sure.
I think the Indiana model.
- You have the copyright model and the t-shirt concession.
- No.
Happy to help the federal government do their job.
- So last year, the buzzword obviously we just talked about was cybersecurity and data privacy.
Technology is still back this year, but with a bit of a different flavor, different twist.
It's more about how to deal with artificial intelligence, which I think I, for one, probably, I don't know when I learned what that term was, I think it used to mean associate instructor at the college level or some other things that weren't germane to the tech, the subject at hand.
It looks like, rather than consumers, this time we're trying to protect vulnerable children, victims of sex trafficking, or victims of so-called revenge porn that's manufactured through digital means.
We're looking at filters for cell phones and other digital platforms that might be available to young people.
Is this the next frontier we need to address in AI?
- Well, that's one, right?
And obviously that's top of mind for most of us as parents.
We're concerned about our children and the exposure because so many kids have a phone and have access to things that we could never have imagined, and so how do we put guardrails around that?
Making sure, too, that we're not trying to put anyone outta business except the bad actors, but how do we do that in a way that is enforceable?
And that's hard.
But I think some of the bills that are moving this year are pretty narrow and have been thoughtful in their approach.
- And I mean, some of 'em are easy to understand.
For instance, curbing the use of cell phones in classrooms.
I guess teachers and superintendents just say from this point to this point, you're gonna go plug into the charger and I don't wanna see 'em in hand.
But when you get into the safeguards about filters and digital platforms, that's a little bit tougher certainly to police.
How do we do that?
- It's very difficult. As we were crafting the legislation around deep fakes or fabricated media in elections, we had to be very- - Which is one of the bills you're on.
- Yeah.
- Yeah.
- We had to be very intentional and very careful about not just including all fabricated media, because if a wedding photographer were to touch up photos, technically that could be fabricated media.
It was digitally altered.
And so the emerging technology around artificial intelligence is vast and very, very interesting and has a lot of applications that can really assist our lives and make our lives better.
And it also has a lot of facets that can really be damaging and harmful.
And so sort of taking this piece by piece and going, okay, this technology's here, what are some of the ways we need to curb the use of it now, like in elections, making sure that what is put out there in something as important as an election is legitimate, is kind of the first step in, as a legislature, getting our heads wrapped around what this technology is, what it can do, what it's capable of, and how we need to rein it in, if at all.
- And let's, since the elections are looming, both primaries and general, let's focus on that sort of as we tick through these pieces of legislation.
Unlike your data privacy bill, which had a two and a half year lead time before it took effect, most of the bills that relate to election content that might be a misrepresentation or a deep fake, I guess is the term, they're deemed emergencies and would take effect on enactment.
Are we seeing what is, I mean, am I missing something?
Are things happening?
I know, in New Hampshire, there was the audio where Joe Biden purportedly told Democratic voters to stay home.
I'm no expert, didn't sound like it was very authentic to me, but I know maybe the technology at their fingertips wasn't what Hoosiers might have.
Are we seeing problems?
Why this hurry up offense here?
- I don't know if it's necessarily seeing problems, or also trying, at the same time, to be proactive.
And these are first steps, but it's difficult, right?
Because there's also the First Amendment and free speech.
And so when you're talking about political issues, and two sides may have a disagreement, at what point, you know, do you... I mean, the deep fake we think is fairly obvious, so it's a lot harder than one might realize in trying to craft these bills to allow the free exercise of political discourse.
And at the same time, maybe not necessarily being totally accurate in your information, not deliberately misleading, because the political rhetoric gets to a hot level, while making sure that people aren't deliberately manipulating information to the point of pretending the video or the audio is from someone when it's not.
So, it's hard.
I don't know if we've had any incidences in Fort Wayne.
We're pretty honest, straightforward people up there, but I don't think it's something we should be ignoring.
- And again, the details are always the challenge here. I think most of the bills were drafted with a 90-day window before an election.
I guess you can go crazy 91 days before, but you better walk the straight and narrow at 90 days.
And then the enforcement mechanism is interesting.
Different states have looked at different approaches.
As I understand this, the aggrieved party, the person whose likeness or words have been distorted, would have the ability to essentially bring a civil action against the alleged offender, as opposed to some states which have tried to assign criminal penalties, misdemeanors I think typically, to it.
Is that the way that we do this: if I show you doing something that you didn't do and put it in a commercial, it's up to you to come after me?
- Well, I think here again, we have a situation where we're trying to be proactive.
We see some of the fun parts of AI.
I mean, I came across something on Facebook where Joe Biden and Donald Trump were hunting together in the woods, so we're seeing the fun parts of what AI can do.
- I can imagine where the shotgun is pointed.
(guests laughing) - In those pictures it was very amicable, but we're kind of imagining where that could go.
And so, we are trying to be proactive.
So much of what we do as legislators is constituents bringing problems to us, and so we're being reactive to problems.
And I think when we see things like this, it's important for us to get out ahead of it.
And I think one of the challenges is what do those enforcement pieces look like?
Do we attach criminal charges to them?
Do we just allow the aggrieved party to sue?
That's where we've landed on this one.
Perhaps for some dynamics in getting it through both chambers of the legislature, but who knows?
We'll see what other states do.
We'll see how it goes.
And we can always amend the law if we need to.
- And it is a work in progress, even at this point.
In committee last week, when the House bill came over and was initially heard in the Senate, one of your colleagues said, this is great if somebody's views or likeness are distorted, but what if the deep fake and the AI manipulation is done to beef up or trumpet your own accomplishments, even if you've never had those accomplishments?
That's not addressed in the bill.
Do you think that's a gap that needs to be changed?
- Well, I think that's, I mean, to Representative Miller's point, that's the point, right?
I mean, we don't want this to be a fishing expedition for lawyers just to sue people.
And I think that's sort of the problem with where we are: you don't really know who's paying for some of these things, who actually formulated the deep fake to begin with.
So if there's a deep fake of me, and I'm in an election year and have an opponent, you don't want it to be attributed to the other side, so to speak, automatically.
You gotta be thoughtful about this.
And so I think just starting with civil penalties is probably the best way to go.
- And moving outside the election realm, the related bill, we already touched on it, whether you call it revenge porn, whether you call it child porn, created out of the imagination of a computer.
Those bills, we saw this session as well.
Again, work in progress.
I presume it's that notion of what is victimless, what is not victimless.
Have we arrived at the answer yet on something as complicated as that?
- No, I don't think so.
I don't think we'll be at the final answer for many, many years.
But it wasn't too long ago.
We saw just a couple weeks ago explicit images generated by AI of Taylor Swift splashed all over Twitter or X and- - She needed the publicity, I think.
- As a struggling artist, she could use the publicity.
- I'm kidding, to all the Swifties out there, no letters.
- Yeah, I'm gonna get letters too.
But we're seeing it, and we see non-AI-generated revenge porn as a huge issue, damaging psychologically, this sense of power over victims, and we recognize the huge issue that that is.
And when we have a technology that can just create it from thin air, that's a real problem.
And so, I don't think we're at the end.
I think we're at the very beginning of a lot of these discussions on emerging technologies and we'll get there.
- And one of the real-world scenarios, I believe, in talking to some of the authors, that might have set the stage for this legislation was an incident where a teacher had run afoul of a student in some manner.
To get back at that teacher, her head was put on somebody else's scantily clad, if clad at all, body.
That's the kind of thing this is meant to address.
- Right, and those are consequences.
You may think that the adult is not gonna be emotionally harmed, but there are consequences with respect to your profession, your employment and things like that.
And then as we all know, once they're out there, they're out there forever, right?
So maybe that individual's employer at the time is fine and maybe there are no consequences immediately, but that is the bigger concern.
And sometimes these things may be generated and happening and you don't even know about it yet.
So, it's gonna be really interesting to see when this bill is passed, how it's first implemented and the enforcement.
And if we've not gone far enough, then I guarantee you that'll be something we'll be addressing.
- And could you perhaps foresee litigation, too, from those who say it is an encroachment on the First Amendment, or, when it gets into the so-called child porn arena, if it's total fabrication?
Where does imagination start and where to- - I always, I gotta, I have to say, though, when anyone ever says there's a First Amendment right to porn, I always have to say there's not.
So, I just- - Oh, Supreme Court has wrestled with that issue for- - Yeah.
- But they know it when they see it though.
- Probably, apparently.
- All right, if our heads aren't spinning enough from the notion of cybersecurity and artificial intelligence, let's delve into a relatively simple subject like DNA and the structure of humankind in all life.
One of the bills you're promoting this session would put guardrails around the data that are gleaned from consumer test kits that we see advertised.
I think I've given some to people for the holidays, I've probably gotten a few myself. Talk about the legislation you have promoted this session.
- So this kind of deals with our very basic of privacy concerns.
Our DNA and the makeup of ourselves.
So many people are using them now for great purposes, to find out more about themselves and their families.
And when I was initially approached about looking at this legislation, the idea came up of who's pushing this, who wants this, why should we do this?
Obviously it's a good idea for consumers, but I was surprised to learn that some of the bigger companies that deal in this space were the ones that were kind of pushing for this legislation.
And it makes sense because if we are handling such private information, we should have guardrails around that.
And so it just lays- - Doesn't get more private than the building blocks of our genetic code, I suppose.
- Right, right.
And so, we wanted to put guardrails around the use of those and allow consumers a comfort level with using these products.
And knowing that while they're voluntarily giving their DNA, giving that data over, it should be protected in all cases.
- So more disclosure on the front end, more buyer beware sorts of notification.
- Sure.
- And then the ability to get back some of that data if someone wishes to have it back.
How do you balance that privacy notion with... we've heard about some of the celebrated cases, the Golden State Killer, I think, maybe I have the name of that particular serial killer wrong, but it was solved through a relative's DNA that was retrieved from a dumpster, et cetera, et cetera.
All that, I think, could be traced back to one of these consumer kits.
- We solved a young girl's murder, longstanding, probably almost 15, 20 years, because of the DNA.
But the flip side is then there's also people's privacy.
And I don't, I think some people don't quite appreciate how invasive it can be.
They willingly take the test and maybe they just wanted to do a genealogy tree and start the process.
But there was this quite interesting story a few months ago in The Wall Street Journal, where a sperm donor decided he would go and look up all of the children he fathered.
And think about that.
Someone knocks on your door and you haven't had that conversation with your child or you just didn't really want that, or someone's adopted and they don't want someone reaching out and you don't have to go through the normal legal processes that you used to.
And all of a sudden someone is sharing information with a family member that you had never planned to do.
So yeah, it's complicated.
And I think when people use those test kits, especially in the beginning it was kind of fun.
What's my heritage?
I'm not Swedish, I'm Irish, whatever, right?
All those ads that they were using to promote 'em, but now all of a sudden they're like, "Wow."
- [Jon] That's kind of a bit more complicated.
- Once it's out there forever, right?
- We were very intentional in crafting the bill to make sure that law enforcement still has to go through the procedures that they would normally go through to get something like the Golden State Killer.
- [Jon] Same sort of warrant, probable cause, that you would have for any evidence collection, right?
- Sure, sure.
And we wanted to make sure that we weren't interfering with that because there again, this technology can be used for a lot of good in solving the April Tinsley murder, the Golden State Killer.
It can be used for good things, but we need to make sure that we're harnessing that good.
- And of course, a lot of the things we've talked about, you've done your best, you and your colleagues to sort of craft what makes sense, tooling it, retooling it, tweaking it.
I'm intrigued, Senator, by what you've done with Senate Bill 150, which is essentially an acknowledgement that we don't have all the answers.
We can't, there's no way we're gonna have all the answers.
So you want to create a task force to look, initially at least, at how state government is using or should be using AI.
'Cause it, I mean, who knows where that leads?
Tell us your thinking on setting up that task force.
- No, it's interesting.
And when the bill was heard in the House, we had some experts come and testify, interestingly enough, against it, because they think government's too slow.
And a hundred percent we're gonna be slow.
But it's not even really top of mind.
And we need to understand that our agencies are already using this, local governments are using it.
And at the same time, another big part of that bill is instituting safeguards and policies all the way down.
I mean, our school systems have been hacked.
- That's right.
It's every division- - Yes.
- Division of government.
- Whether that's a research institution at the post-secondary level or a grade school.
- Exactly.
And frankly, many, many times all of our local units somehow were integrated all the way to the state level.
So if there's a bad actor at the lower level who somehow gets into our system, if you will, it can go up all the way up the chain and then nefarious things can happen.
So, I think that's the idea.
Let's start making everyone very purposeful and looking at safeguards, and at the same time understanding that these are opportunities here in the AI piece, but there's also a cybersecurity component to that bill.
- We'll be watching that.
I think the deadline, if that's enacted, would be October, so we'll see what comes of it.
All right, I feel better knowing that we have smart people who are trying to get ahead of these issues.
And I appreciate the insights that you've shared on this and perhaps Hoosiers have a better understanding now of what's involved.
Again, my guests have been Republican Senator Liz Brown and Democratic Representative Kyle Miller, both from Fort Wayne.
Now we have passed the halfway point in this year's short, but by no means quiet session.
Ever wonder why we have a long and short session in the first place?
We've got you covered.
Indiana's legislative sessions come in two varieties.
Long and short.
Here's how that shakes out.
In odd numbered years, when two year state budgets are crafted, the General Assembly meets for up to 61 session days, not necessarily consecutive days, and must adjourn by April 29.
Sessions in even numbered years like this one are capped at 30 session days with adjournment coming no later than March 14th.
Now, unless otherwise specified, Indiana's new laws will take effect July 1, the start of the state's fiscal year.
Wasn't always the case that we had a short session.
They were initially designed to deal with emergency legislation, but in recent years they basically morphed into truncated versions of their longer counterparts.
Don't think though, that nothing will get done in such a short period of time.
In a short session during Mitch Daniels' time as governor, right to work was signed into law.
And in 2022, lawmakers cut Indiana's personal income tax rate to 2.9%, proving a lot can happen during a short period of time.
Of course, no matter how long the session is, you can count on the team here at, "Indiana Lawmakers" to be with you every step of the way.
And time now for my weekly conversation with commentator, Ed Feigenbaum, publisher of the newsletter, "Indiana Legislative Insight," part of Hannah News Service.
All right, we've been in the weeds.
Let's climb outta the weeds.
Let's go up to the 40,000 foot level.
What do we see there as we look at these issues?
- Well, I see that it probably won't be back next year as a commentator because AI will replace me.
You'll have figured out some kind of generative way to do that.
- No replacing you, Ed, that I'm sure of.
- I think one thing that you saw on the round table is that this is not a partisan issue and it's really more of a philosophical issue and it doesn't really work into any kind of conservative, liberal split either.
And in the US Senate, Senator Todd Young from Indiana is working very closely with Senate Majority Leader Chuck Schumer on AI issues, trying to prepare the Senate for what's coming up.
And we're seeing the same kind of thing here.
There's a futures caucus comprised of some of the younger members of the legislature that are looking at these tech and AI issues.
- Recently revived, just a week or so ago.
- Absolutely, absolutely.
- Yeah.
- And this really is a new frontier.
You've got just about every state legislature in the country looking at this.
More than half of the state legislatures that are doing these kinds of things are looking at basically how to regulate deep fakes and how to regulate those kinds of content issues.
And at the same time, you've got the federal government involved in this.
Just on Wednesday of this week, the US Department of Justice announced that they're going to be seeking stricter penalties for problems with AI, for crimes committed with AI.
And they're trying to get at those deep fakes as well.
- You make the point about this not being a partisan issue, which explains why a lot of these bills that are still alive emerged essentially on unanimous votes from the House of origin.
What is the danger then, if there is no opposition and nobody's picking it apart, because of the rapidity with which these things are moving, that we may end up with something that nobody wants or expected?
- Well, I think that we've seen, in basically the last couple of years, the legislature starting to move toward correcting things that they've already done the session before without even seeing, you know, what the impact of those things would be.
So, I don't think that's any different.
And you look back and you think about how things have changed over the years.
Back when I was in college, if I had the internet.
I was there before the internet.
If I'd had the internet, my papers would've been a whole lot better.
If I'd had Google, my papers would've been better.
If I had had Google with the thing that finishes your sentence and all that, would've been better.
And then you've got AI too now.
And so things have totally changed, and our perceptions of what a good college paper, a great college paper, is today have changed as well.
And how those things are generated have changed.
And that's part of the problem that we're facing today.
This may be a solution in search of a problem or vice versa.
And we've had some of these same crimes before.
We're just looking at how they're generated today.
- Very good, Ed, as always.
Appreciate your insight.
- Thank you, Jon.
- Until next week, take care.
(majestic music)
