Palmetto Perspectives
Smart Thinking in the Age of AI
Special | 59m 40s | Video has Closed Captions
Artificial Intelligence is rapidly transforming how we learn, work, and interact with the world around us. This panel discussion will explore foundational AI concepts and examine the impacts of this technology in our communities.
Palmetto Perspectives is a local public television program presented by SCETV
Support for this program is provided by The ETV Endowment of South Carolina.
♪ ♪ ♪ Jada> Hello and welcome to this Palmetto Perspectives Education Special, Smart Thinking in the Age of AI.
I'm your host, Jada Samuel.
Recent reports show a significant increase in the use of AI tools in the classroom.
Globally, 86 percent of students report using AI in their studies.
Artificial intelligence is no longer something we're waiting on, it's already here.
Shaping how we learn, how we work, and how we connect with one another.
From classrooms to workplaces to the information we see every day, AI is becoming a part of our daily lives.
Sometimes, in ways we didn't even realize.
And tonight, we're focusing on what that means right here in South Carolina.
How technology is showing up in our schools, our economy and our communities.
I'm excited to have these experts joining me this evening.
On stage, we have Dr. Chelsea Richard, Vice President of Operations with the Central Carolina Community Foundation.
David Thornton, Technology Instructor at the Daniel Morgan Technology Center in Spartanburg and member of the SREB Commission on AI in Education.
Dr. Kedralyne Folk, AI Educator at the University of South Carolina, and Charles Appleby, Senior Advisor of the Office of Statewide Workforce Development at SCDEW.
And in our audience, we have a few VIP subject matter experts joining us from the front row.
Adrianne Beasley, Director of Strategy and Communication with SC Competes.
Monique Garvin, Director of the State Human Trafficking Task Force at the Attorney General's Office.
Stephanie Hite, with the Attorney General's Office on Internet Crimes Against Children.
And Nathan Hogue, Chief Information Officer at the SC Department of Administration.
We'd also like to welcome our streaming audience, joining us from Facebook and YouTube.
And the audience listening on SC Public Radio.
We invite our digital audience to engage, submit questions and be a part of this dialog.
Our Media Correspondent, Paulia Williams, is present to ensure that your voices are heard.
Tonight isn't just about understanding the technology.
It's about asking the right questions.
So let's get started.
A quick show of hands.
How many of you have used AI in the past week?
Well, yeah, I think the number of hands is not surprising at all.
There are a lot of opinions about AI, but we use it in so many ways, and probably ways that we're not even aware of all the times we do.
So what exactly is AI?
To help us get started with our conversation, we asked a generative AI system how to explain artificial intelligence to someone who is five years old.
Our Social Media Correspondent Paulia Williams has that AI generated response.
Paulia> You know how your brain learns things like, how to tie your shoes or recognize your favorite song?
Artificial intelligence is like teaching a computer to learn in a similar way.
It's kind of like a really fast learner that studies lots and gets better over time.
It has no real brain.
It doesn't feel things, like a real person.
It's just really good at spotting patterns and helping us with tasks.
Jada> Paulia, I think that's a pretty good definition, but I want to go to Chelsea and ask, what would you add or change about this definition?
Dr. Richard> I think the thing that I would change is that it can also be kind of whatever persona you want it to be, right?
It can be really specific or it can be really broad too.
So that learning can be in a specific place... of the internet.
Jada> Yeah.
Anybody else have anything that they would like to add?
Charles> I think it's important, especially at a young age, to tell those children that AI doesn't know right from wrong.
So when you're asking it questions, it's not going to have that perspective in it.
And it's something that humans need to bring to that conversation.
Jada> I like that.
Anyone else?
Dr. Folk> I think, I would echo that it is very important to always have human oversight.
Anytime you're dealing with any type of generative AI.
Jada> Okay.
Let's talk about how AI is being used specifically in education.
Paulia had an opportunity to talk with a small group of students at the University of South Carolina about their thoughts on AI and how they are using the technology in their studies.
Let's take a look.
Student 1> I think of, of course, artificial intelligence- Student 2> Like network- Student 3> ChatGPT.
Student 1> Chat GPT.
Student 3> Software that we just use to generate ideas for us.
Student 4> Honestly, I think AI is more of a shortcut.
I think it's very helpful in certain ways, as long as it's being used the correct way.
Student 3> I always use the term like, "water is good for you."
Too much water can kill anything.
Student 4> We're seeing it more as a cheating shortcut, which I think is robbing us of true education.
Student 1> It can be a good learning tool, but honestly, people are probably just going to do it to do homework and stuff.
Student 6> Being able to put it into something and it explaining to us, and all you got to do is read it.
That's better than not having it at all.
Student 2> You can like put it in to make a study guide.
That way, so it's not technically cheating, but it's also helping you.
Student 6> Any type of way that I put it in there, let it break it down to me so I can read it a long time.
Or make it into flashcards or do something.
Student 3> I think if we use it as a tool for good, it can be great, but it's just... if we use it for bad, of course it's gonna be bad.
♪ Jada> There are so many times where I wish that I had AI when I was in high school or college.
But, David, I want to know what can you add about how students are using or experiencing AI in their classrooms?
> Well, some of the students said, you know, shortcuts, but I don't like using shortcuts.
I prefer to say they're using AI as a tool to enhance their learning.
And I do feel like in the classroom, you know, our students are using AI to, help them to gain more reps, repetition in their studies.
They also use it in research.
It gets them to their research faster, to add to their papers and documents and things like that.
Also, AI is used to, I think bring in ideas, creative ideas.
To give them a start, a faster start in moving forward.
Jada> Yeah.
Dr. Folk, you are a former K-12 educator, so, I want to know what your thoughts are on that.
But also, how do you account for the use of AI in classrooms for students with varying learning styles?
> I believe, as I reflect on my K-12 experience, that AI can be a very helpful learning tool when students are taught how to properly engage with it, responsibly.
And so now when we look at AI being embedded into a lot of our shows and things, apps that we interact with on a day to day basis, the teaching part of how to interact with AI is even more important.
Jada> Yeah.
David, how is South Carolina addressing some of these policy needs around AI and some of the procedures that we have around AI usage?
David> I served on the SREB Committee and Charles, as well.
And we have, worked in different groups.
I was part of the K-12 group.
We have post-secondary, and then we have a kind of administration level, as well.
And Charles can talk about that probably a little bit more.
But in my area, the K-12, we have set some guidance.
Not set in stone because, from what I understand, the State Department will be the ones to make that final decision.
So I can't speak for them, but from what I've seen, basically, each district is kind of using the guidelines we kind of put out there to kind of guide them of how they may want to approach setting policies and procedures in regards to AI and guardrails, things like that.
Jada> Yeah.
Charles, AI is sometimes developing faster than we can develop policies.
So, what are your thoughts on what's happening here in South Carolina?
> I think South Carolina is, really ahead of the game in this area.
As David mentioned, the commission at Southern Regional Education Board is actually co-chaired by Governor McMaster.
And so it was our leadership that was, forward thinking in this.
And, the commission was comprised of 16 states, business and education leaders, whether it was K-12, post-secondary, workforce managers.
We even had engineers who were building robotics, that were part of this, groupthink.
And again, South Carolina's leadership was the one going, like you mentioned, in the beginning "AI is here."
It's not going anywhere.
So let's get at the front end of it.
And start looking at how we can, form those policies, how we can make sure that it is truly a tool that can help us move forward, while also balancing, the risks that may be involved.
Jada> Well, David, what does it look like in class when people are using AI?
David> It's fun.
[laughter] I'm gonna encourage you.
I love it.
Don't... sit back.
Try and learn what AI is.
Become, knowledgeable.
I think that's the main thing.
I think a lot of people are nervous, very nervous and scared.
But, learn what AI is and how AI can be used as a tool to help you and better educate you and to help you get to an end task.
But in a classroom, we are sitting there taking data, and we're actually taking that data and we're cleaning the data.
We're organizing the data, we're prepping the data.
And, to be used in a machine learning model.
And this machine takes that data in and basically it can do some testing and give us some predictions and outcomes, maybe even new discoveries, you never know.
Might have a discovery for cancer because, what- There's two types of data, one is structured, one is unstructured.
And I don't want to go into big depths of this, but the unstructured data is the unknown.
And that might be where we can find a lot of solutions to a lot of things we haven't even thought of before.
So the kids are loving it in the classroom.
They're excelling at it.
Right now, I have my students working through, manufacturing curriculum in AI.
And they are going into Microsoft Azure Labs, and they're literally building those pipelines of data.
And that data, again, is building their machine learning models and getting outcomes and results.
And, it's engaging, having hands-on and engaging in these, areas.
Students are, wanting to learn more.
They're not bored.
Can you believe it?
They're not bored.
And, it's so exciting to see their new discoveries and just see them excited about it.
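The workflow David describes, taking raw structured data, cleaning and prepping it, then feeding it to a machine-learning model for predictions, can be sketched in a few lines of plain Python. This is a hypothetical toy example with made-up numbers, not the Microsoft Azure Labs curriculum his students actually use:

```python
# A minimal sketch of the described workflow: clean structured data,
# fit a simple model, read off a prediction. Data is invented.

# Raw structured data: (hours_studied, test_score); None marks a missing value.
raw = [(1.0, 52.0), (2.0, 55.0), (None, 60.0), (3.0, 61.0), (4.0, 64.0)]

# Step 1: clean -- drop records with missing fields.
clean = [(x, y) for x, y in raw if x is not None and y is not None]

# Step 2: "train" -- ordinary least-squares fit of y = a*x + b.
n = len(clean)
mean_x = sum(x for x, _ in clean) / n
mean_y = sum(y for _, y in clean) / n
a = sum((x - mean_x) * (y - mean_y) for x, y in clean) / \
    sum((x - mean_x) ** 2 for x, _ in clean)
b = mean_y - a * mean_x

# Step 3: predict an outcome for an unseen input.
def predict(hours):
    return a * hours + b

print(round(predict(5.0), 1))  # predicted score after 5 hours of study
```

A real classroom pipeline would swap the hand-rolled fit for a managed model, but the stages (clean, train, predict) are the same.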
Jada> Yeah.
You talked about the results and, Chelsea, you work in the nonprofit world, and that's all about the results and making an impact.
So how are you using AI?
And how do you see students using AI to support the work that you're doing?
> So at Central Carolina Community Foundation, we got a large grant from Google.org.
We are one of five communities across the country that got this grant.
The other ones were San Francisco, New York, Austin, Atlanta and us.
So we have a cohort of 20 nonprofits across the state.
In common, they're all human service nonprofits, but each has identified a pain point in their organization that they're using AI to help solve.
So this could be anything from operations to fundraising, to communications and a lot of what we're seeing are incredible efficiency gains.
Nonprofits have a lot of needs, wear a lot of hats, and have limited resources and limited capacity.
And this is helping them work really efficiently.
There are incredible stories across the 20 nonprofits that we're working with.
And we as a community foundation, also raised our hand to say "we want to learn too."
So we're the 21st team at that table, learning alongside them.
So there are local nonprofits here in the Midlands that are now providing resources to clients immediately upon intake, where it used to take three days to see a case manager.
They're setting up driving routes to more efficiently serve three times as many people as they've been able to serve in the past.
Using it for their employee onboarding so that they can learn on the fly when they have staff scattered throughout the state.
So, it's been really incredible to see the efficiency gains that those nonprofits have seen.
And nonprofits are so ripe for this.
And the momentum is totally there.
Jada> Yeah, definitely changing the game.
Well, I want to ask our front row guests, how are you seeing AI being used in the classroom?
Stephanie> I see it a lot.
So I travel the entire state talking to children about what they do.
And what we see most are middle school kids who come talk to me after I do a presentation.
And theirs is using AI as a person to talk to.
A lot of them are finding boyfriends and girlfriends on there and morphing those into relationships.
Some are writing papers.
So there's different varieties that we see that they're using it, good and bad.
Because I work in internet crimes, we do see a lot of the negative side of AI, people using it in not-so-pleasant ways.
But again, we have to see how to solve the problem, how to teach our children.
And that's what I do, to go into the classrooms and teach them the right ways to use it and what consequences can occur, good or bad, in those situations.
Jada> Absolutely.
Anyone else have examples that they would like to share?
No?
Charles> I think one of them, and the commission that David and I serve on really realized this, is the ability for AI to help with individualized learning.
So not only does it excite them, it helps them really gain the skills.
It can adjust reading levels.
It can provide explanations in different ways.
And it really helps the child, to ensure they're gaining their skills.
But sometimes different children learn different ways.
And so, it never replaces the educator, but it provides that additional supplement, to really help ensure students are learning the skills they need.
Jada> Charles, that's actually a really great transition.
Because we learned in a recent survey by the U.S. Department of Education that 75 percent of students say AI is a valuable learning tool.
We asked our student group about this, as well as how they might use AI to help them with choosing a career.
So let's take a listen to what they had to say.
Student 3> I just think we are in for a different way of life and just how we do things.
New innovations, new technologies are coming that we haven't even predicted.
And so I'm excited to see how that works.
Student 1> It's very helpful for an AI to, like, be a tutor and not actually like, give you an answer.
But kind of guide you on how to get to that answer.
Student 4> So what I would do is I would use AI to put it in there, understand what question I'm supposed to be answering.
And from that develop a good concise answer.
Student 3> But like it can tell me things, such as other people's opinions, parts of the job that I may not know.
So it can give me insight into that.
So like, I can be more informed about picking my career, but I wouldn't say I would let it pick my career for me.
Student 6> I would trust it to help me give ideas that if maybe if I was unsure what I wanted to do, I would let it give me different ideas to come to something.
Jada> Yeah, I think, I could have used, some additional support when I was in high school or college figuring out what I was going to do next.
I think we all could use a little assistance there.
But Charles, hearing those student responses, how do you think AI will impact South Carolina's workforce in the next three to five years?
Charles> I think as the students mentioned, it's all about job evolution and making sure that we help individuals adapt as soon as possible.
And that adapting could be gaining the new skills.
But it also could be learning that the skills they currently have could open them up to different careers they never thought about.
This is actually something the state is already working on; for example, the Find Your Future platform being built by the Coordinating Council for Workforce Development is starting to bring that to fruition.
The ultimate goal is to allow individuals to put in education and work experience and have it break them down into those skills.
So think of a recipe but then going "what are the ingredients involved with it," and then showing individuals wait, you can put those skills in a different order and actually be qualified for jobs you never thought of.
And this not only helps the individual, but it also helps the employer because it opens up a pool of employees that they never thought of.
Because again, we took it down to those individual skills that individuals knew, not just the high level credential or the high level job title.
So it gives them more options, that are out there.
David> May I add to that?
Jada> Yes, absolutely.
David> I totally agree.
I think AI is going to be raising the bar of education.
I think our workforce is going to be better educated; they can go back to school or take some kind of night class, side class, and learn a little bit about this AI.
Because as we move forward in industries, you're going to see that dashboards, a lot of dashboards are going to be created and they're going to be used all across every industry that's out there.
And they're going to have to adapt to that education.
And because they get that higher learning, their pay will also increase too.
Now, yes, the entry level jobs may go away, but they will be adaptive to a new entry level job that creates new jobs with AI on its side or at hand.
So, I think that's something we need to think about as we move toward the future.
Jada> Yeah, I know Chelsea, you talked about it expanding capacity for people who might have small organizations or business.
But Kedralyne, can you talk to us a little bit about what you think it's going to do in terms of the workforce and students?
Dr. Folk> Yes.
When I think about my role as an AI educator, my job is to translate very difficult AI topics into something that's digestible for faculty, staff or students.
So creating a foundational understanding across campus of what AI is was very important, when this job was created in October of 2025.
And so one way that we're doing that is through our Garnet AI Fluency course.
It's available to all faculty, staff, and students on campus.
It offers three modules you can go through in less than three hours, to give you that foundational understanding of AI.
And then you can ask more questions.
But one thing that we made sure to include in that course is how to prompt the AI properly.
And, really do more prompt engineering and having a framework for that.
And so when we think about careers and, students and their futures, them being able to prompt engineer and work through their workflows is going to be critically important.
Additionally, you both mentioned how it's so important that students are able to adapt to the new world, that they're in to the new environment.
And so, yes, a new technology may drop tomorrow and you may need to know how to use it.
But you should be flexible and able to kind of adjust and adapt so that you can continue to work, continue to grow in your career.
Jada> Absolutely.
David> And to add to that, I agree, a utility player, because at Daniel Morgan, that's one thing we teach our students is to be, well-rounded.
And so it's not just only the hard skills we're hearing all across the board from industries, "We need soft skills."
And these students have to come up with soft skills.
So we're not to be that numb robot, okay?
We want somebody that can talk the talk and, both sides of it.
Either the high-techy AI stuff or down to the common stuff where we've got to make decisions, you know?
And what's nice with this artificial intelligence, my students are preparing for the workforce either right out of high school or post-secondary, to come to USC or wherever their choice of college is.
But the idea is they're walking out of that classroom, if they apply themselves and stay successful, they can come out of Daniel Morgan Technology Center with a Microsoft Azure Industry Certification recognized across the world.
<Wow> Charles> She doesn't brag on it enough, but USC had an AI conference today where professors from across different colleges were literally talking about it.
I think there are higher-ed institutions, USC, Lander University, Tri-County Technical College, that are really adopting this to ensure that the students who go there, no matter what college they're in, whether it's journalism, whether it's finance, really understand how AI can help them in those fields.
Jada> Yeah, wow.
Congrats on that.
I think we all need an invite next time.
But you talked about those skills and those soft skills and the hard skills.
Chelsea, I want to take a minute to talk about the program that you're working with.
And maybe some of the skills that you and some of the people that are in your cohort are gaining from that program.
> So it's definitely the technical skills, the prompt engineering, like Kedralyne said.
But I think part of what has really come out of this is not being afraid to fail.
Right?
Cultures of experimentation within organizations, within the teams implementing this and really that iterative process, right?
In nonprofits, we have a lot of space to play and experiment, but not a lot of time, because there's also really critical work that we're trying to do, right?
So having that time and protected space together to experiment and learn side by side with our peers has been really valuable.
Jada> I love that, time and space to experiment.
I mean, who doesn't love that?
Adrianne, I know that you are with SC Competes, and so I want to hear from you about how our workforce is going to be impacted in the next three to five years, and what skills and trainings do you think that students should really be focused on during this time?
Adrianne> Yeah, I think it's a very similar conversation to the one we have when we talk about AI in education.
Because AI is here, and we have a workforce that also needs to learn those skills, sometimes alongside our students that are learning those skills.
And that impact is about both literacy and competitiveness.
So, on the literacy front, you know, those human skills, those like, critical thinking, communication, teamwork, those success skills that are laid out in the profile of the South Carolina graduate.
That profile's been here since 2016.
So leaning on those skills, for not just our education, but our workforce is incredibly important.
And then on the competitiveness side, we really have an opportunity to remain AI forward as a state.
We really like, Charles said, we came out ahead, and I think the alternative is getting left behind.
So I think that we're headed in the right direction.
And on the competitiveness, it's about supporting those high-tech companies that want to land here, grow here, employ here so that we have more high-tech jobs and we just lift the entire ecosystem.
And I think AI gives us that opportunity kind of like nothing else we've seen lately.
Jada> I love that.
Does anyone else have anything to add to that conversation about workforce and AI?
If not, then I want to head on over to Paulia because she is engaging with our online audience.
So Paulia, I want to hear from you.
What are people saying online?
> Thanks, Jada.
There's some really good conversation in the chat.
One question that I have here says, "How do we, redirect AI misinformation?"
This is specifically for parents, supporters or caregivers.
"How do we ensure that good information is getting out to our toddlers?"
Jada> Wow.
That is something heavy to think about.
Does anyone in the front row have comments?
I heard some chatter here.
Nathan > I can make a comment.
Jada> Yeah, absolutely.
Nathan> So I'm not sure about the toddler part, but I will say that with AI, you don't just implement a system and walk away.
So, it needs continuous care and feeding.
It needs, human oversight and human in the loop.
It needs to be continuously supported, secured, making sure it's privatized.
All those kinds of things, need to happen to ensure that we have a good tool and good AI model, producing outcomes that we would expect, that tool to provide.
Jada> I think that was a perfect segue into the next part of our conversation.
Because understanding that AI is shifting workforce demands and is being used in schools is important.
But let's shift the conversation to talk about those who are concerned about AI and its growing impact on our communities.
So, I actually do want to go back to Nathan and ask what policies and procedures are in place at the state level to support the rapidly evolving use of AI?
> Yeah, I couldn't agree more with Charles and the panel up front.
They took all the good answers.
But I would say that we saw this coming as a state and our state leadership, embraced AI.
And we wanted to lead with AI, and I think we've done that.
States are calling us about how we are managing AI and overseeing AI.
About two years ago, we published an AI strategy.
You can go to admin.sc.gov and review that strategy.
You can also go use our online state assistant at SC.gov. His... Its name is Bradley.
It's the state dog, a Boykin Spaniel.
It can help you find information about AI and how we handle AI from a state perspective.
I think we did an important thing by putting together the strategy.
We also assembled, professionals from every state agency and higher-ed, to come together, to figure out how to, learn from each other.
Right?
Because I think it takes a village to, to go into this novel technology.
The other important thing we did was establish, a Center of Excellence.
And that Center of Excellence, looks at any AI use case for a state agency before it gets implemented.
We look, at it from a security perspective.
We look at it from an operational perspective.
What kinds of problems are you trying to solve?
How are you going to measure it?
How much is it going to cost?
Those kinds of things.
So any, use of AI is, rigorously tested before it goes into production.
Currently we're looking at about 135 distinct use cases in the state of South Carolina, from 35 separate agencies.
And a lot of interesting things, and problems being solved out there with AI.
Jada> Okay.
So just to clarify, Bradley is something that people can use to help learn about AI or is that an AI platform?
Nathan> It is our- It was one of the first things we did as a state, when state leadership said, "hey, AI is coming and we want to be first."
We stood up an AI virtual assistant, a state virtual assistant.
You can go to sc.gov. You have the option to use it.
You don't have to use AI.
If you'd like to use it, it's in the lower right-hand corner of the screen.
You can interact with it in a very human way.
You can talk to it.
You can use your keyboard however you prefer.
It harvests information from state websites from around the state.
So, let's say you want to, be a notary.
You just type in or speak, "How do I become a notary in the state of South Carolina?"
And it gives you not only an answer, a human-like answer, but also the sites where that information is coming from.
So you can dig in and validate for yourself.
And that's supported by a very secure, and sustainable program, delivered through Tyler Technologies, who's a partner and vendor for the state.
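The pattern Nathan describes, pulling information from known sites, answering, and pointing back to the source, can be illustrated with a toy sketch. This is not how the state's Bradley assistant is actually built (that system is delivered through Tyler Technologies); the pages, URLs, and word-overlap scoring here are all invented for illustration:

```python
# Toy retrieve-and-cite sketch: pick the stored page that best matches the
# question, return its text plus the source it came from. All data invented.

PAGES = {
    "https://example.sc.gov/notary": "To become a notary in South Carolina, "
        "submit an application to the Secretary of State.",
    "https://example.sc.gov/ai-strategy": "The state AI strategy is published "
        "by the Department of Administration.",
}

def answer(question):
    # Score each page by how many question words it shares, cite the winner.
    q_words = set(question.lower().rstrip("?").split())
    best_url = max(
        PAGES,
        key=lambda url: len(q_words & set(PAGES[url].lower().split())),
    )
    return PAGES[best_url], best_url

text, source = answer("How do I become a notary?")
print(text)
print("Source:", source)
```

Returning the source alongside the answer is what lets a user "dig in and validate" the response, as Nathan puts it; real assistants do the retrieval with search indexes rather than word overlap, but the cite-your-source shape is the same.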
Jada> Well, thank you so much for sharing that.
<Absolutely> I would like to hear from you all.
And then also our front row guests.
What does responsible AI use look like and, and what resources are available to help people learn how to use it, responsibly?
Dr. Richard> As part of the nonprofit cohort that we were working in, one of the first things that we learned how to do was to think about AI acceptable use policies within organizations.
We have one at the foundation, and ours is more of an approach, a philosophy of how we're going to approach AI, what we use it for, what we don't, and how it aligns with our organizational values.
And that has made our team feel safe.
There are guardrails here.
We have an approved tool that we have purchased that is a confined workspace for us to work in.
And that has really helped sort of spur the curve, because people were holding back.
And now we're like, "here's the safe sandbox to play in.
Here's the rules."
And that has really helped folks.
But that was a process.
We co-designed that policy with our staff, and we went through a learning process.
And then we have a frequency that we look at that regularly to make sure that it's still on board with where we are.
David> You got to remember, you know, AI is a tool, right?
And so, the tool can be used for good or for bad.
And what you put into AI is what you're going to get out.
And so it's very important. To the question that was online, I think that answer, too, lies in parental guidance.
And turning on certain filters or blocks, or checking a student's phone, a child's phone, from time to time.
Social media again... there's a lot that's going on there: there's predators, there's jokesters, there's misleading information.
And again, when we're teaching AI in the classroom to the students, we make sure they understand that, you know, hey, we can't be biased.
And so you got to be non-biased, all right?
But there's going to be some people that are going to put, I think somebody said it earlier, their opinion in it.
And so that's, that's guided one way.
And so, are they guiding people the wrong way?
Are they misleading people? Are they looking to do harm?
You know, and so we got to have those, guardrails up and ready to fight these things off.
And, I think we got to put things into place and follow and verify.
Validate, verify, validate, verify.
Jada> Yeah.
Anybody else want to talk about that?
Dr. Folk> I agree with that, as well.
And I would say at USC our number one goal is to ensure that students are totally aware of how to use AI responsibly.
And as we look at doing different workshops and trainings across campus, our departments are identifying pain points, as Chelsea said.
And then utilizing the AI, to solve that problem.
So starting very small, piloting something.
See if it works for you, then maybe sharing it with a colleague or a coworker and then allowing it to kind of organically grow and expand.
And in those conversations across campus, responsibility is always at the forefront of everything that we do.
David> Talk a little bit about ethical.
Dr. Folk> Alright, yes.
The responsible, ethical considerations of AI.
And so I distinctly can see a PowerPoint slide in my head right now, [laughter] that I have it memorized, but... that we always share out when we do those workshops and trainings across campus.
Because there are different ethical considerations for different departments in different colleges.
And so one thing that's really cool about the AI educator program at USC is that we have the time to tailor and personalize all of our AI presentations, specifically to the college or department.
So you're not just getting general information when you meet with an AI educator.
You're getting something tailored and specific to you.
Charles> And in terms of materials, I think Nathan mentioned the state's AI strategy from the Department of Administration. While it's for the state and state agencies, it's a good example of an organizational framework.
Additionally, SREB has materials online for K-12 classrooms, as well as workforce skills.
So, visiting the Department of Administration's site and SREB's site, those are good places to start finding some of those resources.
Jada> Yeah.
I mean, wow, lots of resources here for people to find information about K-12 and the workplace.
And I love that we had that conversation about ethics.
Because I think that's a concern for many people.
So I do want to turn to our front row guests, Monique and Stephanie.
And I want to ask you both, how is AI involved in crimes like sexual harassment, cyberbullying, human trafficking, or other violations?
And what should we be looking for in these instances?
> That's a great question.
So I know that we've talked at length about the ways in which AI can be used responsibly and ethically.
But the truth is, we know people are going to use it for good and for bad.
And while we may be in school settings and workplace settings that could be training us on how to do that, there are people who don't have that training, who don't have that education and outreach.
And so what we do know is that AI is being used at an increasing rate to facilitate crimes such as cyberbullying and human trafficking and sextortion.
And so it is up to us as a community, as the state agency, as the Human Trafficking Task Force, to really make sure that we are educating our communities.
Going into schools and not only training our students, but also our parents as well.
We know that AI allows bad players to engage and to exploit minor victims of various types of crime through processes like deepfakes, where they are creating material and content online, and profiles that are seemingly real but really aren't.
And it really does, you know, target our trust in technology in many ways, because we're having a difficult time differentiating between what is real and what is not, what is true and what is not.
And so our kids are also navigating those very complex worlds as well.
And so we also know that when AI is facilitating, or helping to facilitate, human trafficking, it can be very difficult to pinpoint, investigate, and prosecute those crimes as well.
Because, lots of perpetrators, they're able to develop the content, develop the profiles, and develop multiple profiles at the same time.
So there may be multiple victimizations happening simultaneously.
And so that is why the work that we do in our office- And I'll let Stephanie share about what she's doing.
But that is why we are engaging with our middle and high school students about human trafficking and online exploitation, because we want them to feel prepared and have adequate resources to respond.
And so, that is sort of our role in this work.
Just like social media is still growing and we're still learning about it, AI adds another component that we had not considered 20 years ago, or at least I hadn't.
And so I think we have to stay vigilant, understanding that we don't know everything about what we don't know yet, and really being humble in that.
Jada> Yeah, absolutely.
Stephanie> And I'd like to say, luckily, we now have an artificial intelligence law, where if a predator produces any images or videos using AI, it is now considered a felony, which is now protecting children.
Before then, a lot of predators were set free, because the view was that those are not real children.
And now that we have this law, it doesn't matter if it is a real child or not; it is portraying a real child, and therefore it is against the law, illegal now.
Which is protecting children a lot more, from what we see.
Jada> Well, I think those are concerns for many parents across our state and, everywhere, globally.
And there are also lots of other concerns about AI.
Like as much as we've talked about the great things, I think that there are some concerns, that people have about the presence of data centers in their respective communities.
What are your thoughts on people having those concerns about data centers in their communities?
David> I think people need to be educated.
What is AI?
What is a data center?
I think folks that are trying to bring data centers into communities need to be as transparent as possible and educate those communities.
What all is going to have to take place for a data center?
You know, how much power and energy is going to have to be used?
What are they going to do about the water that has to cool things down?
How are they going to recycle that water?
What are they going to do with that water?
You know, taxes, is that burden going to be on folks of the community?
I think a lot of these things need to be put in place in stages, and we need to educate folks so they have a clear understanding.
Because, again, I think for a lot of people it's our human nature: we go to the negative when there's something we don't know about, we have our guard up, which I understand, but I think transparency is key.
Jada> Absolutely.
I do think there are valid concerns.
Right?
But I would love to hear from the rest of the panel.
Dr. Folk> I'll add that, when we think about the data centers, we do have one on USC's campus, and many people didn't know that at all.
Because it's just a background building, and it is not causing any issues for the community around it.
And so there is a correct way to have a data center in a community, where it's not harmful.
Also, I will say that we have looked into our data center, and we found that it is using about the same amount of resources as other buildings on campus.
With consideration to data centers going into, small communities, rural communities across South Carolina, I agree that transparency is absolutely important, but also accountability and human oversight.
So please, tell us what's going on, but also hold yourselves accountable to those rules that we discussed through our community meetings through our town halls.
And then ensure that there's human oversight for every step of the way.
Jada> I like that.
Does anyone else have anything to add?
Chelsea, Charles?
If not, Charles is getting a sip of tea after that.
I would like to know if there are some conversations happening online?
So, Paulia, thank you so much for monitoring our social media activity.
Do we have any questions for the panel from our social media audience?
> Yes.
I'm really glad that we just answered the question on data centers.
That was, a question in the chat that needed to be addressed.
And I'm glad we understand transparency is needed because South Carolina folks need to know.
And there is another question coming from YouTube right now from Trina.
She would like to know "what are some good resources, for parents and adults to use AI?"
And I think that could go to one of the front row guests, because we did mention that a little bit before.
But if we can expand upon it.
Stephanie> The National Center for Missing and Exploited Children does a great job, explaining it at a level that it can work with kids as young as kindergarten.
It's called NetSmartz, at netsmartz.org.
And they actually do cartoons and comics in order to get children to understand it a little bit more.
They are also at the forefront for generative AI, helping explain what that is in terms that, you know, anyone would understand very naturally.
Another place I would go to, that I recommend, is Common Sense Media.
It goes through not only generative AI stuff, but other internet safety platforms as well, to help you understand what is happening online and give you a better idea, in simpler terms.
So those are the two that I highly recommend.
Jada> Thank you for sharing.
Adrianne, you were nodding.
I would love to know what you have to add to that.
Adrianne> Yeah.
Kind of speaking more towards the workforce that's interested in learning more about how to implement AI.
And like our panel talked about, there's that necessity of building your own guardrails, because this is a tool that doesn't come with embedded guardrails.
There are a lot of local resources and community resources for learning how to use these AI tools.
We at SC Competes are launching an "AI In The AM" summer webinar series.
It's basically going to be like AI office hours, where we'll have AI experts just sitting on our Zoom webinar platform.
And then doing breakout rooms specifically focused on, you know, the workforce that doesn't have those guardrails or policies in place, you know, small businesses or parents, to give them the ability to go in and ask any question because with this, there really are no stupid questions.
And we're all trying to learn as we go sometimes.
So I think resources like that help; we were just up at Winthrop in Rock Hill for our conference a few weeks ago, and they offer a continuing-ed program on AI that's focused on kind of the older generation, teaching them how to avoid scams and how to learn, you know, the cybersecurity risks of personal use of AI.
And kind of that first steps of AI use in the home.
Jada> Yeah, I love that.
I think there should definitely be a lot more conversation around our elderly community and some of the, potential scams that exist.
But I would like to know from the panel as well, back to Paulia's question.
What do you all have to add to that?
Where should parents and caregivers be going for tools and information and resources?
Charles> I mentioned it a little earlier, but again, SREB has materials for the K-12 classroom, and I think those can be molded to parents of younger individuals as well.
Because it's that same sort of relationship between an educator and a student: making sure that we're safeguarding them while also teaching them.
Because, as some of the other panelists mentioned, if we just ignore it, then curiosity ends up catching up.
So making sure that we address it with the children and go, "hey, this is out here," and provide them a safe space to ask questions, so that it can be an open conversation.
David> And I can back that, too.
I'm on the AI design team as well, besides the commission team.
And as we developed the curriculum for K-12, we did take into consideration: let's put it at a rigor for a high school student to learn AI.
But also keep in mind, we're going to start maybe trying to look at feeding this into middle school and elementary school, and start letting them learn a little bit while they're young and take those baby steps.
And because we're in the information world of technology, children are being born into this technology.
And we were just talking about the different generations and the gaps between them, and how to bridge the gap between those generations to try and get people educated and keep it moving, to benefit us all.
And so that's key, I think.
Dr. Richard> If I could add.
<Yeah> We were talking about adult learners earlier, too.
I think that that's a really critical group.
There are some resources through Google called "Grow With Google" that help small businesses and nonprofits think about the tools that are available to them.
And it's quick, like, online modules.
I think the key here is not "I went to one learning session and now I know everything," right?
It's to continually learn and continually sharpen that tool as things are progressing so quickly.
Even at "AI Day to Day," I learned two new things that I didn't know.
And it was great.
So that... it's that continual process that we should all be in.
Jada> I like that. "Grow With Google" is what it's called.
Awesome.
I would love to know if any of our audience members have any questions for our panelists?
Yes, we do.
A microphone is coming your way.
Audience member> Good evening.
I do have a question.
And I ask this question at my conference as well.
Now, dealing with deepfake videos, I asked this question to a lawyer.
Because I work at Denmark Tech: how do you deal with it at school if one student decides, I want to go scorched earth on a professor or the dean or whatever the case?
So you deal with a deepfake video: they decide they're going to go rogue on a particular person, and they put out a video.
We know it's false.
Well, I know it's false, but HR doesn't.
Now that video is out there and it's embarrassing the school and everything else.
And now we have to terminate that employee.
How do you deal with that?
Because you don't have the resources to tell that's a deepfake video.
Because now AI has fired that employee.
Jada> That might be a question for our front row guests?
Audience member> Nathan knows I asked this at the conference.
Dr. Folk> It's an internet crime.
[laughter] Charles> That's exactly right.
Jada> So we'll make sure the microphone is passed.
Stephanie> So as I said, the artificial intelligence law now will help with that.
Also National Center for Missing and Exploited Children has a thing called "Take It Down."
It is a way that anything that goes out there onto the World Wide Web can be deleted using that software.
And it's pretty cool, I think it's very cool: each image that we produce has a thing called a "hash value."
It's like a fingerprint.
And what they do is put it into- And I do it- I talk to kids, so I'm going to kind of do it the way I do it with the kids.
It's where we put it in the software, and we put that hash value out into the world wide web, and it's like "Space Invaders."
They go around and they obliterate all those things that have that hash value and completely delete it.
The cool thing that I think is also good to know is that if that image, say, comes back up in three years or five years, that hash value is still in the system and can delete it again.
And so again, "Take It Down" is the way; I would go through the National Center.
It is free.
It's anonymous.
All you have to do is call the cybertip line and they will explain exactly how to do it.
I've used it with, I think... 100 kids since I've been doing this, teaching them how to do it, how to protect themselves.
And so I would highly recommend that you really research and look at that, if that does happen.
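Stephanie's "Space Invaders" description is essentially hash-based matching: compute a fingerprint of an image, keep it in a registry, and remove any copy whose fingerprint matches. The sketch below is a conceptual illustration only; the real Take It Down service uses specialized image-matching technology, and the function names and sample bytes here are hypothetical.

```python
import hashlib


def hash_value(data: bytes) -> str:
    """Compute a fingerprint ("hash value") of an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()


# Registry of hash values that have been reported for removal
# (the sample bytes stand in for real image files).
reported_hashes = {hash_value(b"reported-image-bytes")}


def should_remove(image_bytes: bytes) -> bool:
    """A platform checks each image it hosts against the registry."""
    return hash_value(image_bytes) in reported_hashes


print(should_remove(b"reported-image-bytes"))   # True: same image resurfacing
print(should_remove(b"unrelated-image-bytes"))  # False: a different image
```

Note that a cryptographic hash like SHA-256 only matches byte-identical files; systems in this space typically use perceptual hashes so that resized or re-encoded copies of the same image still match.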
Jada> Wow.
I think that one is kind of startling for me, that, you know, you can take it down; that's a great thing, that people can do that.
I've learned "Grow With Google," "Take It Down."
I've learned so much.
Any other questions from our audience?
Yes.
Audience member 2> I was just wondering, what do you all see as the long term effect of AI on culture and creativity?
David> Well... I thought about this one, and we hear it all the time, especially from English teachers, when it comes to writing a paper.
And I think we have to approach it kind of how we've always approached it.
I think that, you know, when a student is writing a paper, they need to write out that rough draft with their own personal, artistic English writing skills.
And that rough draft needs to be written on paper or typed up, but no AI.
Then, for students to learn and to grow and to improve, because you're going to have variations of students, one that's a slow comprehender, one that's top of the class.
And to bridge that gap in your classroom, to bring them all in and let them grow together, to educate faster and educate more and have a deeper understanding, I think that's when you say, "all right, now you can use AI to create a second rough draft."
And so that AI is going to be another teacher in the classroom.
And it's going to teach them where they have a run-on sentence.
It's going to correct them on their spelling, those different things.
I'm not an English teacher, so I can't go all the way into it, but... you get the idea.
And then, it's up to the teacher for the final turn-in, what they want and how they want it.
But I think those guardrails, those guidelines, those types of things belong in every class that's taught, from elementary school to middle school to high school, as you move forward.
The teachers are going to have to adapt to that AI understanding, and it's going to be in the classroom as you move forward.
There's no brakes on this.
It's the future.
So you've got to get in there and figure out how you want to approach those things.
But that's how I see it in the classroom.
Dr. Folk> I'll add to that, that I do believe that now, teaching and learning is very iterative, and you have to show your work in every context.
Previously it was just in the math classroom.
But in English, show your work.
Use your version history to show your thought process so that you can share with your professor, "this is how I used AI, or this is how I did not use AI."
By having that kind of version history that you can share out with them.
And so, just knowing that everything is going to kind of be a "show me your work, show me your thinking, show me what you were doing with the AI, so that we can make sure that you're actually learning," right?
Dr. Richard> I think in the nonprofit space, that's really going to unlock a lot of creativity, because now I have more time: not answering emails and doing administrative tasks, I have more deep thinking time.
I have more protected time to do what only my brain can do.
And it really is even that idea generator, right?
It's sparking new things, new ways to approach things, new ways to engage folks that is really unlocking that creativity for folks across the nonprofit sector.
Charles> And I kind of want to add on to that.
And it's something that we're looking to incorporate with the Find Your Future platform in a year or two.
But, the creativity in finding solutions.
A lot of individuals may not continue their education, or may struggle to find work, because they need childcare, they need transportation, there are certain issues going on.
And the ability to go, "hey, this is my situation," and have AI pull in all of those different resources at once for that individual, as opposed to that individual having to know where they are and how to find them; it provides it right there, so that individual can then continue their education, can continue to apply for that job.
It really will be that creative problem solver in that aspect.
David> Can I add one last thing?
<Absolutely> I think too, we just talked about this before we walked out here.
We were talking about how I teach a bunch of students that have to be programmers.
Okay, that's just one part I teach.
And so... yes, AI is going to help with programming, but let's say the internet goes down, let's say you lose power.
Where is AI to help?
So I make sure my students learn their job from A to Z from a human standpoint and can still do it without any kind of technology.
And that's what we got to do.
These kids have to learn to work in these areas A to Z, to know it as a human.
Then you're introduced to the tool that's going to speed up the task and help you multithread, multitask.
Yes.
Jada> All right, I want to ask you some lightning round questions okay?
You ready?
Y'all prepare.
No AI usage on these, okay.
Would you use AI for these activities?
Why or why not?
And we got to move quickly okay?
You ready?
Drafting a professional email.
All> Yes.
Jada> Okay.
Helping a student study for a test.
All> Yes.
Jada> Okay.
Fact checking information.
Charles> As long as you're looking at the sources it cites and then you look back at those sources to double check.
Jada> So, maybe.
Charles> Maybe.
[laughter] Dr. Folk> Verify and check.
<Okay> David> Definitely, you have to validate, because you can't always trust AI.
Dr. Richard> Same.
And I would also say it can't always see things that are behind paywalls, like academic articles.
So there is limited information that it has access to.
Jada> Absolutely.
Grading a student's work.
Dr. Richard> No.
David> Some things, not all, like multiple choice, maybe.
Dr. Folk> Heavy human oversight.
Jada> Okay.
[laughter] Charles> Agree with the same.
You can use it as a tool.
But you always have to have that human oversight involved.
Jada> Okay, I think I know your answer for this one, but planning a big life decision, like a career move.
Charles> 100 percent.
As long as, again, you have that human oversight.
So you're checking everything.
But it definitely provides a lot of options for you.
Jada> That's a yes.
<Yes> Y'all heard it here first.
[laughter] Dr. Folk> With guardrails.
<Okay> Yes.
David> With heavy guardrails.
More so, no.
Dr. Richard> Maybe not making the decision <Right> but showing the options and the path towards the option.
Jada> I think, I probably already use AI because I ask Siri to flip a coin to make like, my dinner decisions.
[laughter] Creating artwork or music.
Charles> I think that's definitely an option for a new form of creativity.
<Okay> Dr. Folk> No.
David> My students actually do it in the classroom, but they got to make sure it's not something that's been copyrighted or previously used, okay?
But they actually do use it and they do some great work.
Dr. Richard> Yeah.
Same with all those things.
So, with guardrails.
Jada> Okay.
Well, as we wrap tonight's conversation I want to know what is one takeaway about AI that you would like to leave with our viewers and listeners this evening?
And we will start with you.
Dr. Richard> I'd say stay curious, but stay cautious.
David> I wrote this down, specifically.
<Okay> Learn more about AI, cyber and information technology.
Don't be very nervous or scared of change.
I live and work in the vast world of change, every day.
Always remember, this too shall pass.
Jada> I like it.
Dr. Folk> I will say please check out our Garnet AI Foundry website through USC.
It has so much information and would be helpful even if you're not faculty, staff, or a student.
And lastly, please verify and check all AI generated content.
Charles> I'd say AI is an amazing tool, but like all tools, there are warning labels.
And you got to make sure there's human accountability over it.
Jada> All right.
We're going to go to our front row and see what you all have to say.
One takeaway about AI and we are going to start right here.
Adrianne> I think my biggest takeaway is that South Carolina so far has really been AI-forward.
And we have a really big opportunity to lead.
And it's the collaboration, like what we've seen today bringing our agencies, our academics, our industry together to have those important conversations.
Getting the right people at the table for the right conversations.
That's the strategy that we've had as a state; it's been really successful, and I'm looking forward to keeping it going.
Jada> Okay, quick, quick, quick takes.
Monique> My takeaway would be for us to not only focus on training specific groups, but really make sure we're bringing the training to them.
Sometimes we assume that people are going to go to our website.
So I want to make sure we're getting out into the community and going to where they are.
Stephanie> I think you stole mine.
That's okay.
Yes, come to our websites if you want to know more information.
But also, I'm all for working together in order to protect a child while they're using AI.
So make sure you get involved.
You learn more about AI in order to help them.
Jada> Okay, quickly.
Nathan> I would say that, AI is changing the way that we interface with technology.
So, pay attention to that and, give it a shot.
But also, make sure that you're using your gut and, doing the right things to protect yourself.
Jada> All right.
Thank you all.
We are reminded this evening that artificial intelligence is a tool that learns from data to help us solve problems and make decisions.
What matters most is how we choose to use it, and how we ensure that it serves all of our communities.
The future of AI isn't already written, it's something we are shaping in real time.
And that's why we have conversations like this because they matter.
Because when we come together to learn, ask questions, and listen to different perspectives, we're better prepared for what's ahead.
Thank you to each of our panelists and VIP subject matter experts for sharing your insights.
And thank you, to Paulia for managing our social media audience.
And thank you to all of our listeners for being a part of this conversation.
You can find information about our panelists and guests on SCETV.com. Thank you.
♪
