VPM News Focal Point
Tech journalist and podcast host Kara Swisher talks about AI
Clip: Season 2 Episode 17 | 19m 7s | Video has Closed Captions
Tech journalist Kara Swisher says artificial intelligence “can be a weapon, but it’s a tool.” She sees AI’s tremendous potential for advances in medicine, education and transportation. But Swisher warns about the dangers of exploitation and is concerned that Congress has not passed any legislation to address privacy and antitrust. She hosts the podcasts “Pivot” and “On with Kara Swisher.”
DENNIS TING: How did you get into tech journalism?
KARA SWISHER: I spent some time at Duke University and studying misinformation and propaganda was always my area of study.
And you could start to see as the worldwide web came into being, how manipulative it could be and how bad it could be for people and also how helpful.
And so I did things like downloading books and all kinds of experiments, and it was sort of clear to me that, in the end, everything that could be digitized would be digitized, and it would change every industry.
DENNIS TING: Back then, did you foresee, you know, what we see now with, you know, AI, social media, all that being as big of a factor today as it was back then?
KARA SWISHER: Well, propaganda's not a new thing.
It's gone on since the beginning of time.
It's just that these give propaganda bigger tools.
And so I was always quite aware of that part 'cause it was my area of study at college.
And so, you know, look, Hitler didn't need Instagram, or Mussolini didn't need Twitter.
Had they had them or other tools of social media to target people, it would've been devastating in a lot of ways.
And of course it is devastating right now given all that we've seen.
DENNIS TING: Should people be afraid of machine learning?
KARA SWISHER: No, no.
You should be afraid of people using machine learning and that's the difference, or AI.
It's always the people that are the problem, not the machines.
And anything you put into it is what you get out of it.
I used to call Google 'a database of human intentions,' and that's what it is.
It's human knowledge, human intentions, human thoughts.
And it depends on what you put into it.
You know, you can put bad data in or data that's skewed in some way and then it spits out something else, but it's not anything else but us reflecting ourselves back, it's just faster.
And eventually it starts to see patterns that humans can't, or pull from knowledge that the human brain just isn't big enough to hold, you know, like medical knowledge. That's an area of great promise.
They can do gene folding, they can do all kinds of things 'cause humans can't do that with their small tiny brains compared to computers.
DENNIS TING: It's getting harder to detect what's machine generated and what's not.
You know, is that a concern moving forward?
KARA SWISHER: Everything is machine generated.
This is a digital world we live in.
You know, crap in, crap out.
It's not that complicated.
And so if the data say, on, you know, criminals, more Black people are arrested, does that mean they're more criminal, or that the system is skewed and racist, you know?
Well, I'd say the latter, and therefore, there's more record of that.
And therefore, they seem to the computer to be more prone to crimes, and therefore, the computer will say, go look over there more.
And then you'll find more people committing the crimes when over here, this pile of white people is attacking the Capitol.
DENNIS TING: How do you detect what's real and what's not?
KARA SWISHER: Provenance, where it comes from, what's going in, you know, anything, how do you detect anybody lying?
Right?
You have provenance following it.
And that's one of the big issues is where does it come from?
What are they using in these large language models, these LLMs and where does it come from?
And that has a lot to do with their rights to use it 'cause of copyright.
But the problem is it can be deeply manipulated and very hard to follow for the average citizen.
DENNIS TING: Do you see AI, machine learning, you know, this area having an impact on human creativity?
KARA SWISHER: Humans are very creative.
That's the greatest part of it.
These machines are not creative.
They can take other people's things and mash them together.
You give them the prompts.
I want a picture of, you know, Dolly Parton dancing, you know, and then it'll pull from what exists.
So the creativity started with humans.
The computer just has the data and is bringing it out again.
You know, human beings are sort of like a third world nation whose chemicals, mining, and jewels are getting taken.
That's really what it's like, you know?
And then they take them away and do things with them.
And the question is, who owns them?
Who has provenance over them?
Where do they come from and how did you use it?
And so if you take someone's copyright of material, you should be paying for it.
It's not that difficult, you know?
The tech people always act like it's difficult, but it's not particularly difficult.
DENNIS TING: When you talk about AI, it's hard to know, you know, the algorithms that go in. And to my knowledge, I don't think there's any legislation or requirement that says-
KARA SWISHER: No, there's no algorithmic transparency, no.
There's no legislation on anything, really.
You can ask on all of them.
There's no legislation.
DENNIS TING: Why do you think that is?
KARA SWISHER: The politicians are incompetent?
I don't know.
That they've been bought and sold by lobbyists, that they act like they don't understand it.
That's kind of their go-to is that this is too hard, but they regulate every other industry.
And I would say car making is complex.
I would say plane flying is complex, pharmaceutical making is complex.
They should be able to do this.
And they haven't.
I think there's been a real, you know, celebration of entrepreneurs in a way that's idolatry, especially 'cause they're the richest people on earth.
So you tend not to...
It's not unusual.
This happened with the robber barons, and then they got them under control, eventually.
DENNIS TING: Do you see potential legislation coming, you know, in the near future?
KARA SWISHER: In 25 years of the internet, there's been exactly zero pieces of legislation passed.
So I would hope that maybe they would pay attention.
They're doing a lot of meetings, holding a lot of get togethers, you know, bringing in the powerful people, talking to them.
All citizenry should have a role in it.
We paid for the internet.
They didn't.
They benefited from it, we paid for it.
Elected officials should have a very big role in this.
And they haven't passed any privacy legislation, no data transparency legislation, no antitrust legislation, no algorithmic transparency.
They haven't passed anything.
And in fact, the rule that exists, the single rule tells us we can't sue them for their bad behavior.
DENNIS TING: And that's Section 230.
KARA SWISHER: Yes.
DENNIS TING: You know, I know you're not a politician, but if you were to think of legislation that would be able to, you know, check a lot of these powers, what would that look like?
KARA SWISHER: It's multifaceted.
It's not a single piece of legislation.
The cars are not legislated by a single piece of legislation.
There's not the car act, you know.
I mean, there's a lot of safety issues.
I think I would have absolutely a privacy bill, a national privacy bill that has teeth.
I'd have a data transparency bill, algorithmic transparency bill.
I would overhaul antitrust, which hasn't been overhauled in, I don't know, a century?
I would be pushing research and innovation for small companies.
It's a package of really 10 to 12 really important pieces of legislation.
DENNIS TING: You can pretty much create deep fake videos of politicians and you know, you have them say the wrong thing and, you know, you might start a war or start some international conflict.
What is the impact do you see of this on democracy?
KARA SWISHER: It's already here.
It's just like, look at Photoshop, it has been here and caused problems, but people figure that stuff out in a lot of ways.
I think the issue is that it's now even worse.
I mean, it's already happened.
Like, I don't know if you've gotten on Facebook any time recently, but there's all kinds of crap on there.
It's been working its way into the American psyche for 25 years.
And very bad actors, including domestic ones, are taking advantage of it 'cause they want power.
And that's really what's happening is they want power.
DENNIS TING: So do you think these like deep fakes are kind of the equivalent of the, like the memes that we saw on Facebook during like the 2016 election cycle?
KARA SWISHER: It's just another thing, it's another terrible thing we're going to have to contend with.
It's the ability to target people, you know, to target a million different people with a million different messages, and they're very aimed at that person, instead of a spray-and-pray kind of thing.
It's a real problem.
DENNIS TING: For the people who are concerned about AI taking their job, what should they do?
What can they do?
KARA SWISHER: Nothing.
You know, it's interesting, because it's white-collar jobs that are now being debated, and, therefore, white-collar people are in a panic, right?
This has been happening all across, whether it's manufacturing or farming or anything else.
Delivery automation has been here for a long, long time.
And so the question is, what do we do about it and what new jobs can we create from it?
Some jobs should be done by a computer.
They should not be done by people.
It's stupid.
It's stupid that we continue to insist on it, including long haul trucking, for example.
We can say, oh no, the jobs that are going to be lost, or we can think of new jobs to create for that shorter haul, trucking into cities, maintaining the fleets, all kinds of things.
But people shouldn't be doing a lot of the things they're doing.
You know, you think about a lot of the wasted energy of young associates at law firms, and when a computer can do it, why?
To keep their jobs?
That makes no sense.
Then figure out what new jobs are.
DENNIS TING: What ways are there to protect people, especially, you know, kids, people who are more vulnerable and might not understand the difference from, you know, that toxicity that, you know, to be fair has existed since the internet has been around?
KARA SWISHER: Yeah, it's been existing long before that.
(host chuckling) You know, media education, fact-based education, people learning critical thinking.
A lot is going to come at these people, lots of people.
And so you have to have critical thinkers and that has to do with our education system.
I think you can, you know, some of it's very hard to discern.
And so if you swallow up everything whole I don't quite know what you can do about that except train people to understand what they're consuming.
Just like food, like what's in that food?
What's it doing to your body?
Is it causing cancer?
Is it bad for you?
Is there too much sugar?
Is there... You know, it's not unlike that, it's actually exactly like that.
DENNIS TING: But who controls AI and who should be responsible for what happens?
KARA SWISHER: Right now, it's controlled by giant companies.
It's not controlled by our federal officials or people we elected.
So right now, a lot of it is being controlled and decided on by a small group of people, the richest people on earth with an interest in keeping their power.
So there we have it.
It's all the big companies: Facebook (Meta), Microsoft, Google (Alphabet).
Not really Twitter, it's too small.
You know, all the big companies, Amazon, Apple, same ones who own everything else.
DENNIS TING: How do you break into it to create change?
KARA SWISHER: You legislate, it's called legislation.
It's called doing your job as legislators.
Also citizens have to demand it of legislators, otherwise, they'll be telling you how to think about everything.
And they may be good or they may be bad.
That's the problem.
Again, unelected, unaccountable, all powerful.
It sounds great.
DENNIS TING: You know, it sounds like legislation's what's needed, but it's been, like you said, 25 years and nothing's happened.
Should people be optimistic that things will change or that, you know, we can harness AI for the better and mitigate some of these, you know?
KARA SWISHER: No, I'm optimistic 'cause it happened with trains.
And so yes, we have an ability to do it.
We've done it before again and again and again when absolute power controls everything.
And so we certainly can do it again.
It's just, it's the will of the people and it's elected officials to do something about it.
Again, elected officials are really problematic, but they were elected, right?
You can crap all you want on government, but they were elected.
I'm going to start with 'em.
And then no one elected any of these tech people and they have unimaginable power over your life.
And you either decide, 'cause it's convenient that you get a map or you're able to get a dating service or whatever, that this is the trade you want to make. Really, you're a cheap date for these people.
The government created the internet.
Just so you know, the government created rockets.
Why do Jeff Bezos and Elon Musk control it all now?
Wow.
We created that, we created this, we created that, we made it possible.
The privatization of all the things that we share in common is really quite disturbing.
It's disturbing to me.
It may not be disturbing to other people, but I find it disturbing.
There's all kinds of things in medicine, in drug interaction, in drug discovery, in cancer research, climate change, all kinds of things that we can use these miraculous technologies for, but it will be determined by people who may have interests that "I don't want to deal with that."
You know, or, you know, child poverty, oh my God, every problem we have, this could help, not solve it, but help solve it, with better information and ideas of how to fix it.
But it's again, in the hands of a small group of people who have their own self-interest at heart.
We have to think about what we want to use it for, and it should be what's used for the greatest good of all of humanity versus a small group of people.
DENNIS TING: Is there one area that you are, you know, personally particularly excited about when it comes-
KARA SWISHER: Medicine, medicine, and all kinds of stuff like drug discovery, drug interaction, cancer research.
Just, that's one disease.
There's many, all kinds of research into diseases, understanding patterns. The kinds of things that take time, it can make the time shorter.
EVs, I think.
You know, not electric vehicles, but autonomous vehicles could be, or transportation could be transformed in this way.
Saving fossil fuel, getting rid of fossil fuel, you know.
Development of nuclear energy or other energy sources, another area.
Just so many things, so, so many really, and education, to me, those are all really important.
That's great stuff, that's really great stuff, but it has to be, again, someone thinking about the whole world versus a small slice of it.
The 10 richest people in the world are tech people, except for a couple of exceptions.
And the 10 most valuable companies in the world are tech companies, except for maybe one or two exceptions.
And so, you know, and they all look alike.
Whether it's Elon Musk or Tim Cook or Mark Zuckerberg or Satya Nadella, looks a little different, but not much, you know?
It's a demographic that decides everything and there's a lot of people on this planet.
Maybe there should be some more voices, and so that's an issue.
So you've got an issue of diversity.
And the ones that are speaking up, you know, around, say facial recognition, most of the people speaking about facial recognition are the people that get affected by it, not the people who don't get affected by it 'cause why would it concern them?
It doesn't affect them.
You know, that's the kind of thing you have to think about is look and see, look with your eyes to see who controls everything.
It's the same people.
They're a lot alike.
And that's not ever any good.
Being homogeneous is not good in this case.
Heterogeneity is really important, different points of views.
Disagreeing with each other, that's what's powerful.
DENNIS TING: As a journalist yourself, you know, how have you seen AI impact your work or the work of other people?
KARA SWISHER: It can make it easier.
Like writing headlines, doing or giving you ideas, generating ideas, taking care of stuff that really doesn't need to be done by a human.
I know journalists get all prickly about it, but there's all kinds of stuff that's just earnings or headlines or idea generation.
It's okay for it to generate ideas.
It's the way you use the internet.
It's okay to look things up on Wikipedia, it's fine.
It doesn't mean it's the end of it, end of your search, right?
And so it could be used on all kinds of patterns, files.
You know, investigative journalism, you could see all kinds of applications for journalists.
You could also see all kinds of laziness where you rely on it too much 'cause it says it's so, but, you know, any journalist that does their reporting from Wikipedia deserves to lose their job.
So I don't know.
You know?
You know, you got to combine humanity with these tools just like anything else.
DENNIS TING: And then when it comes to, like, ethics, you know, in journalism or just in general when using AI-
KARA SWISHER: It's fine, it's fine.
It's like using a car, using a phone.
It's fine as long as you use it properly.
If you use it and you are the one in control of the situation, that's fine.
If you rely on it and are lazy, you know, it's very typical.
You've seen both cases in the case of the internet or anything else, like, anything can be manipulated.
And so you have to think of it as a tool, but it can be a weapon.
That's how you have to really look at it.
It can be a weapon, but it's a tool.
I think that there are dangers, you know, of automating everything where humans aren't involved.
That's always going to be an issue.
We're already there, by the way, in a lot of ways.
But, you know, I think government agencies have to be updated.
There should be probably a department of technology the way there's a department of, you know, the FAA or the FCC or SEC, you know.
Every other industry has something governing it.
I think it's a little more dicey because it veers into free speech, but technology is not necessarily for speech.
And so probably every single agency has to have some element of this, and they do.
There's Cyber Command, there's all kinds of things, but you know, when these tools become so sophisticated, we really should have oversight over them.
DENNIS TING: With elections coming up, what should people know about, you know, generative AI and, you know, and the election process, voting, being an informed voter?
KARA SWISHER: Well as with elections in the past, buyer beware, right?
Just be clear about what you're getting in the information, making sure it's accurate.
What happens is people manipulate it, manipulate the stuff, and allow it to go on and allow lies to just, they say them, and then it's hard to figure out.
This is a place where people who lie thrive.
They really do.
They just thrive in these environments.
And so you have to be much more aware of lying and the manipulation, but again, this is not new, it's just more sophisticated.
It totally amplifies the lies.
And you can see that.
You can look at a lot of these, you know... And it takes them to a place that's really ugly.
So you just have to ask questions.
And it should be in your interest, especially if you believe those people.
“Well, okay, I believe you, but let me see your evidence.” What's wrong with that?
You know what's wrong with it?
It's a lie.
(chuckles) That's what's wrong with it, you know?
But people want to live in those worlds and then they get radicalized and you see it everywhere, you see it everywhere.