
MetroFocus: July 27, 2023
7/27/2023 | 28m | Video has Closed Captions
“TAKE ON FAKE”: HOW TO IDENTIFY FAKE NEWS AND FACT-CHECK LIKE A JOURNALIST
“Take On Fake” host and executive producer Hari Sreenivasan joins us to give viewers some practical tips and tools to distinguish between reliable and fake information, and to tell us how misinformation and disinformation have impacted journalism as a whole.
MetroFocus is a local public television program presented by THIRTEEN PBS
>> Tonight, taking on fake news.
With misinformation and disinformation spreading like wildfire, an award-winning journalist debunks and unmasks some of the biggest lies.
How you can separate fact from fiction, as MetroFocus starts right now.
♪ >> This is "MetroFocus," with Rafael Pi Roman, Jack Ford, and Jenna Flanagan.
"MetroFocus" is made possible by, Sue and Edgar Wachenheim III.
Filomen M. D'Agostino Foundation.
The Peter G. Peterson and Joan Ganz Cooney Fund.
Bernard and Denise Schwartz.
Barbara Hope Zuckerberg.
And by, Jody and John Arnhold.
Dr. Robert C. and Tina Sohn foundation.
The Ambrose Monell Foundation.
Estate of Roland Karlen.
>> Good evening and welcome to MetroFocus.
Tonight, we delve into the world of misinformation and the PBS digital series taking fake news head-on.
The series is "Take on Fake."
It's hosted and produced by my colleague, and it debunks claims you have seen or maybe even shared online to show you how to stay informed.
Each episode examines a different aspect of misinformation, including the dangers of falling down an Internet rabbit hole of fake news.
Here's a look.
>> Misinformation is becoming more widespread.
It is easier to be fooled than you might think.
Do you think we are past the point where human beings can identify a deep fake?
>> Yes.
>> Everyone is susceptible.
>> I am in the matrix.
How do I get out?
>> Learn kung fu.
No, I am kidding.
>> This season we are digging deeper into higher-impact issues.
>> These conspiracy theories are often rooted in anti-Semitism and racism.
>> With leading experts on the front lines.
>> We have often had to develop new skill sets.
>> You cannot rush it.
>> So you can arm yourself with the knowledge of truth.
>> If we cannot agree on reality, how can we agree on anything?
>> If we don't have a shared reality, can we turn it around?
>> Keep it real.
This is "Take on Fake."
>> Joining us now to talk more about the series is the host and executive producer of "Take on Fake."
Welcome.
Always good to see you.
>> Great to be with you.
>> Let's start with explaining to us the concept behind the program.
What is it you are trying to accomplish?
>> We started a couple years ago because we saw the rise of misinformation and disinformation and we thought it is a public service to try and educate and inform and it is very on brand with public broadcasting to say let's help people understand how to spot misinformation.
Because we are surrounded by it.
And I think the case for our relevance has only grown the past couple of years.
>> I want to toss to a clip here.
It has to do with a conversation in an episode you did with a journalist who has been covering the conflict in Ukraine.
Let's take a look at this for a second.
>> Hours after Russian forces began their wide-ranging attack on Ukraine, 52,000 people watched a live stream on Facebook Gaming that claimed to be filmed from the Ukrainian border.
The video was shared on other platforms like Twitter, attracting thousands more viewers.
Before it was debunked, proven not to be a video of real events, but scenes from a videogame.
This is just one of many examples of misinformation being circulated about the crisis in Ukraine.
If you have been following the war on social platforms, it is likely you have come across some of this yourself.
While this type of false content has become a regular feature on social media, it can be really challenging to identify during breaking news situations, as journalists work to verify and report on what is happening.
Misinformation or disinformation shared during a crisis can be upsetting, confusing, and potentially even dangerous for those who encounter it.
And by amplifying it, you could also be promoting a political narrative.
So, how do we better navigate social media during breaking news to uncover what's really happening?
To answer that question, I spoke with Emmanuelle, an investigative journalist who previously joined me in 2020 after a massive explosion in Beirut.
People posted a wave of misattributed and doctored videos after that blast, including ones that appeared to show a missile striking, which, as she proved, never actually happened.
>> Here is the original video.
You see it does not have that filter.
And when you play it out, you can see that there is not a missile that comes through the sky.
>> Emmanuelle now has her own YouTube channel, Tracing the Truth, where she helps viewers verify what they are seeing.
As Russia's attacks on Ukraine have intensified, she has been providing viewers with information she independently verified, and showing them how she does it.
>> Social media made it possible for journalists to access people that we may never have had access to in the past.
Cell phones have changed the way we gather news.
They have put a camera in everyone's hands.
Also, as journalists, we have had to develop new skill sets to understand if footage is real or not.
News consumers now have access to learn how that footage is being verified, and they also have access to all the same information that we do as journalists.
>> So, after this conversation and after your own work and what you have seen in terms of journalism and the concepts of misinformation and disinformation, what sort of impact are you seeing it have on our world of journalism?
>> It is kind of an existential threat to what we do every day.
I like to think, look, back when I used to work in a news program that had a certain deadline every night, I wondered to myself, if I cannot figure out what is real and what is not real, what business do I have amplifying that to thousands of other people?
And so, I don't think that journalism has ever faced this kind of an intense crisis.
People have always tried to pull one over on us, and people have always tried to lie and have their own agenda represented.
That is not necessarily what I am talking about.
But the scale at which misinformation can be created and distributed now is unlike anything we have ever seen.
So we really have to have our radar much more fine-tuned to try and figure out, is this a piece of information that is serving somebody's purpose, and then drilling down into the technical details and saying, is this even real?
Was that from this real person's account?
Was this a real person at all that was sharing this?
Is the image doctored?
Has the audio been doctored?
It is going to require a bit of technical skill to figure out what has landed in your lap, or what you found.
>> What is remarkable is, as you said, it is not that the journalism profession has never faced these kinds of incidents of disinformation and misinformation.
As you said, it is the scale and level of sophistication we are wrestling with now.
It brings us to this next clip, which has to do with an episode you did about misinformation in the fossil fuel industry.
Let's take a look.
>> That is one of those key differences between misinformation and disinformation, is intent.
When you looked through all the archives, what is the intent that you found?
>> Document after document showed that all of these industries, the experts who work for these companies, knew that they could not get this plastic recycling off the ground.
You had these fundamental problems that they were outlining over and over again.
The number one being that it's just cheaper and easier to use new plastic made from virgin oil and gas than it is to use old plastic trash.
Old plastic trash is very expensive to collect, sort, do something with, and keep pure.
So it is a very tricky problem.
This is more than 40 years ago, and this is right where we are right now.
How did they know this then, and yet we were told something totally different for decades?
And all the way up until me sitting at my desk going, wait, what?
What do you mean none of this gets turned into something else?
That, for me, was like, I remember that moment because it was the first document I saw where I thought, oh my God, they knew.
>> Other reporting has shown how the oil industry was aware of its role in driving climate change, and followed a similar strategy of promoting mis- and disinformation to deny and distract from the facts, while pushing a narrative of personal responsibility back onto you, the consumer.
And it was not just ads.
When it came to plastic recycling, these companies invested millions of dollars in splashy public programs that their own research told them were doomed to fail.
>> They started launching all these feel-good projects nationally, and they would get these huge fanfares and throw a whole bunch of money behind them.
And we started looking at them -- we pulled a dozen of the projects launched in 1981.
Amoco was going to recycle all the plastic in New York public schools and turn it into something else.
There was a new recycling facility with fancy machines.
All of these big, splashy things, and they would get all this press in the newspapers and on the local news as the brand-new thing.
The message to the public was there will not be any plastic left in the national parks.
So glad that got solved.
When we looked five years later at what happened to all 12 of those projects, every single one of them had failed, been shut down, or been canceled.
And there were not any news stories.
Nothing happened.
>> Coming out of this, let me ask you to help us understand all of this, and to focus on the difference between misinformation and disinformation.
>> Right.
The easiest way to tell those two apart is intent.
Misinformation can be something that we shared unintentionally.
Hey look, I saw this cool thing, take a look.
Eh, that might not have been true.
You did not have the intent to deceive.
But disinformation, now, whether that is coming from a state actor, or whether that is coming from a corporate entity or a lobbying group, that is people who are specifically targeting you with a piece of information to try and change your mind knowing in the first place that it is not true.
>> How about the introduction now of the world of artificial intelligence?
Talk a little bit about how you all are taking a look at that.
>> Yeah, this last season has been full of videos that we have been doing about artificial intelligence.
I have even gotten onto the vertical platforms like TikTok and Instagram Reels to do shorter clips, to meet people where they are.
But artificial intelligence, it is just a tool.
I want to say that at the outset.
It's like an ax or the Internet, it is about how you use it and who wields it.
I don't want to throw the baby out with the bathwater.
I think there are some amazing things we are going to be able to benefit from as a society with artificial intelligence.
That all said, the scale at which you and I can create photorealistic images, just by typing a few words, is amazing.
One part of my brain, the technological, nerdy, geeky side -- my mind is blown.
The other side of my brain, working as a journalist, says, oh my gosh, how am I going to be able to tell whether or not this is real?
Where is the watermark?
How can I run this through an image search engine?
What can I do?
So, again, artificial intelligence just kind of puts the creation and spread of disinformation and misinformation on steroids.
Just in the past six months, the capabilities that I have seen, with people using tools that are available on a desktop, a laptop, or really even on our phones now, are tremendous.
>> How fearful are you -- you touched base on it a moment ago, but let's focus on artificial intelligence and journalism.
How fearful are you that this might extraordinarily exacerbate what we talked about before?
>> Look, there are several layers of threats to journalism.
And I think the more you are in an environment that is under a deadline pressure, where the stakes are high, and in my head as I say these words, I am thinking to myself the election, the election, the election.
When there is public opinion that could be swayed one direction or another, what piece of information changes a campaign, what changes a race, what comes in the night before the election, how fast does that misinformation spread and how long does it take?
There is that old adage, a lie can spread around the world before the truth has time to get out of bed and put its shoes on.
At this scale, what really concerns me is that it can go around the world several times, and it would really take a lot of effort for the truth to catch up.
Again, journalists are up to that task, but are people going to make decisions based on bad intel, bad information, when they go to the polls, or anything else?
>> You talk about anything else, and that leads me to the next clip, and that is, as you said, the spread of propaganda and misinformation.
One of your episodes took a look at Russia and the propaganda being spread there.
>> Clint knows all too well how convincing Russian propaganda can be.
>> Russians are there very quickly every morning.
They are voluminous in their messaging and very consistent.
It makes them particularly effective even though they are not committing that many resources.
>> It is his job to research counterterrorism, terrorism, social media, and Russian disinformation.
He even wrote a book on the subject.
He has witnessed firsthand Russia's tactic of placing a kernel of truth at the center of a web of lies to make it all seem true.
In the 1980s, as the Cold War was coming to an end, the KGB successfully started a disinformation campaign that blamed the U.S. for the AIDS epidemic, accusing the Pentagon of genetically engineering HIV.
>> That was from an academic journal.
It took a kernel of truth, they wrapped stories around it, they put a bogus expert on top of it and it proliferated around the world.
That is happening exactly the same way today claiming there are bio labs in Ukraine.
There is a kernel of truth, but it had nothing to do with a bio lab.
These were medical research facilities.
But that kernel of truth helps power these conspiracies forward.
Oftentimes in the early days when they were less sophisticated, they would create personas that looked like Americans which were operated from Russia, and the content they would share would be Russian overt propaganda.
>> Since the early 2000s, Russia has been engaged in a large-scale propaganda campaign in Ukraine, part of Vladimir Putin's efforts to take back what was lost when the Soviet Union was dissolved.
For the people of Ukraine, it is something they have been dealing with for decades.
>> So, coming out of that, we often hear the term the disinformation war.
Is Russia winning that war?
>> Well, if you want to talk about the conflict in Ukraine, they are pretty successful at changing hearts and minds.
I mean, early on, what you saw in the Ukraine conflict is that there were really two big streams of information.
In Russia, for example, if you were older and more likely to watch TV, you probably were getting the truth that Vladimir Putin would like you to consume on state-sponsored television.
If you were younger and you were looking across social media, you might have seen a whole different side of the war.
You might have seen Russian soldiers in harm's way.
You might have seen them being defeated by Ukrainians, because the Ukrainians were very social media savvy and reporting this stuff out as well.
So at some point you have to figure out, wait a minute, which of the things I am seeing, which version of the truth do I believe?
It could not possibly be both, an absolute success for Russia and an absolute success for Ukraine.
So people had a little bit of a cognitive dissonance trying to figure out what it was they saw, versus what it is they believed.
So Russia is doing as well as it can when it comes to the information control and manipulation, because that is pretty important in a war.
It's not just what you put out on a battlefield, it is the images people get at home about what is happening.
Especially the parents of soldiers, who are perhaps dying on the frontline.
What kind of images are they seeing on a nightly basis?
>> As you know, one of the expressions that circulates and percolates within combat is the expression that truth is the first casualty of war.
And I think what you pointed to here underscores all of that.
You have talked often in various episodes about the notion that we seem to place the burden on us, the consumers of the information, to figure out, OK, what is genuine, what is legitimate here, and what isn't.
I guess the question is, why are we putting that burden on us, on the consumers of this information?
>> Look, I think that is a fantastic thing that we should be thinking about collectively.
I have this other analogy that I go back to, which is, I think it was maybe British Petroleum that hired a big PR firm decades ago to come up with the phrase "carbon footprint."
That really put the onus on you and me, deciding to ride our bikes to work and thinking about what we contributed to climate change.
I am not saying that is not important, but it also took the eye, the spotlight, off of maybe the cause, the giant amounts of fossil fuel emissions, etc.
In this era, what I think is happening is a parallel: the platforms, meaning Meta, Google, TikTok, all of these folks are interested in you and me being media literate, becoming smarter consumers.
And I do think that we have a responsibility, that we should get smarter.
I am not opposed to that.
But I also look back and say, wait, listen, you guys built these algorithms that you do not even know how to control anymore and they are enabling misinformation and disinformation to spread on scales that we never comprehended.
And, listen, you have some responsibility here too, so come on, let's go, let's start fixing this.
And these folks go up on Capitol Hill and raise their right hand and ask legislators, please, regulate us, because they know it is not going to happen.
>> What would you say to someone who is watching this conversation saying, wait, that's me.
I like to consume information, but I am not highly technically literate in terms of filters and things such as that.
And they look to you and say, OK, what do I do?
What kind of advice can you give me about that?
>> The simplest way to combat misinformation and disinformation, and to become a smarter consumer, is really just to take a breath before you press the forward button, before you press the "like" button, before you even open a link.
Just slow down for a second.
If you run across a headline or a story or an image or a video that just gets you going, that really makes your blood boil because, oh my gosh, I cannot believe this exists, I hate this, I want to tell my friends about this.
Or the inverse: this is fantastic, this is just what I think, and man, I want to share this.
Hold on a second.
You might be a pawn in somebody else's chess game.
You might be one vector: whoever created that disinformation, that intentionally false information, might be banking on the emotional rise that you get to be the trigger that has you forwarding it.
And that is one of the easiest ways we can pull back.
In the words of the sage Daniel Tiger, when you get so mad that you want to roar, take a deep breath and count to four.
Words to live by on the Internet in 2023.
If you can just calm yourself down and say, who profits from me sharing this?
Because it is my credibility on the line when I tell 15 friends, look at this.
If one of those folks is smart enough to say, hold on, and suss out that there's something fishy about this, then they have to reply back to that email chain and say, hey, that is not really true, and I have got egg on my face.
>> I agree with you, I have often believed Daniel Tiger is the fount of all wisdom.
We should all ask, what would Daniel be saying?
We have about two minutes left.
I want to come back to something you touched on before, and that is politics, and what we are seeing here.
We have on the horizon what will be an extraordinarily combative presidential campaign.
Also House and Senate seats, I believe, are up for grabs.
What do you anticipate, based on what we have seen so far, the role, the extent, the level of misinformation and disinformation will be?
>> You know, I wish I could be optimistic about what is coming.
I wish that there were automatically going to be tools that would help us spot this and put a big, giant neon sticker on any image that you see on Facebook or Instagram, or a video on TikTok.
But I do not think that will be here, at least not in time before the election.
So, I am expecting that there will be a lot more fake images that are used in campaign ads.
Either the flyers you might get at your door, or something that you see online.
We have already had examples of it, and we are still quite a ways away.
And I think the tools are going to get much more refined, and it is going to get very hard for you to automatically say, seeing is believing.
So, what I think is going to be very difficult for us as journalists is, how do we instill in our audiences a healthy skepticism without crossing over into institutional cynicism?
Right?
Because I really think that if we sat down and we went over all the things that could be faked today, in 2023, I think there's a real good chance people would say, why should I believe any of this stuff?
Do I even know this is PBS, do I know this is the BBC?
Ah, forget it.
I do not want people to throw up their hands and assume everything is fake.
But I really do want people to be a lot more vigilant before they start making decisions about what they are going to do at the polls.
>> That is a great point; it is not just that one flyer that shows up on your front porch that you toss aside, but you end up literally and figuratively tossing out all of it.
We could talk forever.
It is a marvelous series.
Here's the best compliment I can give you.
Each one of these episodes you walk away saying, I did not know that, but now I do.
And I think that is the greatest value of anything that you can do in journalism.
Always good to see you.
Marvelous work.
Look forward to talking again with you soon.
>> Thank you so much, John.
Take care.
>> Thanks for tuning in to MetroFocus.
You can take our award-winning program anywhere you go with MetroFocus the podcast.
Listen and subscribe where you get your podcasts so you never miss an episode, or just ask your smart speaker to play MetroFocus the podcast.
Also available at metrofocus.org and on the NPR One app.
>> "MetroFocus" is made possible by, Sue and Edgar Wachenheim III.
Filomen M. D'Agostino Foundation.
The Peter G. Peterson and Joan Ganz Cooney Fund.
Bernard and Denise Schwartz.
Barbara Hope Zuckerberg.
And by, Jody and John Arnhold.
Dr. Robert C. and Tina Sohn foundation.
The Ambrose Monell Foundation.
Estate of Roland Karlen.
♪