
Disinformation and democracy
Season 26 Episode 46 | 56m 21s | Video has Closed Captions
Disinformation and democracy: Civic discourse in the digital age
More than 150 years ago, Alexis de Tocqueville noted that it is easier for the public to accept a simple lie than a complex truth. With the advent of digital and social media and today's diverse, highly polarized society, it feels as though this adage is being used to quickly dismantle American democracy.
The City Club Forum is a local public television program presented by Ideastream

- [Narrator] Production and distribution of "City Club Forums" and Ideastream Public Media are made possible by PNC and the United Black Fund of Greater Cleveland Incorporated.
(energetic music) (bell chimes) - Good afternoon and welcome to the City Club of Cleveland, where we are devoted to conversations of consequence that help democracy thrive.
It's Friday, October 29th.
And I'm Kristen Baird Adams, president of the City Club Board of Directors and chief of staff of PNC's National Regional Presidents Organization.
We are so excited to be here today marking the City Club of Cleveland's 109th annual meeting.
The City Club is one of the nation's oldest continuous independent free speech forums renowned for our tradition of debate and discussion.
Since 1912, we have maintained a vision of strong informed individuals and communities that prize freedom of speech and civil, civic dialogue.
You could say that at the City Club we are in the business of being a trusted source of information.
If there is one takeaway from this past year, it's how fragile American democracy is and how vulnerable it can be to disinformation.
And this is where our commitment to freedom of speech and our commitments to democracy and to civil, civic dialogue create some surprising tensions.
All of which is why we are so pleased to invite Nina Jankowicz to the City Club stage at our annual meeting keynote, which is also the Samuel O. Friedlander Forum on Free Speech.
Ms. Jankowicz is an internationally recognized expert on the intersection of democracy and technology, and is a global fellow at the Wilson Center, the nation's key nonpartisan policy forum that tackles global issues through independent research and open dialogue.
Last summer, Nina Jankowicz released her book, "How to Lose the Information War: Russia, Fake News, and the Future of Conflict".
In that book, she takes us on a journey through five Eastern European governments' responses to Russian information warfare tactics, all of which have failed.
While America has finally begun to wake up to the threat of online misinformation, Ms. Jankowicz has spent years in central and eastern Europe advising governments that have long been on the front lines of the information war, and she has even testified before the US Congress and European Parliament on the issue.
Her writing has been published in the "New York Times," "The Washington Post," and "The Atlantic," and she is a regular guest on major radio and television programs such as the "PBS News Hour" and NPR's "All Things Considered".
Friends, members and guests, please join me in welcoming Nina Jankowicz to the City Club of Cleveland.
(audience applauding) - O-H?
- I-O!
- Yes!
I really wanted to do that, thank you for indulging me.
(audience laughing) It is lovely to be here today.
Thank you for that lovely introduction.
I am so delighted not only because this is a really important organization here, I am just really struck by the important work that you all do, holding up dialogue, diversity, and democracy, which I think as I get into my speech, you'll understand why I feel are so important.
But this is also my first in-person book talk.
Do not publish a book during a pandemic.
(audience laughing) I don't recommend that.
I have done about 150 Zoom events, and it's just really, really something to see not only 100 copies of my book in your hands, but to hear you laughing at my dumb jokes.
So thank you in advance.
Also, now that we're able to travel again, I'm really looking forward to getting outside of the Beltway in Washington, DC.
Issues of democracy and disinformation don't just affect policymakers and politicians.
They affect all of us.
And so I think it's really important for me to talk about these issues that I've studied to audiences all around the country and as much as I can these days, around the world as well.
So really, really happy to be here.
Thank you so much to Cynthia and Dan for inviting me.
I'm also happy to be here, not just because disinformation is a critical topic for the future of American democracy and democracy around the world.
And not just because the release of the Facebook papers over the last few weeks makes this a critical time to discuss all of that too.
But it's critical for me to be here because I believe that individuals, not just journalists or civil servants or politicians are part of the key to winning the information war.
But we're not there yet.
Unfortunately, we've seen many individuals taken in by disinformation taking to the streets over the past 19 months.
They have endangered public health, falsely claiming that the coronavirus isn't real or that COVID vaccines contain microchips meant to track us.
Just for the record, that is not true.
And they've also endangered public safety and our democracy by storming the Capitol on January 6th.
At its core, disinformation is antithetical to democracy.
We need good information to participate in the democratic process.
And those that traffic in disinformation are knowingly dismantling it.
This is a truth that, five years after revelations about Russian interference in the 2016 election came to light, we still seem not to fully understand.
And that's why I wrote my book, "How to Lose the Information War," which as Cynthia noted travels to the front lines of this conflict in central and eastern Europe and interviews the folks who have been in this battle much, much longer than we have.
I've noticed that there's a uniquely American hubris to our approach to disinformation, particularly of the Russian variety.
We seem to think we're the first people that this happened to when that couldn't be farther from the truth.
Of course, the Soviet Union was involved in propaganda and active measures campaigns throughout the communist period.
But since the dawn of the internet era, Russia has auditioned and perfected its tactics in nations like Estonia, the Republic of Georgia, Poland, the Czech Republic, and Ukraine.
I was in Ukraine working as a strategic communications advisor in the country's foreign ministry during the 2016 election and its aftermath.
And while I was there, it became clear that while we in the West have been slow off the starting block, unable to recognize the dividing lines in our society that disinformation weaponizes, we were also unwilling to admit that our fellow citizens draw those dividing lines.
But Russia has us lapped.
Although the Kremlin's goal is increased global influence, Russia's disinformation campaigns operate on an undeniably human level, often employing local actors to cast a spell of plausible deniability and increase the authenticity of their message.
Our response, however, has existed almost entirely in two realms.
The government realm so far has consisted of classified briefings, of sanctions, of taskers and talking points.
While in the tech realm, executives believe in content curation, in fact checking, and in furious games of what I call whack-a-troll.
Removing fake accounts, thank you, thank you.
Removing fake accounts created by malign actors only to see others pop up.
Like the carnival game of whack-a-mole, whack-a-troll is all but unwinnable.
Neither tech platforms nor governments nor journalists can fact check their way out of the crisis of truth and trust that Western democracy currently faces.
Keeping people at the heart of Western policy on the Kremlin's influence campaigns is critical not only in responding to Russia's online offensives, but in repairing the cracks that our democracies have allowed to fester in the first place.
If we don't, our efforts will become yet another cautionary tale and an example of how to lose the information war.
So today I'm going to tell you a story that exemplifies all of that.
Unlike many other stories about Russian disinformation, it doesn't focus on botnets or the number of engagements that a particular post got.
It's not about the 2016 election, and it's only tangentially about former president Trump.
It will challenge, I hope, many of the popular conceptions about disinformation you have, and hopefully inform how you think about this problem.
Not as one that targets one political party more than another, but as one that threatens our democracy no matter your party affiliation.
It also shows us that part of any strategy to win the information war needs to think about how we arm individuals for this fight.
What tools do they need, do you need to navigate an increasingly treacherous information environment?
Our story begins just over three years ago on October 19th, 2018, when a criminal complaint on Russian interference in the 2016 election was unsealed.
It lays out how the St. Petersburg-based troll factory, the Internet Research Agency or IRA funded and implemented its online influence campaigns in the United States.
The level of detail is astonishing.
The complaint uncovers the budget of the so-called troll factory, or as the document refers to it, the conspiracy.
It reveals the conspiracy's organizational structure and it details communications between employees of the IRA.
But one detail in particular stood out to me, quote, "On or about July 1st, 2017, a member of the conspiracy contacted the Facebook accounts for three real US organizations to inquire about collaborating with these groups on an anti President Trump flash mob at the White House, which was already being organized by the groups for July 4th, 2017," end quote.
This detail was shockingly familiar.
As I'm sure will surprise none of you, community theater has always been a hobby of mine.
And I recalled friends posting about an Independence Day flash mob on Facebook.
They planned to dress up in colonial attire at the height of Washington's muggy summer to sing a parody of "Do You Hear the People Sing?", the famous revolutionary anthem from the musical "Les Miserables", in front of the so-called People's House.
The event page for the flash mob had long since been removed from Facebook, but in the age of live streaming it wasn't difficult for me to find videos of the festivities where several hundred people gathered in front of the White House on a sunny, sweaty Washington Independence Day.
A young guy in a Revolutionary War getup complete with a tri-corner hat and waistcoat addressed the crowd, quote, "Hear ye, hear ye citizens," he began ringing a handbell, "Resist the rule of treasonous King Donald who has betrayed the Republic and offered his soul and conscience to the czar of Russia and consigned American welfare to ruin.
Declare your independence from this stupid, stubborn, worthless, brutish man."
His words, not mine.
(audience laughing) "God save the United States."
The crowd waved their American flags and cheered.
According to the criminal complaint, it was the Internet Research Agency who spent $80 to buy ads on Facebook to promote that event.
In an entirely unexpected collision of my two great loves, it seemed that Russia had weaponized show tunes.
(audience laughs) I soon found myself down a bizarre rabbit hole, having coffee with one of the event's organizers, Ryan Clayton, who had no idea until I told him that he had been an unwitting victim of Russian election meddling a couple of years ago.
His obliviousness, partly a result of the fact that he had spent a significant amount of time on a beach in Southeast Asia, triggered a troubling realization for me.
How many Americans are currently in Facebook groups or WhatsApp chats where Russian actors are laundering disinformation, seeding it within authentic American discourse?
And how many don't understand just how sophisticated these operations are, that just because a group or a message board is locally organized, just because some of its members might know each other in real life, doesn't mean that Russia or other foreign adversaries haven't found their way in to manipulate them?
How do we counter disinformation when it runs up against our own first amendment rights?
Since Mueller's report, the bulk of our collective attention to Russian election interference in the media, on social media platforms, and in government has been focused on fake accounts and outright disinformation.
Companies like Facebook and Twitter issue regular take-down reports as part of their furious game of whack-a-troll, and Congress' focus has been on the illicit purchase of campaign ads and content moderation policies.
But online information operations have become more diffuse and sophisticated, adapting their tactics to increased scrutiny.
Rather than just simply creating fake accounts, Russian operatives are also infiltrating authentic activism and using American voices to turn us against one another.
It was part of the toolkit even in 2016.
And my own research and experience shows that our collective retreat into more private spaces online leaves us even more vulnerable to such manipulation today.
This is a complex strategy that's far more difficult for the US government and tech companies to combat.
And one that is especially powerful during times of civil unrest and uncertainty like the coronavirus crisis.
The story of that July 4th flashmob in 2017 is a timely warning of just how vulnerable we continue to be to sophisticated foreign and domestic machinations even today.
I decided to seek out the flash mob's organizers to find out more.
Ryan Clayton was the leader of Americans Take Action, ATA, a progressive activist group that was one of the core organizers at that protest.
After nearly two decades, working in politics as a campaign manager, political advertiser and bonafide rabble rouser, the Trump era had done Ryan in.
He was protesting at an event for right-wing activist James O'Keefe, and several attendees there put him in a choke hold, pushed him down a flight of stairs, and that left him with depression and post-traumatic stress disorder, he said.
He left the country in search of solitude.
When I looked him up, his group's website seemed defunct, but I sent out an interview request anyway.
A few hours later, Clayton replied, "Thank you for bringing this to our attention," along with a few choice expletives that I will leave out today.
I met him a few days later in a DC coffee shop just before the 2018 midterms.
Clayton had no idea that one of ATA's protests had been described in the complaint until I emailed him.
He said, quote, "When I heard about that, I was like, Jesus, I'm glad I got out when I did because twilight zone politics just turned into like Inception," end quote.
Clayton, and a few progressive friends started ATA right after Trump's November 2016 victory.
And they secured tickets to his inauguration and stood up and linked arms as the president-elect was taking his oath of office, revealing blue T-shirts with red and white letters that spelled out resist.
Clayton wore the letter T, he and his friends were arrested, and the picture of their protest became an icon of the Trump era.
At the Conservative Political Action Conference that year, you may remember this, Clayton and ATA handed out Russian flags emblazoned with Trump's name in gold letters during the president's address.
They were found out and ejected from the speech, but not before hundreds of audience members, unaware that they were holding the Russian standard, began to chant USA, USA while waving the flags.
A few months later, ATA attended the Washington Nationals baseball home opener, and dropped a resist banner from the upper levels of the stadium.
Creative protest is what Americans Take Action built their movement around.
And the July 4th Les Mis flashmob was no different.
Americans Take Action wanted to create a positive environment where people could protest on Independence Day.
A few hundred people attended, which Clayton describes as an outlier for their events.
They chalked up the high attendance to the creativity of the event.
He told me, quote, "A lot of people like karaoke.
A lot of people like showtunes.
A lot of people had off work.
It was July 4th, it was the National Mall.
We thought that's what brought people out.
We definitely had no idea there was somebody sitting in the IRA social media unit drilling psychographically targeted ads to people like us," end quote.
They did know, however, that someone was advertising the event.
Several progressive Facebook groups had come together at the last minute to organize the protest.
They got on a conference call before the flash mob, which Clayton vaguely remembers.
Someone on the call mentioned that an offer of free advertising had come in and Clayton recalled saying, quote, "Hell yeah, I want free advertising".
But there was a hitch.
In order to advertise the event, the group offering the ads needed to be made an administrator of the event page.
Clayton hesitated, only a little.
Quote, "I remember thinking what's the group?
It's not like Politicians for Killing Puppies or something," end quote.
He thought it had the word resistance in it.
And given that most progressive organizations function in operational poverty, Clayton's words, not mine, the group decided that they would allow the ads to run.
Now according to the criminal complaint, the organizers were actually communicating with an Internet Research Agency employee posing as Helen Christopherson, one of the IRA's carefully cultivated fake profiles. Created in May 2015, the Christopherson account claimed to live in New York City, and her hometown, she said, was Charleston, South Carolina.
Quote, "While concealing its true identity, location, and purpose," the October 2018 criminal complaint says, "The conspiracy used false US persona Helen Christopherson to contact individuals and groups in the US to promote protests, rallies and marches, including by funding advertising, flyers and rally supplies," end quote.
Christopherson wrote to one of the organizers of the July 4th flash mob in off-kilter, but not entirely incorrect English.
She wrote, quote, "I got some cash on my Facebook ad account so we can promote it for two days.
I got like $80 on my ad account so we can reach like 10000 people in DC or so.
That would be Massive," end quote.
Capital M massive.
Clayton didn't think these messages were addressed to him.
He remembered finding out on that conference call that an outsider was offering these free advertisements.
And if his memory is murky, he has no way to correct the record.
Facebook removed all evidence of those messages and the Christopherson account.
The criminal complaint says the proposed targeting for the ad, which put individuals within 30 miles of DC and Virginia in its crosshairs, reached 29,000 to 58,000 people.
And Clayton thought it was impactful.
The turnout far exceeded his expectations and he doesn't remember seeing anybody suspicious at the protest besides the few hundred people dressed as American revolutionaries, that is.
Of course he can't definitively say that the event was well attended simply because of that $80 in Facebook ads, but he still thought they brought some people out.
As I mentioned, Clayton had worked for a time in political advertising and believed that even if he himself had placed those Facebook ads, there wouldn't have been a few hundred people there.
Clayton's political background also made him a little apprehensive and perhaps even frightened about what the IRA and the Russian government did in 2016 and continue to do today: divide the American people.
At first, he was confused about why the IRA bought ads for his event.
He said it would be like the Democrats running ads for the Republicans.
But the $80 spent to drive showtune-loving progressive DC area residents to the White House to protest the president in song on Independence Day wasn't a mistake.
Russia has long tried to increase discord in American society.
Repairing and not exacerbating the rifts in society means that we need to play more than just that game of whack-a-troll by deleting inauthentic social media accounts and straight-up disinformation.
We need to invest in building public awareness like we're doing today, that disinformation is not just about cut and dry fakes.
It feeds on and amplifies and weaponizes our emotions, pitting us against one another.
And it's not just widespread digital literacy campaigns targeted at voters and social media users that we need.
Politicians and public relations professionals need to sign on as well, calling out disinformation whether foreign or domestic, no matter which political party it opportunistically supports.
It is an equal opportunity threat to democracy.
If our elected officials and political organizations are trafficking in disinformation, we are doing Russia and China and Iran's work for them.
Ryan Clayton now understands this far better than most Americans.
By buying ads for that liberal-leaning, Trump-attacking musical theater flashmob, Russia's Internet Research Agency was trying to hasten American polarization.
Clayton told me, if you can weight the sides, you can really pull at the fabric of society.
You can pull it apart.
This tracks with what I've observed in central and eastern Europe, where as I mentioned earlier, our allies have been fighting disinformation long before the US even recognized it was a problem.
What they've learned is that they can't hermetically seal their information environments to counter disinformation.
Instead they've invested in building up their citizens' resilience and filling in those fissures that allow disinformation to flourish in the first place.
In Estonia, the government has invested in Russian-language media and educational opportunities to help inoculate ethnic Russians against disinformation.
And they've been targeted by the Kremlin's lies since the country's independence.
In the Republic of Georgia, where Russia occupies 20% of the country illegally, a civil society organization is training influencers like comedians, musicians, actors on how to spot disinformation and why that matters, sending them to their home towns to do performances that include creative material reflecting their training.
Think, you know, John Oliver, but a little bit more post-Soviet.
And in Poland and the Czech Republic, two countries that are well aware of the national security threat Russia poses, civil servants are learning why they are targets of disinformation and how to counter it in their work.
In Ukraine, secondary schools and libraries are offering training on media and digital literacy and on emotional manipulation to students and voting-age adults alike.
I don't kid myself.
Education and awareness alone cannot win the information war.
It's going to take a long-term investment in them to even win a single battle.
But in countries where disinformation emanating both from Russia and domestic sources has long been a reality of life, empowering people to be active and engaged members of society through investments in the information space and in people themselves is always part of the solution.
While platforms, governments, and civil society organizations play losing games of whack-a-troll and fact checking and wield their digital cudgels, bad actors are going to continue to manipulate and amplify our weaknesses.
Moscow will continue to attempt to influence our democracy as it has for decades.
And now that the Kremlin has written the textbook for how to do so, other bad actors are already imitating Russia's tactics.
To prepare for these future attacks on democracy, and they will happen, and indeed, even those attacks from within, as we've already seen this year, we have to think beyond Russia to the key actors in the democratic process, people.
Thomas Jefferson recognized this, writing in 1820, "I know of no safe depository of the ultimate powers of the society but the people themselves.
And if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education."
And that's where you come in.
I hope if you walk away with one idea from my presentation today, it's that individuals have incredible power.
You're on the front lines of the information war.
Make every tweet you send and every Facebook post you interact with grounded in truth.
With your help, we can return reality to our civil discourse and disincentivize the disinformation that is degrading our democracy.
Thank you so much.
(audience applauding) Today at the City Club, we're listening to a forum about disinformation and democracy featuring Nina Jankowicz, author of the book, "How to Lose the Information War: Russia, Fake News, and the Future of Conflict".
We're about to begin the Q and A portion of our program.
We welcome questions from everyone here in the room and via live stream.
That includes those who are joining us via our radio broadcast on 90.3 Ideastream Public Media.
If you'd like to tweet a question, please tweet it to @thecityclub.
You can also text them to 330-541-5794.
That's 330-541-5794.
And our staff will try to work it into the program.
May we have our first question, please?
- My question is, the Republican Party is now using critical race theory as a divider.
And that is going to become much more political.
Most parents love their kids.
And when they think the kids are being attacked correctly or incorrectly, they become much more passionate about it.
And that's what we're seeing in school boards right now.
The election is in two days, and the division is becoming much more severe.
How do you get people to understand and care to go into more detail rather than just look at the headlines?
And one party is taking advantage of the lack of understanding of the issues and just running with the headlines.
- Thank you for that question.
And yeah, you're absolutely right.
Critical race theory has become one of those hot button issues that the Republicans and other, you know, disinformers who are engaged in disinformation for profit, frankly, there are plenty of, you know, media outlets that are making money off of this too, have seized on.
And I live in Virginia, and Loudoun County is one of the areas where people have really honed in on this topic.
But it's no different than any of the other hot button issues that, you know, allow disinformation to flourish.
It's, you know, weaponizing people's emotion.
And so what I try to tell people to do, and this doesn't solve the problem for us right now, but it is, you know, a broader kind of skillset that I hope everybody takes home with them, when you're looking at stuff on the media, whether it's, you know, mainstream media, fringe media, or even social media, if you feel yourself getting really emotional, there's a good chance you're being manipulated, right?
It is meant to drive that emotional interaction in you.
So that is something I always try to warn people about that when you feel that, you know, angst rising in your system, it's good to just take a step back and think before you share.
More broadly, though, we were talking about PBS and NPR at our table at lunch.
This is really why I think we need to invest more in public media in this country.
Frankly, during elections, during important events like January 6th, I don't watch anything but PBS, because they are getting into the nuance of the issues.
I've been on a lot of the main cable stations.
You got three minutes for a hit there.
When you go visit Judy Woodruff on PBS, they apologize if it's less than five.
(laughs) And I think that's really important.
You know, a lot of people are kind of incredulous, right?
That PBS, that NPR, these government-funded institutions can provide a nonpartisan, balanced source of information.
But it works for the BBC.
60% of Brits still trust the BBC, even though they have a lot of polarization problems as well, 60% still trust the BBC in a time of crisis.
And NPR and PBS remain the highest trusted media outlets in our country.
And they cover news deserts as well, where there aren't local for-profit news channels.
So I think if we invest more than the $1.33 per year per person that the United States invests in public media, that's going to help balance things out a bit.
- [Man] But the audience for PBS is such a small-- - Yeah, yeah, you're absolutely right.
So for those that didn't hear, the gentleman said that the audience is quite small.
And that's right, there's a small audience because they can't compete with Fox and MSNBC that are running these emotional headlines in their fancy studios with flashy maps on the wall.
It's just not the same.
PBS News Hour is run out of a kind of rundown building in Arlington, Virginia, right?
It doesn't compete.
And that's because we're spending so little on them.
And they're really, even with the Corporation for Public Broadcasting, relying on user donations.
So if you believe in what I'm saying, I encourage you to write to your representatives and ask them to increase that part of the federal budget instead of continually hacking at it.
Because the US, when we go abroad and we're supporting developing democracies, we say, ah, it's really important that you have a public broadcaster when we don't even have one that can stand on its own two feet.
So I truly believe in it.
And thank you for that important question.
I really believe that public broadcasting can balance out the scales in this equation.
Yup?
- Thank you very much for a knowledgeable introduction.
And it was very important for me.
But I have been working in this field myself for a long time.
I'm sure that you can add a lot to my knowledge, especially regarding the work since 1995, when the Russians began studying social computing and how to inject particular ideas into mass discussions.
And they invested a lot in their academia in studying social computing and the whole process.
The other issue I would also like to understand from your studies is the relation between shadow business and these campaigns, whether in Britain, in Georgia, or in Moldova, for example, all those countries where there has been a marriage between this kind of media and the corrupted business of crony state capitalism.
Thank you very much.
- Yes, thank you so much for the question.
So I'm going to plug a friend of mine's book.
His name is Casey Michel, and he's just written a book called "American Kleptocracy" that is coming out I think in the next couple of weeks.
I highly, highly recommend it.
And it looks at some of that shadow funding and all of the corruption, frankly, that funds so much of the disinformation that we see.
The Internet Research Agency is owned by a Russian oligarch who's close with Putin.
If we cut off some of those streams of funding that are, you know, kept in shell companies and offshore accounts and things like that, we'd have a lot easier time shutting down some of these campaigns.
And you're absolutely right, sir, the Russian government has for a long time studied these social computing, social media campaigns, and perfected those tactics across eastern Europe.
I worked for a long time in Ukraine and frankly, a lot of the things that we have seen happen in Ukraine first, sorry, the things that we've seen happen in the United States first happened in Ukraine.
So the Internet Research Agency was first trained on Ukraine during the Euromaidan protests in 2013 and 2014.
Later, when we started to see this shift toward information laundering, which is the Russian tactic of choice these days, that was happening in Ukraine in 2019 when I was there covering their presidential election.
So I think it's really important to invest, just as Russia has done, in the study, not only of the technological advancements in this area, but also in the regional study.
I don't just say that because I have my degree in Regional Studies and Russian and Eurasian Studies, but I think it's so key to understanding how both our allies and our adversaries operate and think, that linguistic and cultural and sociopolitical context, I wouldn't be able to do my work without it, that's for sure.
So thanks a lot for the question.
- Our next question is a text question.
Can you describe the type of awareness we should build, for example, how to identify disinformation?
Are there other areas of awareness to build?
- Yes.
So a lot of people throw around media literacy, information literacy without really saying what that consists of.
It's not just about who publishes newspapers and how, you know, news outlets are funded, how they do their fact checking and reporting.
I think it also has to be about, in this day and age, how social media works.
Like, look at all we're learning from the Facebook papers about how Facebook has weighted certain types of interactions on the platform to push certain content toward us.
How, for instance, the funding and advertising behind those platforms affects their decision-making and how we're being emotionally manipulated in order to generate ad revenue for these multi-billion dollar corporations.
I think everybody needs to understand that from, you know, kids who are watching, you know, YouTube Kids videos on their parents' accounts to adults who are using it to keep in touch with their grandchildren and beyond.
That's critical.
And I also think, yes, we do need to understand as individuals how to do basic fact checking.
So I always tell audiences there's a couple of steps you need to take when you encounter something that seems a little fishy.
Either, you know, it's, you know, making you emotional or just seems wrong.
Maybe you don't know about this outlet that it's coming from or the account that it's coming from.
First, if it's its own website, I want you to check and see if that website has contact information.
Any good news organization is going to list an address and a phone number.
That's probably going to direct you to a switchboard or an operator, but still it's going to have contact information beyond a PO box or a standard internet form.
It's also probably going to have a masthead of the editorial staff.
So those are good indications that that's a bonafide outlet.
You can also check to see if that author or reporter has published anything before.
Now, everybody has a first time, right?
But they're likely to have, you know, some sort of social media presence or something like that that seems authentic if they're a real person.
You can also do what's called a reverse image search.
And this is another thing that I hope everybody takes home today.
If you see an account that looks fishy and you're on your Google Chrome browser, you can right click on an image and click search Google for that image.
And it will show you all similar images as well as images, like, that are earlier versions of that image.
So you can see if somebody stole that image to appropriate it or misappropriate it to their account.
Now this is getting a little bit harder with deepfake and AI generated images.
And that's a whole other lecture I could talk about.
But still, this is a great way to spot a fake account.
So doing those quick things, which should take you no more than, like, 60 seconds, that scan, as long as you have that emotional awareness too, will help equip you to spot sketchy stuff on the internet.
And I think that should be part of everybody's education, not just for kids, but for voting age adults.
I mentioned libraries.
Libraries are one of the most trusted institutions in the United States still.
They are local, people trust them, no matter their political party.
I would love to see some funding going to libraries to train voting age individuals and kids in information literacy tactics.
There's nobody better in our society to do that in my opinion.
Yes, sir?
- Thank you, good afternoon.
Thank you so much for being here.
Welcome to Cleveland.
I work at a local public policy think tank here, and we focus on health and service in public policy.
And so I actually have a two part question.
I also want to thank you, by the way, for dedicating your book to truth tellers.
That's pretty awesome.
(laughs) But so my question is from a think tank person to another think tank person.
What's the role of think tanks in this conversation about information and correcting disinformation?
And then beyond that, what's the role of advocates and activists in helping to share their truths and getting the truths out to their fellow citizens?
- Oh, that's a great question.
Thank you so much for that.
I get really frustrated with the think tank and academic community a lot of the time.
And in fact, my book got turned down by a lot of academic publishers who didn't think it was academic enough because I wanted to tell a story in a way that people would find interesting and accessible.
To quote my mother, who did the index for my book and reads a lot, she's a really smart person, NPR person, PBS person.
She said, Nina, most nonfiction books are boring, but yours isn't.
(audience laughing) And I was like, that's good.
That's the best review that I could possibly get.
So I think it's really important for think tank people like you and me to really come down off of that ivory tower and interact with people at the level that they're at.
Again, that's why I love doing events outside of the beltway, because it's a good reminder that not everybody is stuck in this Washington policy merry-go-round.
And so really getting in touch with people and explaining things in accessible ways is, I think, of the highest priority.
Remind me quickly your second question.
Activists.
So yes, this is a little difficult, right?
Because sometimes activists and advocates can have a partisan and usually do have a partisan bent.
So some people are just not going to trust them sometimes.
And it's difficult, but I think that means that as long as activist organizations and civil society organizations are being immaculate about their transparency and their funding, and owning their mistakes when they make them, people are going to trust them more.
That stuff matters.
And I think that's incredibly important in terms of trying to even out the information space that we're all navigating right now.
Thanks again for that question.
- Hello, I'd like a followup on the school board question.
Based on the actions of the National School Boards Association and some of their stances, the state school board associations of Pennsylvania, Ohio, I believe Wisconsin, and several other states have thought about leaving.
My question is: would you be willing to speak to your local school board about transgender issues, transgender assault, vaccinations, masks, critical race theory, and other topics, fully knowing that, as per the letter of the National School Boards Association to Merrick Garland and the FBI, the school board may say they are afraid of you and that you should be looked at as a domestic terrorist?
Thank you.
- Yeah, I mean, that is, it's a serious situation that we've got going on here.
And as somebody who hopes to start their family soon, I certainly don't want my children to be involved in those sorts of clashes.
I think education should be divorced from all of that, frankly.
And I think you make a really good point, sir, that the school board members are there just trying to do what's best, in most cases, for the students.
They didn't sign up, for this, you know, basically volunteer salary, to be protested, to be ridiculed at their homes, to have their children threatened.
So it's a very, very serious situation and I'm glad the FBI and Merrick Garland, I actually hadn't heard of that, have made such a declaration.
I think it's important.
And it's clearly another disinformation fissure.
If a school board, my local school board did invite me, I would be happy to speak about these topics to say, you know, this is something that we need to look at and it's domestic disinformation.
And it shouldn't touch our education system.
Thanks.
- [Man] Well, if you disagree with the school board's stance, would you be willing to be called a domestic terrorist?
- Would I be willing to be called a domestic terrorist?
No, (laughs) absolutely not.
And I think we have to be careful about, you know, local school boards throwing that stuff around.
Yes, sir.
- All right, thank you very much for being here today.
It's a very interesting discussion.
And misinformation, which you talked about, has become not only a local but a worldwide problem, one that is being widely discussed and is very hard to resolve.
Also, we are now experiencing hacking, which, as they say, there's nothing they can do about, which I don't believe.
And in a turnaround, I have to believe that our country is doing the same thing to other countries around the world, but that you don't hear about.
What can you tell us about what our country is doing reverse-wise?
- Yeah, those are all really important points.
So let's start with the hacking and cybersecurity, because these are interconnected issues, as Cody and I were talking about at lunch.
But often kind of get conflated.
So we can see cybersecurity events or breaches fueling disinformation sometimes.
And you may remember the hack and leak of the DNC in 2016, which I would argue altered the course of the election.
This, of course, was done by Russia, a group called Fancy Bear.
These events, where documents that were not meant to be public are released into the public and get taken out of context or, you know, are used to embarrass people, that's called malinformation.
And so these two can be closely linked.
We can do stuff about hacking, of course.
And I think the Biden administration actually has really started beefing up on cybersecurity.
The State Department just this week announced a new special envoy for cybersecurity.
The NSA and the intelligence community are working pretty hard on this.
And we've seen a lot of ransomware attacks this year that have been sponsored by Russia and China, which take these public systems, including hospitals and schools, hostage for money.
Russia can do much worse.
So can China.
When I was living in Ukraine, there were a couple of hacks into Ukrainian government systems, as well as the power grid.
And the NotPetya ransomware, which you may remember, spread across the world, even affecting Russian companies and causing billions of dollars of damage.
These are serious, serious issues that can actually affect people's lives.
And I think it's becoming extremely problematic.
And the government is starting to really beef up in terms of the directorates and the hires that they're making.
Is it enough?
No, not necessarily, because Russia and China are formidable enemies, and these so-called zero days, software vulnerabilities, are now on a black market where people who have the money can buy a vulnerability in Microsoft software or, you know, whatever software you're running on your government machine and hack into all of those machines.
If you want to read a book on the zero-day market, Nicole Perlroth, who works for the "New York Times," another friend of mine, wrote "This Is How They Tell Me the World Ends".
So very, very cheery topic.
So she's great on that.
Now, what does the US do?
Russia always likes to say that when we're talking about interference, political interference or election influence campaigns and disinformation, that the US gives as good as it gets.
In this instance, I do not agree.
Now we can talk about, you know, decades past.
That's a totally different story.
Right now what the US is doing in the public domain, apples to apples here, comparing those two.
If Russia says, you know, Voice of America and Radio Free Europe are the same thing as what we did in 2016, absolutely not, right?
The US claims those organizations.
We say they're funded by the US government.
And they have journalistic standards.
That's not the same as Russian trolls posing as Americans.
And in some cases, stealing real Americans' identities to influence dialogue in a US election.
What covert operations are happening right now?
Every nation is involved in spying.
Every nation is involved in hacking.
And that kind of is just the way that things go.
Is the United States using offensive cyber operations to protect itself?
Yes.
One of the more famous ones was when in 2018, the United States took the Internet Research Agency offline around the midterm elections.
Did that really do anything?
I think it was more of a, see, we can do this rather than enacting any actual harm because elections are just inflection points.
They're not endpoints to disinformation campaigns.
If we really wanted to make an impact, we would have taken them offline for months before that.
So I know that's kind of a wishy-washy answer.
Everybody's involved in the spying and the covert stuff.
I like to think that on the public diplomacy side of things, the United States is on the right side of the equation.
And if something came up where we weren't on the right side of the equation, I would be the first person to call us out.
(chuckles) Thanks.
- Thank you so much for being here.
My name is David Glasner, I'm the superintendent of the Shaker Heights City School District, which is a school district right outside of Cleveland.
And I'm here today with some students from our high school.
I was wondering if you could share, especially given what's been in the news recently around social media and some of the things you've talked about today, what advice or guidance would you give for students who are learning how to use these tools and for educators who are thinking about how to use media and social media in the classroom?
What advice would you give for those people in terms of preparing students for the future?
Thank you.
- Yeah, that is a wonderful question.
And one near and dear to my heart.
I have a lot of educators in my family.
My brother is a high school social studies teacher.
Both of my sisters-in-law are also teachers.
And frankly, I credit my high school AP history teacher for getting me into this work.
I was in the debate club, as I'm sure will surprise none of you.
And I also remember in high school AP Gov, we did Operation Civic Duty, where we had to register to vote, we had to attend jury duty or, you know, volunteer for local campaigns. All that sort of stuff really, really sparked my interest in politics and global affairs.
So clearly like that civics part of the equation is super important.
In terms of social media, I mean, we've got some high schoolers here today.
I'm sure you guys are really tired of hearing about how bad social media is, right?
Like, and there's some great stuff on there.
I mean, I love TikTok.
I spent probably an hour on TikTok last night in my hotel room.
And there's so many great things on social media that are educational, that bring people joy.
But there's also stuff on there that, as we've heard from the Facebook papers, that can cause some real psychological harm that has offline harm as well.
And my next book, which is coming out in April and is now available for pre-order if you want to pick it up, is called "How to Be a Woman Online".
And I'm going to speak to the young ladies in the room.
I've done focus groups with high school aged young women and college aged young women.
And the thing that strikes me comparing your generation to mine, and I'm quite a bit older than you, when I was in high school, I had multiple blogs.
I was on LiveJournal.
My MySpace probably exists somewhere still.
I was putting myself out there and I wasn't afraid to do it.
And now you guys are aware not only of if I misstep, what's this going to do to my college application, but also, especially for young women or people from marginalized communities and intersectional identities, you're worried about the abuse that you're going to get.
You're worried about, you know, people making fun of the way that you appear or the way that you dress or rating you, these sorts of things.
And that doesn't stop, unfortunately, when you're in your 30s and in the public eye.
And so I would say, claim that space for yourself.
You don't need to always be behind a locked account, because the world needs your voices too.
And I think it's really important that teenagers speak out.
We've seen so many times in history, even here in Cleveland, where young people have changed the course of events, right?
The course of world events. Your voices are really important.
So I think it's important for schools to teach people how to remain safe while still making their voices heard.
And for educators to say, you know, these are powerful tools, there are some bad things on them, learn about emotional manipulation, learn about how to spot fraud and other types of manipulation online.
But encourage your students to put themselves out there in a smart way that's not going to affect their college applications.
Because I truly do believe that there is some good in social media, that it can be a tool for empowering people and for building democracy.
And young people are part of the key to that.
So, solidarity.
(laughs) - Hi, I just wanna say thank you for coming.
Great everything, honestly.
So my question is, there are people who are deeply entrenched in these different disinformation campaigns, and they're very passionate, to the point, as we saw, of storming the Capitol.
Do you believe that there's any way to actually turn these people from disinformation back into the truth?
- Yeah, that's a question I get a lot.
And one that I think is probably near and dear to everybody's heart, right?
We've all got somebody in our lives that we wish we could talk to about these issues.
How do we do that?
And I think the first is to recognize their humanity, the fact that they are our loved ones and they're not just people who have gone off the deep end, right?
There's something that led them there.
And that's what you need to work to figure out in order to bring them back.
It shouldn't just be, you know, crazy grandpa Bob or grandma Betty's Facebook posts that you comment on and say, hey, here's a Snopes link saying why you're wrong.
Don't do that in public, that's not good.
Pick up the phone, send a text or a DM and say, hey, like I saw you posted this article.
Can you tell me why that interests you?
Get into a conversation.
Because again, there's something that led them there.
There's a kernel of truth at the heart of that disinformation that made them care about it.
With QAnon, for instance, a lot of people got into that conspiracy theory because they really care about child trafficking.
Redirect them to, you know, sources of information that are more credible about child trafficking.
Similarly with COVID vaccines, we've seen a lot of disinformation coming from the health community and, like, the exercise community.
Again, try to redirect people toward the good stuff, listen to their concerns, and have a conversation.
Don't just get in there looking to win the argument.
That's a recipe for disaster.
I think I'm getting signals that we are wrapping up.
Thank you, thank you.
(audience applauding) Thank you so much.
Delighted to inscribe your books if you'd like.
- Today at the City Club, we've been listening to "Disinformation and Democracy: Civic Discourse in the Digital Age" as part of our annual meeting, featuring Nina Jankowicz, global fellow at the Wilson Center and author of the book, "How to Lose the Information War: Russia, Fake News, and the Future of Conflict".
This forum is our Samuel O. Friedlander Memorial Forum on Free Speech.
Our deepest thanks to Nina Gibbons for her tremendous support.
Today's forum also is part of our Authors in Conversation Series, in partnership with Cuyahoga Arts and Culture and the John P. Murphy Foundation.
We welcome guests today at tables hosted by Cuyahoga County Community College, Eaton, Fifth Third, Friends of Dave Nash, Shaker Heights High School, and Alma and Bern.
We're so happy to have all of you here.
Make sure to join the City Club next week on Wednesday, November 3rd at 7:30.
We are back in person at The Happy Dog.
It will be the day after Cleveland's mayoral election, and we'll be welcoming back moderator Tony Ganzer with Ideastream Public Media, as well as a panel of local experts to discuss the priorities for the first 100 days of our new mayor in the city of Cleveland.
You can find out more about this Happy Dog forum as well as other upcoming City Club forums at cityclub.org.
And that brings us to the end of today's forum.
Thank you, Nina Jankowicz.
Thank you, members and friends of the City Club.
This program is now adjourned.
(audience applauding) (bell chimes) (energetic music) - [Narrator] For information on upcoming speakers or for podcasts of the City Club, go to cityclub.org.
- [Narrator] Production and distribution of City Club Forums and Ideastream Public Media are made possible by PNC.