Alaska Insight
Alaskans address violence against Asians and biases in tech
Season 4 Episode 24 | 26m 32s | Video has Closed Captions
Racism permeates our existence and tech. How can we combat this racism and violence?
Racial bias permeates policy, justice systems, and our technology. Artificial intelligence uses data to make predictions about who we are and our behaviors. That bias can find its way into other parts of our lives. As violence against people of color has been on the rise, how can we combat harmful stereotypes and racism that may start in subtle ways but can lead to violence?
Alaska Insight is a local public television program presented by AK
Lori Townsend: Racial bias permeates government policy and justice systems, and it is ubiquitous in the technology that we engage with on a daily basis.
Artificial Intelligence uses data to make predictions about who we are and what we'll most likely do.
And that bias can gradually find its way into other parts of our lives.
Attacks against people of color, and especially in recent months against Asian Americans, have been on the rise.
How can we combat harmful stereotypes and racism that may start in subtle or covert ways, but can lead to violence?
We'll discuss it tonight on Alaska Insight.
The Independent Lens documentary Coded Bias premieres on Monday at 9pm on Alaska Public Media Television.
It chronicles the work of a black computer scientist who discovers that the facial recognition technology and algorithms that increasingly influence our lives carry the same bias as the people and data that created them.
Unknown: After I had the experience of putting on a white mask to have my face detected, I decided to look at other systems to see if it would detect my face if I used a different type of software.
So I looked at IBM, Microsoft, Face++, Google.
It turned out these algorithms performed better on the male faces in the benchmark than the female faces.
They perform significantly better on the lighter faces than the darker faces.
If you're thinking about data, and artificial intelligence, in many ways, data is destiny.
Data is what we're using to teach machines how to learn different kinds of patterns.
So if you have largely skewed datasets that are being used to train these systems, you can also have skewed results.
So this is-- When you think of AI, it's forward looking.
But AI is based on data.
And data is a reflection of our history, so the past dwells within our algorithms.
This data is showing us the inequalities that have been here.
I started to think this kind of technology is highly susceptible to bias.
Lori Townsend: Data is destiny.
But it also reflects our past.
How then do we reconcile past inequities with current technology and address racism outside of the virtual world at the same time?
Here to help us understand some of the implications are guests Kenrick Mock, a professor of Computer Science and interim Chair of the College of Engineering at the University of Alaska Anchorage; E.J.R. David, a professor of Psychology at UAA; and Lynette Pham, an Anchorage community organizer.
Thanks, all of you for joining me this evening.
I just must mention also that EJ is one of the hosts of Alaska Public Media's program Hometown, Alaska.
So in the interest of full disclosure, we wanted folks to know that you're one of our hosts, which is fantastic, EJ.
Kenrick, I want to start with you. The term artificial intelligence seems inaccurate and, to my mind, somehow less ominous than this documentary reveals it to be.
Intelligence is collected knowledge, but is the distinction in how AI learns the artificial aspect of it, that it's not really learning, but simply being fed a ton of data to categorize? Or how would you describe learning in this sense, in this AI sense?
Unknown: Yeah, so there are lots of different AI algorithms.
Some of them don't do learning.
So, for example, some earlier algorithms to, say, play chess or to do your taxes might not have learned.
But most of the algorithms that we see today that are considered AI, do some form of learning.
And they learn based on whatever data you feed it.
And so the way these things learn is you have some big set of data, for example, faces, and then the computer is taught these faces are this person, and it's able to learn from those examples.
And so it learns based on what it's presented, as in the Coded Bias film.
And you know, people think of machines as, a machine can't be biased.
It's not like a person, you know, a machine can't be racist, but based on the data that it's fed, it can produce biased results.
Lori Townsend: Kenrick, following up there, Coded Bias, the documentary itself, references nine companies that are building the future of artificial intelligence; six of them are in the US, three are in China.
The ways to program are to either give a series of instructions, like a recipe, as they referenced it, or feed it a lot of data.
One of the scientists quoted said, we don't understand why it works.
And it has errors we don't understand.
And because it's machine learning, it's a black box to programmers.
That really struck me.
What does this mean?
And how are these systems then able to be controlled?
Unknown: That's right.
And so a number of these machine learning algorithms, we don't know exactly what's happening inside them.
So we know how they're trained.
And mathematically, you know what's happening when you present an input and get an output, but what values are used in the middle is a little mysterious.
And that's the way that they learn.
So when we're using a neural network, for example, the neural network adjusts its weights based on the training examples.
And exactly what those values are, and how it comes to a result is a little bit mysterious.
And so the way that we can evaluate them is we really have to look at what outputs are produced, and if those outputs make sense from the inputs.
Lori Townsend: But there's quite a bit, then, that's unknown about how these systems are actually learning?
Unknown: Yeah, I mean, the part we don't know is what kind of model is built.
And so it's coming up with the model to make these predictions.
And how the machine comes to that model is the black box.
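Kenrick's point, that a model trained mostly on one group fits that group well and quietly fails the other, can be sketched in a few lines. The example below is purely hypothetical (invented data, a toy one-threshold "learner", made-up group labels A and B), not any real face-recognition system: it trains on 90 samples from group A and 10 from group B, whose pattern differs, and the learned rule scores perfectly on A but poorly on B.

```python
# Hypothetical sketch: a tiny "learner" picks the one threshold that best
# fits its training data. Group A dominates the training set, so the
# learned threshold fits A well and systematically misclassifies part of
# group B, whose relationship between feature and label is different.

def train_threshold(samples):
    """Pick the threshold with the highest overall training accuracy.
    Each sample is (feature_value, label, group)."""
    candidates = sorted({x for x, _, _ in samples})
    best_t, best_acc = None, -1.0
    for t in candidates:
        acc = sum((x >= t) == label for x, label, _ in samples) / len(samples)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def group_accuracy(samples, t, group):
    """Accuracy of the learned threshold on one group only."""
    subset = [s for s in samples if s[2] == group]
    return sum((x >= t) == label for x, label, _ in subset) / len(subset)

# Group A (majority): positives have feature >= 5.
# Group B (minority): positives have feature >= 2, a different pattern.
data = [(x, x >= 5, "A") for x in range(10) for _ in range(9)]  # 90 samples
data += [(x, x >= 2, "B") for x in range(10)]                   # 10 samples

t = train_threshold(data)
print("learned threshold:", t)                       # 5
print("accuracy on group A:", group_accuracy(data, t, "A"))  # 1.0
print("accuracy on group B:", group_accuracy(data, t, "B"))  # 0.7
```

The overall training accuracy is 97%, which looks excellent, yet every error falls on the minority group. That is the shape of the problem the panel is describing: an aggregate metric hides the skew.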
Lori Townsend: Interesting.
EJ, I want to turn to you now.
Algorithms make predictions for the future based on tons of data, as we've been discussing, there is an old expression related to programming "garbage in, garbage out".
Is this essentially what you see happening here that the garbage of bias and racism is just being baked into this data?
Unknown: Oh, yeah, absolutely.
I think, you know, this documentary is evidence of how widespread, how strong and how insidious racism and other forms of oppression, like sexism, are. You know, what it all says is that our society is so thick with racism and sexism that even the most seemingly neutral, objective or fair processes and systems can still be infiltrated by racism and sexism.
This shows us how racism and sexism can be hardwired into our technology, into our software, into our computer systems.
You know, and for historically oppressed peoples, this isn't surprising or shocking, because we've known for generations that oppression has been hardwired into our other systems. You know, racism and sexism have been hardwired as laws, as policies, as standard operating procedures in our, you know, systems and institutions for hundreds of years.
You know, but for others who might think that racism and sexism are no longer issues in this country, or not as big of an issue anymore, this might be a wake up call.
I hope this sparks a sense of urgency.
And that it makes people very concerned.
I hope they realize that, you know, if racism and sexism can seep into something fair or neutral, or objective, like computer systems, then what more with the other systems in our society like education, health, policing, the justice system, the government, right?
Like how much more prevalent, powerful, embedded and dangerous racism and sexism are in the systems and institutions that significantly shape our lives?
Lori Townsend: Well, following up there, where are some of the most egregious areas where you're seeing where it affects, especially people of color?
Is it access to credit, housing, employment?
Where are some of the big, biggest red flags for you?
Unknown: Well, it's all over.
Right.
I mean, it's, it's everywhere.
And I think it varies depending on, you know, what groups you're in. You know, for example, with African Americans, you see a lot of that when it comes to policing, when it comes to the justice system, when it comes to incarceration. But right now, for example, you know, what's been happening over the past year is the anti-Asian violence, and a lot of people seem shocked with the rise of anti-Asian racism, you know, even calling it un-American. But I think we need to be very clear that anti-Asian racism is very American.
You know, it isn't new; it's been going on for at least 150 years. You know, from the Page Act of 1875, which banned Chinese women from immigrating to the US because they were seen as prostitutes and as disease carriers, you know.
To the colonialism of the Philippines to the assaults, murders and lynchings and bombings of Filipino farmers in the early 1900s.
Internment of Japanese Americans during World War II, you know, the United States wars in Philippines, Cambodia, Vietnam, Korea, that killed millions of Asians, right, the US military bases established in Asian countries, you know, where American soldiers exploited and sexualized and objectified Asian women, you know, acting like they're entitled to have control over Asian bodies.
To the Muslim ban to the mass murders at a Sikh temple, you know, just a few years ago in Oak Creek, Wisconsin, you know, anti-Asian racism is is an American tradition.
You know, violence that stems from white supremacy isn't new to Asian Americans, you know, and as a matter of fact, it is white supremacist violence that brought many of us here, you know, as colonized peoples, as refugees of American wars, as immigrants looking for ways to survive because the U.S. stole our resources and exploited and ravaged our homelands and made living and surviving there very difficult.
So yeah, anti-Asian racism isn't new, white supremacist violence against Asians isn't new, you know; it's quintessentially American and it's everywhere.
Lori Townsend: Kenrick, I want to follow up quickly there, and then we'll go to Lynette. The bias in the algorithms, which we know exists, how do you see it amplifying racism?
And how can we combat that?
Unknown: I mean, it's amplifying racism because we haven't been doing anything about it. And what I think we can do about it is, well, first of all, this film provides a lens, raising awareness about this problem.
What we can do is we can educate our computer science professionals.
And so when they're developing the software, at least they have some awareness that these are happening.
We also need more diversity in our developers.
And so we need more Black, Asian, a diverse pool of software professionals, not just white males that came up with these face recognition algorithms.
And then there's some discussion about government oversight.
And so, just like we came up with a vaccine for the coronavirus: the vaccine was developed last year, but we didn't get it until recently, while there was all this testing going on.
And so for these AI algorithms, there's no testing like there is for a vaccine.
And so one idea is to have some oversight, like the FDA.
So if someone comes up with an AI algorithm that's going to impact society and could possibly disenfranchise a whole segment of the population, then it has to go through some kind of review with the diverse, with a diverse panel.
Lori Townsend: Hmm, yeah, that sounds like a very constructive idea, and one that needs to be further explored.
Thank you for that.
Lynette, the rise in attacks on Asian people in the US, as EJ was discussing, the recent horrific shooting in Atlanta, the images we've seen in recent news stories of elders being knocked to the ground on city sidewalks.
Do you hear reports in Alaska, that racism has increased against Asian people in this last year as it has in other parts of the country?
Unknown: Thank you for that question.
And opening up this discussion to us.
So I think Alaska is not any different from most places, when it comes to racism and when it comes to discussing something like racism or experiencing it.
I think as a Filipina and Vietnamese woman, racism is, is part of my daily life.
It's part of my experience in this world.
And you know, I can't speak for the Vietnamese or Filipino community, but we have a large immigrant community in Alaska.
And I think that maybe racism hasn't shown up in violent attacks, but watching how systemic racism further oppresses our people, it keeps us from having the same kind of opportunities as other people who live here. And, you know, it goes from language access, to just living in a society that accepts you, to going to the grocery store and having people look at you weird, or follow you around the store, or harass you, or even, while you're driving, people saying racist things to you.
And, and I think racism in Alaska shows up, you know, to everyone, not just the Asian community.
As we have seen over the summer.
I think that actually talking more about anti-racism led me into more conversations where I have experienced racism.
Lori Townsend: Reading the comments of the report that came in about people reporting attacks was so disturbing, to read about the things that had happened to people, as you mentioned, in stores, just walking down the street, the attacks that have happened. Are you aware, Lynette, of groups?
Or are you tracking groups in the state that foment this kind of racist behavior?
Or is most of this type of rhetoric coming from groups outside the state?
Do you know?
Unknown: Um, I had participated in planning several of the peaceful protests and rallies this summer.
And yeah, the aggression and anger towards Black, Indigenous and people of color organizers came directly from Alaskans. They showed up to rallies with guns, and blamed us for wanting to promote violence, when we were just asking for our humanity, when we were just asking people to treat us with equality.
And so, I mean, I don't know the names of these groups.
But I mean, they are, they are organized, they're speaking to each other.
And all I can say is that racism in Alaska, it just puts us farther back.
But other groups working on anti-racism that I participate in are ... Alaska, Pacific Islander Desi Asian Americans, and they work directly with the Asian community to have conversations about anti-racism in our communities.
And we work in solidarity with the Black community, you know, and the NAACP and the BLM chapter, they have been doing this work for anti-racism for so long.
And we can only help with their work and stand in solidarity with them, so that we can stop racism, or prevent racism from showing up in our daily lives.
Lori Townsend: Thank you, Lynette.
EJ, I want to turn back to you.
How do you connect the language and bias around COVID-19 to the disturbing incidents we're seeing around the US?
How has that connection been made?
Unknown: Oh, yeah, definitely.
You know, there's actually a peer reviewed study showing that implicit bias against Asians increased after March 2020, when political leaders and other authority figures insisted on calling the Coronavirus, the Wuhan virus or the Chinese virus or kung-flu.
Reports of anti-Asian violence since then have gone up 150%.
You know, and so, but in addition to, you know, the anti-Asian hate in general, there also seems to be a gender component to this racism as well.
You know, so out of, I think it's about, like, 3,800 reported incidents of anti-Asian violence over the past year, right, 68% of them were committed against Asian women.
Right.
And what happened in Atlanta a few days ago, when Soon Chung Park, Hyun Jung Grant, Suncha Kim, Xiaojie Tan, Daoyou Feng, Yong Ae Yue, Delaina Ashley Yaun and Paul Andre Michels were murdered.
You know, that's a tragic example of how dangerous gendered racism is.
How deadly the intersections of racism and sexism is for Asian women in particular, but for women of color in general.
You know, and again, this is not just an isolated incident, and, and this did not just come out of nowhere, right?
This is a product of over a century of fetishizing, objectifying and dehumanizing Asian women.
And here in Alaska, you know, we're quite familiar with gendered racism in our state. We see this with, you know, the high numbers of missing and murdered Indigenous women, the high rates of violence against Native women.
So we're very familiar with gendered racism.
You know, so what happened in Atlanta can and absolutely does happen here also.
You know, in Alaska, we have a big Asian population.
You know, after white folks and Native folks, Asians are the third largest racial group here.
We are the third largest immigrant group in Alaska.
I mean, not the third, we are the largest immigrant group in Alaska.
Right.
So we're unique, in a lot of sense.
But you know, in a study that I conducted with a couple of my colleagues at UAA, Dr. Gabriel Garcia and Dr. Joe ..., in 2019, we found that 54%, you know, that's one out of two Asian Americans in Alaska, experienced racism recently, right.
And so Asian women in Alaska in particular, experienced unique challenges.
Asian women make only about 54 cents to every dollar that a white man makes in Alaska.
Right.
And again, that's just, you know, this wage gap is just really a sign of how devalued they are in this country in general.
But in Alaska, in particular.
Lori Townsend: Kenrick, we were talking about machine learning as a black box earlier, I'm going to ask a question from the documentary itself, how do we get justice in a system where we don't know how the algorithms are working?
How to tackle this, you had one great idea about a sort of a review board?
What are some other things that can happen?
Unknown: That need to happen?
It really has to be the people.
So as a computer scientist, I, I fall into this trap where I want to rely on the technology to solve problems.
And actually, watching the documentary, that was kind of going through my mind: what kind of algorithm, how can we adjust the algorithm to make it better?
But I think really, the solution isn't a technological solution.
It's really a people solution.
So if we have diverse viewpoints, from the developers, from people that are using the technology, then we can nip these problems before they happen.
So we can address these before they really become problems.
And so really, I think that's the best solution for this.
And for other future types of problems where AI bias could come into play.
Not really a technological one.
Lori Townsend: One of the mathematicians featured in Coded Bias wrote a book called Weapons of Math Destruction.
She said, algorithms are increasingly influential and touted as objective truth, but are used as a shield for corrupt practices.
Can you explain what she means by that, that corrupt practices?
Unknown: Yeah.
So mathematically, we have something that's kind of a beautiful truth.
And what we can do is fall back and say, oh, I have some algorithm, it's maybe, you know, 100% accurate.
And sure, it's 100% accurate when you feed it this particular data set and you get this particular outcome, but where we need to be careful is: is that the right thing we should be measuring?
So maybe it should really be, you know, something else.
And so where I've seen an example of this is actually in my own work with things like recommendation systems.
So Facebook, social media: you know, you read a bunch of posts, and then the system presents you with articles that you might be interested in.
And we can measure the accuracy, and we can get really close to 100% accuracy in terms of what someone wants to see.
And as a computer scientist, you know, we can high-five ourselves: hey, we're doing great, we're almost 100% accurate.
But then what we've created is a system where someone only sees similar kinds of content.
And now we get someone that gets a really slanted view of the world, because, you know, we've got these algorithms where we think they're doing really great, but, you know, maybe from a different viewpoint they're not being so great.
Mm hmm.
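The feedback loop Kenrick describes, a recommender that is "accurate" because it only shows more of what was already clicked, can be sketched with a toy simulation. Everything here is hypothetical (an invented topic catalog, a naive most-clicked strategy), not any real platform's algorithm.

```python
# Hypothetical sketch of the filter-bubble feedback loop: a recommender
# that optimizes for "show more of what the user clicked" quickly narrows
# what the user ever sees, even though every recommendation is "accurate."

from collections import Counter

CATALOG = ["politics", "sports", "science", "arts", "local"]

def recommend(history, n=3):
    """Recommend the n topics the user has clicked most often (ties broken
    by catalog order, since Python's sort is stable)."""
    counts = Counter(history)
    ranked = sorted(CATALOG, key=lambda topic: -counts[topic])
    return ranked[:n]

# The user starts with one click on each topic, then always clicks the
# top recommendation. Watch the variety of topics collapse.
history = list(CATALOG)
for _ in range(10):
    recs = recommend(history)
    history.append(recs[0])  # user clicks the top recommendation

print("topics seen in the last 5 clicks:", set(history[-5:]))
# prints: topics seen in the last 5 clicks: {'politics'}
```

After a few rounds the user only ever sees one topic. Each individual recommendation was a correct prediction of what the user would click, which is exactly the trap: the metric being optimized says nothing about the breadth of what the person is shown.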
Lori Townsend: All right.
Thank you for that.
Lynette, I want to turn back to you now, we're getting short on time.
How are you working to support the Asian community here?
And what are some ideas that you have that you would want other Alaskans to do?
What should people do to help?
Unknown: Thank you for that question.
And I mean, something that I do is just working on women's rights and working on LGBTQ+ rights.
That includes women's rights, and just overall, the conversation about feminism and like EJ said earlier about this violence directed towards women.
Well, Alaska has it, you know. Anchorage and Fairbanks are in the top 10 most dangerous cities for women, and Anchorage is number four for the highest number of missing and murdered Indigenous women and girls.
And so we see this violence against us, violence towards women based on their race, you know, specifically Black, Indigenous and Asian women, who are experiencing these things.
So there are so many organizations out there and so I can't just you know, say what I'm doing.
But all of these orgs, like Awake and STAR, are supporting women's rights and standing up for them, and houseless shelters.
And I would say that what people need to do is support these organizations that are committed to stopping violence against women, and work on being anti-racist.
So that, just, you know, my existence isn't an excuse for someone to be violent towards me.
Lori Townsend: Yes, thank you.
We'll have to leave it there.
I so appreciate your words, and that you could join us today.
Lynette, Kenrick and EJ, thank you so much.
That is it for this edition of Alaska Insight.
Remember that you can watch the premiere of Coded Bias right here on Alaska Public Media Television on Monday evening at 9pm.
It's also available to stream anytime on the PBS video app and at pbs.org.
Remember to tune in daily to your local public radio station for Alaska Morning News and Alaska News Nightly every week night.
Join important conversations on Talk of Alaska every Tuesday morning and visit our website alaskapublic.org for breaking news and reports from across the state.
We'll be back next Friday.
Thanks for joining us this evening.
I'm Lori Townsend. Good night.
