Amazon Exec Defends Facial Recognition Sales to Law Enforcement, Says Would Sell to Foreign Governments

Andy Jassy, CEO of Amazon Web Services, spoke to FRONTLINE for our new documentary, "Amazon Empire: The Rise and Reign of Jeff Bezos."

February 18, 2020

In recent years, Amazon has faced growing criticism from civil rights groups, AI researchers and even some Amazon employees and shareholders for selling its facial recognition technology to law enforcement and discussing it with U.S. government agencies. Some of the same groups have also raised concerns that the technology could in the future be available to foreign governments, including authoritarian regimes.

In Amazon Empire: The Rise and Reign of Jeff Bezos, FRONTLINE examines how Amazon’s chief executive built a company that has become one of the most influential economic and cultural forces in the world, with the power to shape the future of technology. The filmmakers spoke to six top Amazon executives and nine former insiders about the company’s power, dominance, business and labor practices, and its controversial sale of technologies like facial recognition to law enforcement.

Andy Jassy created and runs Amazon Web Services (AWS), which has become the world’s leading cloud computing platform. In 2016, AWS unveiled Rekognition, its facial recognition software.

Since the advent of facial recognition technology, Amazon and other tech companies have been challenged on its moral and ethical implications — from issues of accuracy and privacy to potential sales to organizations with poor track records on civil liberties. There are currently no clear regulations governing facial recognition technology in the United States, although a handful of cities and states have imposed either partial or complete bans on use of the tech by law enforcement.

That has left the companies to make their own rules. In April 2019, Microsoft said it had agreed to sell its facial recognition tech to a U.S. prison, but turned down a law enforcement agency’s request to install the tech in police officers’ cars and body cameras over human rights concerns. In January, New Jersey blocked its police officers from using a facial recognition app made by Clearview AI, which the company told The New York Times was being used by over 600 law enforcement agencies.

In an interview with FRONTLINE in September, Jassy said that Amazon would sell facial recognition tech to foreign governments. However, he said, “There’s a number of governments that are against the law for U.S. companies to do business with. We would not sell it to those people or those governments.”

FRONTLINE correspondent James Jacoby pointed out that there are some countries with which American companies are allowed to do business that nonetheless have histories of human rights abuses, such as cracking down on dissidents. He asked Jassy about Amazon’s stance on its services or infrastructure being used to carry out such policies. Jassy said, “Yeah, again, if we have documented cases where customers of any sort are using the technology in a way that’s against the law or that we think is impinging people’s civil liberties, then we won’t allow them to use the platform” — meaning all of AWS, not just Rekognition.

Watch: Amazon Empire: Andy Jassy Interview

The Washington County Sheriff’s Office started using Amazon’s facial recognition software in 2017. Police officers who spoke to FRONTLINE said it’s a valuable tool to identify suspects quickly.

But civil rights groups like the ACLU say there’s little transparency about how many police departments and sheriff’s offices are using the technology, how they’re using it, or what policies they’re following.

“It is foreseeable that if Amazon’s facial recognition product continues to be used by police in the United States and elsewhere, that it will be abused, and that it will be used to discriminate, to oppress and to harm,” Matt Cagle, technology and civil liberties attorney at ACLU of Northern California, told FRONTLINE.

Studies of Amazon’s facial recognition software found it was prone to mistakes with darker-skinned faces. Amazon has questioned the methodology of the studies. (A 2019 federal study of facial recognition algorithms submitted by 99 companies — which didn’t include Amazon — found many of those developed in the U.S. misidentified people of color and women more often.)

Read: Artificial Intelligence Can Be Biased. Here’s What You Should Know.

In 2019, the ACLU and 85 civil, human and immigrant rights organizations called on Amazon, Microsoft, and Google to commit to not selling their facial recognition tech to the government. 

Jassy and Amazon have said that’s the wrong approach.

“We’ve had the facial recognition technology out for use for over two-and-a-half years now, and in those two-and-a-half years, we’ve never had any reported misuse of law enforcement using the facial recognition technology,” Jassy said. “I think a lot of societal good is already being done with facial recognition technology,” he added, noting that the tech had been used to find missing children and trafficking victims. “Let’s see if somehow they abuse the technology.”

Jassy said it’s easy for customers and users to file complaints about the misuse of software.

But it’s unclear how people who may not be aware of facial recognition tech being used on them would file a complaint. For example, a man facing drug charges in Florida in 2016 only learned that facial recognition technology had been used to identify him as a suspect when he deposed the detectives assigned to his case. The Florida Times-Union reported that the use of facial recognition software was not mentioned in his arrest report.

Cagle said Amazon is “placing the burden on regular people, you and me, to decide and to identify and to do the research about whether somebody is being secretly monitored and information is being used against them by police.”

He added, “Amazon is waiting until people are really harmed to make decisions about this technology, and that is really reckless.”

In May 2019, Amazon shareholders voted against two non-binding proposals — one that would have barred sales of the facial recognition technology to government agencies, and another that would have studied the tech’s impact on rights and privacy.

Jassy said in June that he wished the federal government would “hurry up” and create regulations for facial recognition. (Late last year, Bezos said Amazon’s public policy team was working on regulations that the company would pitch to lawmakers.) Jassy repeated his call for the federal government to come up with guidance in the conversation with FRONTLINE.

He added, “I think at the end of the day with any technology, whether you’re talking about facial recognition technology or anything else, the people that use the technology have to be responsible for it, and if they use it irresponsibly, they have to be held accountable.”

Anima Anandkumar worked as the principal scientist for artificial intelligence at Amazon. After leaving the company, she signed a letter in early 2019 along with dozens of AI researchers and experts, asking Amazon to stop selling facial recognition technology to law enforcement agencies.

Watch: Amazon Empire: Anima Anandkumar Interview

Anandkumar told FRONTLINE that facial recognition isn’t “battle-tested” to work in the types of challenging conditions that law enforcement might use it in — low-light, grainy or low-quality images. She said, “I think it’s also the responsibility of the company to ensure that [facial recognition and related software] are well-designed and getting used in the context that it’s designed for.”

She said government regulations often lag behind the fast pace of technological development because policymakers usually don’t understand the technology as well as the companies developing it. That puts the onus on companies like Amazon, she said, to “proactively take a stance and think more deeply about the ethical concerns, and try to ensure fairness.”


Priyanka Boghani, Digital Reporter & Producer

Twitter: @priyankaboghani
