Airport face scans raise privacy concerns

The Department of Homeland Security is scanning the faces of some passengers on international flights at six airports in an attempt to catch immigrants overstaying their visas. While government officials say they are not keeping biometric data of U.S. citizens, privacy advocates are skeptical. Associated Press reporter Frank Bajak joins Hari Sreenivasan from Houston.



  • SREENIVASAN:

    Sometime soon, you could have your face scanned as you board an overseas flight. It's already being tested at a handful of American airports with two airlines, Delta and JetBlue, directly participating. The Department of Homeland Security sees the system as a tool to catch immigrants overstaying their visas.

    But as the program expands, it will take photos not just of foreigners but U.S. citizens as well. This raises security, accuracy and privacy questions.

    Joining me now from Houston to discuss this is Associated Press reporter Frank Bajak.

    Frank, first of all, just lay out this experiment or lay out how this is rolling out for us.


  • BAJAK:

    Well, interestingly, it's not really an experiment, because the Department of Homeland Security says it intends to go ahead with this at all high-volume airports beginning in 2018.

    Now, this is a program that started out as applying only to what they call non-immigrant foreigners. And that's 50 million people who visit the United States annually to make sure that they're not overstaying visas and to keep better track of that. Congress has not explicitly approved this for U.S. citizens, but Customs and Border Protection, which is part of Homeland Security, says that they can only do this if they do it for everyone, including U.S. citizens.


  • SREENIVASAN:

    What happens to this information? Say I take my photo, or the airlines take my photo, or CBP gets my photo. Where does it go?

  • BAJAK:

    First of all, every U.S. citizen now has in their passport a little chip that holds their biometric information. It has their biographical information, where they live, their birth date, et cetera, and it has a photo encoded in the chip. The government can use that information to compare against, say, outstanding arrest warrants, basically to check whether someone is wanted for a crime.

    The government says it's not going to retain this information, it's not going to keep a record of it, but the Border Patrol official I spoke to said they're not precluding retaining that information in the future.


  • SREENIVASAN:

    What about the accuracy of this image recognition? Let's say you get caught in a kind of face-scan trap and there's a flag, but it's a case of mistaken identity.

  • BAJAK:

    Well, facial recognition technology has been steadily improving over the years, and the best of it is now at 90 to 95 percent accuracy. However, the experts I talked to said there are apt to be mismatches.

    If there is a mismatch, what can happen? First of all, it can cause travel delays. If you're boarding an international flight with 500 passengers, you're looking at up to 50 people who will get stopped and have to go through a manual check.

    But the other issue with a mismatch is this: what if I've got an identical twin who is a wanted fugitive? Is that going to flag me?


  • SREENIVASAN:

    I know in the past there have been concerns that these algorithms are prone to misidentifying women and people of color because of the samples used to train them.

  • BAJAK:

    Indeed, that's true. In fact, a researcher at the MIT Media Lab who looked into this says there is a bias against people of color because of the sample selection used in testing. But I think the bigger question that concerns the privacy experts is that it's now very easy to compare the face prints of tens of thousands of people against other databases instantaneously, and I think this is the worry.


  • SREENIVASAN:

    Frank Bajak of The Associated Press, joining us from Houston today. Thanks so much.

  • BAJAK:

    Thank you.
