How should Facebook change to protect privacy?

Facebook and its founder, Mark Zuckerberg, will face a grilling on Capitol Hill Tuesday. The company has been in damage control mode since news broke about a major breach of Facebook users’ personal information. Hari Sreenivasan reports and William Brangham talks with Zeynep Tufekci of the University of North Carolina about the privacy concerns and how the social media giant could change.

  • William Brangham:

    Facebook and its founder, Mark Zuckerberg, are expected to face a grilling on Capitol Hill starting tomorrow.

    Zuckerberg, who began meeting with lawmakers today, will appear before Senate committees on Tuesday and a House committee on Wednesday. The company has been in damage control mode since news broke about a major breach of Facebook users' personal information.

    Zuckerberg apologized again today in remarks released in advance of his testimony, and he said the company simply didn't take a broad enough view of its responsibility until now.

    He also pointed to other changes being made today. For example, users found messages showing them how to find out more about the apps they use on Facebook, what information is being shared by those apps, and how to remove them.

    We're going to hear from a critical voice about all this in a moment.

    But, first, Hari Sreenivasan reminds us about what's happened up to now and what's at stake.

  • Hari Sreenivasan:

    In 1994, it was big tobacco CEOs in front of Congress. In 2010, it was big oil CEOs answering questions after the BP Gulf oil spill. In 2011, the CEOs of major banks were at the tables, but that will change for a big player of the tech sector this week, when Facebook CEO and founder Mark Zuckerberg testifies before lawmakers on Capitol Hill tomorrow and Wednesday.

    It comes after the company admitted it had not adequately protected the private data of as many as 87 million users. In fact, Facebook says that information was improperly sold to a political consulting firm, Cambridge Analytica.

    Then, in turn, the data was eventually used by candidate Donald Trump. Cambridge Analytica says its key to success is the use of psychographics, breaking people down based on their personality traits and attitudes.

    At a conference in 2016, their CEO, Alexander Nix, explained that voters shouldn't just be targeted based on race, gender or geography.

  • Alexander Nix:

    Because it's personality that drives behavior, and behavior that obviously influences how you vote.

  • Hari Sreenivasan:

    Here's the key part of his pitch as it relates to why Facebook is on Capitol Hill.

  • Alexander Nix:

    By having hundreds and hundreds of thousands of Americans undertake this survey, we were able to form a model to predict the personality of every single adult.

  • Hari Sreenivasan:

    You might not remember taking that survey, but about 270,000 people downloaded a Facebook app and took a personality quiz called This is Your Digital Life.

    The terms of that app also allowed it to download the information of all the survey takers' friends, which meant the information of as many as 87 million people could have been harvested.

    Here's Alexander Nix again with an example of how psychographic targeting works when crafting gun rights messages.

  • Alexander Nix:

    For a highly neurotic and conscientious audience, you're going to need a message that is rational and fear-based or emotionally based. In this case, the threat of a burglary and the insurance policy of a gun is very persuasive.

    Conversely, for a closed and agreeable audience, these are people who care about tradition, and habits and family and community. This could be the grandfather who taught his son to shoot, and the father who would in turn teach his son. Obviously, talking about these values is going to be much more effective in communicating your message.

  • Hari Sreenivasan:

    The Trump campaign employed such microtargeting on an unprecedented scale, creating nearly six million versions of different advertisements for different audiences. Facebook says it is changing its policies so leaks of personal data through apps can never happen again.

  • Mark Zuckerberg:

    This was a major breach of trust, and I'm really sorry that this happened. You know, we have a basic responsibility to protect people's data, and if we can't do that, then we don't deserve to have the opportunity to serve people.

  • Hari Sreenivasan:

    It is not the first or second or third time the company has made promises to safeguard the information of its users. In 2007, Zuckerberg apologized for oversharing the personal information of users through a product called Beacon.

    In 2009, it revealed, without warning, information users thought was private. These and other actions led to a 2011 consent decree by the Federal Trade Commission to protect users' privacy.

    Until now, the company has taken a defensive posture, saying that all users are aware of what they're sharing and with which apps, that users are in essence granting informed consent. But most users never bother to go into the privacy settings and adjust the levels of visibility on the information they generate.

    Accessing information about users' tastes and preferences is core to Facebook's business. In a nutshell, here is how the company makes money.

    When we like or love or share a video or an article or a brand on Facebook, we're generating information that fills in a profile. Facebook helps advertisers reach very specific audiences based on those tastes and preferences.

    While Facebook users may go online to share information with their family and friends, the data they generate and, in turn, the advertisers who target based on that data, are what helped the company earn $40 billion in revenue just last year and have the company valued at nearly half-a-trillion dollars.

    For the "PBS NewsHour," I'm Hari Sreenivasan in New York.

  • William Brangham:

    Joining me now to discuss these latest Facebook scandals is Zeynep Tufekci. She's an associate professor at the University of North Carolina at Chapel Hill, and she studies the way we interact with technology.

    Professor, welcome to the "NewsHour."

    You know, Mark Zuckerberg's here on Capitol Hill. He's been meeting with senators. He's going to testify tomorrow.

    Is there anything that he could say that would convince you, who have been a very strong critic of the company, that Facebook gets this problem and they're going to solve it?

  • Zeynep Tufekci:

    Well, the problem we're facing isn't whether or not Facebook gets this problem, or even what its intentions are.

    The problem is, the way Facebook set up its business, and the way it's used to and allowed to harvest our data and use it for targeting, at times pits its incentives against the incentives of its two billion users, definitely creates distortions in the public sphere, and has all these harms for politics.

    So, rather than Mark Zuckerberg telling us something, the thing I really wish to see is our legislators and lawmakers stepping up and doing their job and bringing some oversight to this, so that Facebook's incentives are better aligned with our interests in a healthy public sphere.

  • William Brangham:

    So, you believe that the problem with Facebook is much broader than Cambridge Analytica and this most recent presidential election?

  • Zeynep Tufekci:

    Oh, absolutely. Absolutely.

    And, in fact, Cambridge Analytica is barely a problem, considering everything that's been going on. I first wrote about the dangers of misinformation on Facebook in 2012, when there was no word of Cambridge Analytica, because it was already evident then that things could be targeted in ways that were misleading to people.

    And it was already obvious that the vast hoarding of data, when you collect this much data, and then sell people's attention on the platform using this data to profile them to third parties, has all sorts of unhealthy distortions.

    So, whether or not Cambridge Analytica did anything with this data, even if Cambridge Analytica didn't exist and had never happened, the problem would remain.

    Facebook is a surveillance machine that is using this enormous amount of data it's collecting from not just its two billion users. It also creates shadow profiles of people who are not on the platform, and then uses all that data to infer things about us computationally, to figure us out, and then sells this kind of targeted access to whoever is paying it.

    So, that's unhealthy. So, Cambridge Analytica is not the core problem here. It has just allowed us to see how the machinery operates and to have a broader conversation about it.

  • William Brangham:

    So, let's talk specifics.

    Let's say — I mean, Facebook is not going anywhere, as you admit. And there are many great things about the platform itself.

  • Zeynep Tufekci:

    No, I use the platform all the time. I'm not telling people that it's not a good platform.

  • William Brangham:

    So, so specifically, though, what would you like to see changed, very specifically?

  • Zeynep Tufekci:

    I would like all data collection to be opt-in.

    I should affirmatively, actually consent to the way data is being collected about me. That data should only be collected, minimized to what's needed to function. It should be collected for whatever I want it to be collected for, rather than just harvested and then passed around and used for whatever.

    It should come with an expiration date. If I allow the company to have the data for a while, it should then disappear when my consent to its use is done.

    The thing I fear is that the testimony just turns into congressional spectacle, that lawmakers yell at Zuckerberg, and Mark Zuckerberg apologizes. It will feel cathartic, but that's not the problem. This isn't about personalities.

    And Facebook keeps saying it's an idealistic company. It's a giant company with half-a-trillion dollars in market capitalization. It's not about idealism. It's about protecting our data. It's about protecting our public sphere.

  • William Brangham:

    All right, Zeynep Tufekci, thank you so much.

  • Zeynep Tufekci:

    Thank you for inviting me.