Facebook founder and CEO Mark Zuckerberg opened his testimony Tuesday on Capitol Hill with an apology — one that he’s repeated often since first acknowledging the firm Cambridge Analytica had improperly used personal information from as many as 87 million of the tech giant’s users: “I started Facebook, I run it, and I’m responsible for what happens here.”
But that wasn’t good enough for many of the 44 senators who grilled Zuckerberg during a five-hour hearing on how Facebook handles user privacy. The lawmakers, from the Senate Judiciary and Commerce Committees, focused on the company’s knowledge of its users, where and how that information was shared, whether its practices were legal, and what Facebook was doing to prevent a massive data breach from happening again.
Sen. Lindsey Graham, R-S.C., suggested Facebook was a monopoly, one that may need to have more federal oversight. Sen. Dick Durbin, D-Ill., asked Zuckerberg whether he’d be comfortable sharing what hotel he stayed in last night. When Zuckerberg, after a long, nervous pause, said “no,” Durbin pointed out that that’s exactly the kind of user information his company shares with advertisers.
“Your user agreement sucks,” Sen. John Kennedy, R-La., told Zuckerberg in a pointed exchange. “It’s about covering Facebook’s rear end,” not protecting users’ rights. Kennedy added: “You know that and I know that. I’m going to suggest you go home and rewrite it.”
Here’s what we learned from Zuckerberg’s testimony — and what we still don’t know.
We still don’t know what Facebook knew about the Cambridge Analytica data breach and when exactly it happened, said Franklin Foer, a staff writer for “The Atlantic” and author of “World Without Mind: The Existential Threat of Big Tech.” But we do know that Facebook didn’t tell users right away, and that it didn’t report it to the Federal Trade Commission, Foer told the PBS NewsHour’s Amna Nawaz.
It’s clear that users consent to data sharing. But do they understand it? “I really doubt it,” says Nina Jankowicz, a global fellow at the Wilson Center’s Kennan Institute. Zuckerberg said several times Tuesday that users choose which data they share, but that’s a little deceptive, she explained. “What Zuckerberg is referring to is the physical act of consent, that is, clicking yes in order to access the platform or parts of it, or consenting to certain default privacy settings by not changing them. This isn’t really meaningful, as he himself admitted on multiple occasions today,” Jankowicz said. If users were actually presented with better details of what they were giving away, they’d be far less likely to do it, she argued. Clear and frequent explanations of how data is used — and what other apps might be able to access it — are key to solving the problem, Jankowicz added, but that’s not happening at the moment.
Zuckerberg acknowledged that Facebook in some ways “is a publisher, that it has responsibilities for the content that it publishes,” Foer said — a far cry from the Zuckerberg of six months ago, who said Facebook was not a media company and was not responsible for the content of its users.
Zuckerberg only paid lip service to “news literacy.” “This is more than just a buzzword,” said Jankowicz, who said she was surprised the CEO didn’t detail Facebook’s investment in separating sources that are reliable from those that are not, such as its recent fact-checking initiative. “This is a long-term project that will take generations to bear fruit, and Facebook has a unique role to play in developing it as a skill with its ubiquitous access to everyone’s lives. How will it use it?”
The hearing focused much more on data protection and the Cambridge Analytica case than it did on bad actors’ interference in the elections. That’s troubling, Jankowicz said. “I was hoping Zuckerberg would shed some more light on Facebook’s about-face regarding the role Russian disinformation played in shifting the discourse surrounding the 2016 election.”
Zuckerberg said that Facebook will be taking a more active role in monitoring and policing problematic content, with user-reported flags and, in the future, more AI technology that can catch inappropriate online behavior. But those tools may not be enough “for the most delicate and gray issues in our political discourse,” Jankowicz said. Zuckerberg stopped short of defining hate speech and struggled to answer questions from senators who asked him to draw lines between it and political activism. “I believe the best thing Facebook could do would be to spell out the types of content that are and are not permitted on the platform in plain English, and then vigorously enforce these standards,” Jankowicz said.
Congress is asking the right questions. But they must also produce the right answers, said Sam Lester, the consumer privacy counsel at the advocacy group Electronic Privacy Information Center. “It is not up to Mark Zuckerberg to fix this problem — Congress must pass comprehensive privacy legislation,” Lester said. “The United States lags behind Europe in privacy protection, and we are at a critical moment now. We are optimistic that today’s hearing will lead to action.”
But even Mark Zuckerberg at times seemed not to understand how his platform works, Lester said. Still, “you don’t need a PhD in computer science to understand that Internet users need greater control over and protection of their data. And agencies exist precisely to provide expertise. For instance, Congress doesn’t need to understand all the science behind pharmaceutical drugs — we have an FDA for that purpose.”
A key question going forward: Have Facebook’s actions complied with a 2011 consent order from the Federal Trade Commission on how it communicates with users? If not, why hasn’t the FTC taken action? In 2009 and 2010, the Electronic Privacy Information Center and other organizations approached the FTC, claiming Facebook transferred users’ personal data to application developers without their knowledge or consent. That complaint led in 2011 to a 20-year consent decree that requires Facebook to “obtain consumers’ affirmative express consent before enacting changes that override their privacy preferences.” Lester and EPIC argue that Facebook has violated that order by failing to prevent incidents like the Cambridge Analytica scandal. Facebook never got affirmative express consent for allowing apps to access users’ friends’ data, Lester argued. Senators focused on this issue during Tuesday’s hearing, signaling there may be more inquiries into what Facebook — and the FTC — should have done better.
Zuckerberg said he recognized that the company was likely going “to be regulated by government, and that regulation might ultimately be good for Facebook and Facebook’s users,” Foer said. Zuckerberg’s game, Foer added, is “trying to preserve Facebook’s dominant position, its monopoly. He would rather have this discussion be about the prospect of the regulation, rather than the prospect of having a discussion about antitrust and about breaking up the company.”
There’s a systemic flaw in Facebook’s model that Zuckerberg didn’t fully address: The company is pledging to protect user data, but has built a business on profiting from it, Foer said. “They’re always going to be collecting as much data as possible in order to keep people’s attention for as long as possible, and then to also allow advertisers to target that attention as much as possible,” Foer said. “So I don’t see how they are ever going to escape this kind of arms race that they’re in, where they are going to have to keep pushing the boundaries of surveillance on their users.”
Where do we go from here? We shouldn’t expect any immediate action, Jankowicz said. But there are a few things to watch: the Honest Ads Act, which has been backed by both Facebook and Twitter and would require more transparency for political ads online, and more tweaks to the platform, such as the additional privacy features Facebook has added over the past week. Congress will also be watching as the new European Union data law — which gives users more control of their data, particularly around consent — comes into effect next month, Jankowicz said. At some point, she added, “we will see some attempts at similar data privacy legislation in the U.S.”
The PBS NewsHour’s Joshua Barajas reported for this story.