Facebook Changes Privacy Policy After Pushback from Users

May 26, 2010 at 12:00 AM EDT
Facebook changed its privacy controls after users protested that their information was being made public. Jeffrey Brown looks at the growing pressure to safeguard user information online.

TRANSCRIPT

JUDY WOODRUFF: Now: big changes in privacy controls on Facebook.

“NewsHour” correspondent Spencer Michels begins with this report from Palo Alto, California, where the social media giant made its announcement this afternoon.

SPENCER MICHELS: When people sign up for Facebook, it’s usually to connect via the computer with old and new friends: high school buddies, former workmates, long-lost relatives, and recent acquaintances.

But Facebook is far more than a digital phonebook. Billions of pieces of personal information are posted on the site: likes, dislikes, social doings, photos of everything (the good, the bad, the embarrassing), and streaming videos.

With some 400 million users and counting, six-year-old Facebook has grown into the largest social networking site, with sharing amongst friends and whole groups. If measured by population, it could rank as the world’s third largest country.

It’s free. The company makes its money from ads. Of late, several media outlets have alleged that Facebook has shared data on its users with advertisers, and that has provoked a storm of questions, including who controls that data and whether that data is being used for commercial purposes.

Today, Facebook responded to consumer and government pressure, announcing new privacy tools intended to make it easier for users to specify who can see their information, as well as whether Facebook can share that information with other Web sites or commercial interests.

Twenty-five-year-old CEO Mark Zuckerberg, who founded Facebook with friends in his Harvard dorm room, said that the new procedures were a response to complaints. Product manager Chris Cox explained to us how a very complicated procedure for restricting access to information has been drastically simplified.

CHRIS COX, vice president of product, Facebook: Our controls were too complicated in the minds of a lot of our users. And so what we set out to do, in response to a lot of the feedback we have been getting, is just to make it really, really easy for people to control the information they put into Facebook. There were a lot of different concerns that were aired out at the same time in the past couple weeks.

And, so, what we have tried to do is address a lot of them. There’s a simple switch that allows people to not share any of their stuff with third parties. There’s a simple control that allows people to lock down the content they post to just friends.

SPENCER MICHELS: These new rules of the digital road were unveiled amid growing consumer unrest that Facebook has not only been too loose with information about its users, but has also created complicated and confusing privacy controls, putting the onus on consumers, not all of them technical wizards, to protect their own data.

To adjust how much information is accessible to anyone, Facebook users currently are required to navigate through 50 settings with more than 170 different options. Meanwhile, Facebook, MySpace, and several other social networking sites were found to have sent data to advertising companies that could be used to find consumers’ names and other personal details.

Facebook’s Chris Cox denied the company shared any information with advertisers.

CHRIS COX: We don’t share any information with advertisers about our users, and we never will.

SPENCER MICHELS: Does that mean the advertisers can’t get that information?

CHRIS COX: It does. Advertisers can control — they can say, I want to talk to people who are in Berkeley that like reggae music. And then we will take the ad, and we will show it to people who are in Berkeley that have listed that they like reggae, but we don’t give the advertisers information about our users.

SPENCER MICHELS: Facebook launched a product which automatically shared users’ profile information with other sites, unless a consumer opted out.

Last month, New York Senator Charles Schumer said he wants tougher consumer privacy laws and asked the Federal Trade Commission for an opt-in requirement, rather than opt-out only.

SEN. CHARLES SCHUMER, D-N.Y.: Social networking sites have become the Wild West of the Internet. Users need to have the ability to control private information and fully understand how it’s being used.

SPENCER MICHELS: As Facebook tries to make amends, some consumers are taking matters into their own hands.

One group organized online is encouraging users to delete their Facebook accounts en masse on May 31. Many people have said they will do just that, but, meanwhile, more people continue to sign up for the site.

JIM LEHRER: Jeffrey Brown has a closer look at the issues raised by the Facebook case.

JEFFREY BROWN: And, for that, I’m joined by Kevin Bankston, a senior staff attorney with the Electronic Frontier Foundation, an Internet privacy group, and Peter Cashmore, founder and CEO of the popular social media news blog Mashable.com.

Kevin Bankston, before we assess today’s action by Facebook, help us frame the privacy question at issue when someone uses Facebook or another social media site. How would you define it?

KEVIN BANKSTON, senior staff attorney, Electronic Frontier Foundation: We think, at the Electronic Frontier Foundation, that the key question is, does the user of the social network have complete control over how all of their information is shared?

Facebook, which originally started as a good place to connect with your friends and family, has recently been pushing its users and in some cases forcing its users to make more and more of their information public, when they used to be able to keep it private.

And so today’s changes are a reaction to the uproar over those changes, and have given users some new control over how their information is shared. And we think the changes are a good start, but we think there’s more work to be done when it comes to giving users control over their own information, as Facebook itself states is one of its principles.

JEFFREY BROWN: All right, let me come back to that, but bring in Peter Cashmore, because, on this question of how you define the issues, hundreds of millions of people have made it a normal part of their lives to share information.

So, how do you define the sharing-vs.-privacy equation?

PETE CASHMORE, founder and CEO, Mashable.com: So, I think there’s a wider trend here that Facebook is tapping into, that there is more sharing going on on the Web. Sites like Twitter have encouraged public sharing as the default.

And Facebook is really trying to compete with these sites and also be part of that wider movement, but there’s also that — that gap between user expectations — they expect Facebook to be a private site — and the direction in which the Web is going.

JEFFREY BROWN: So, were they — staying with you, what did you make of today’s announcement?

PETE CASHMORE: I agree that the announcement was a good step. It was a step in the right direction. It greatly simplifies the controls.

But I also think there’s a little bit of misdirection going on here. The instant personalization, which is the product that automatically shares information with selected third parties, is still an opt-out system. So, you’re still opted into that, at least according to the announcements today.

JEFFREY BROWN: Peter, explain — explain that instant personalization, what that means.

PETE CASHMORE: So, the instant personalization piece is essentially that Facebook has chosen some sites, the music site Pandora and the review site Yelp, where, if you visit those sites, information that you have already chosen to make public will be sent to those sites.

So, Facebook’s argument is, if you have already set it to be public, it’s not really a problem to be sharing that again with some trusted partners the second you log on to their sites.

JEFFREY BROWN: So, Kevin Bankston, you started talking about today’s announcement. You think it’s a good step, but — but what?

KEVIN BANKSTON: Well, they made a few important changes today, rolling back some of their worst privacy missteps in the past few months.

First off, last month, they forced users to go through a transition where their likes and their dislikes, the places they worked and the schools they went to, which previously could be set to private, had to be made public, or they would be deleted.

Facebook has wisely, I think, stepped back from that position, and has restored true privacy controls for that information, so that you can set it to be private again.

Facebook has also restored an ability that it took away back in December, whereby users can, if they choose, opt out of any sharing of their information over what’s called the Facebook platform, the platform on which all the applications that people use on Facebook are built.

JEFFREY BROWN: Peter…

KEVIN BANKSTON: And this is an important…

JEFFREY BROWN: Go ahead.

KEVIN BANKSTON: … privacy step.

JEFFREY BROWN: Well, Peter Cashmore, I’m wondering, do people know — what do we know about the psychology and the understanding of users? Do people know about the flow of information? And do they care?

PETE CASHMORE: I think the simple answer is, they don’t know. There was reference to a change in December where, essentially, Facebook came up with this dialogue box that said, hey, we’re updating our privacy settings. Do you want to agree to these settings?

Now, the settings that Facebook had chosen by default were basically sharing a lot more than you might be comfortable with. So, I think Facebook has somewhat pushed users to be more open without their full understanding. I don’t think there’s much evidence that users fully understand these settings.

JEFFREY BROWN: And does anyone fully understand the flow of information? I mean, do you? In our setup with Spencer Michels, we were talking about where information goes, who it goes to, and whether it goes to advertisers.

How much is really known about all that?

PETE CASHMORE: It’s an extremely complex issue, even for technologists who are covering this space. I think the simple rule is, if you don’t want it to be shared, don’t put it online.

But Facebook is doing a good job of simplifying the settings and making it a little bit more understandable to the average user, but there’s no doubt that this is a really complex issue and that understanding where your information is going is very challenging.

JEFFREY BROWN: Kevin Bankston, what do you — what can you tell us about the flow of information?

KEVIN BANKSTON: I think the most important thing for people to understand on Facebook, and something that wasn’t addressed by today’s changes, is that, right now, there’s something we in the privacy community call the app gap.

That’s the privacy problem whereby, even if you aren’t choosing to use a Facebook application or a Facebook-connected Web site, if one of your friends uses it, then that Web site or application is automatically going to get your information.

And, so, we think that users should have complete control over their information, so that they can block all of the hundreds of thousands of Facebook-connected applications and Web sites, and only approve sharing with the applications they want to use.

But, right now, Facebook is posing users a choice. You can either opt out completely from that sharing and not use any applications, or, if you want to play Scrabble with your mom or face off in the Mafia Wars game with your friends, your privacy is automatically at the mercy of your friends. And whatever of the hundreds of thousands of other apps your friends choose to play or install, those apps are going to get your information, even if you never use them.

JEFFREY BROWN: But, Kevin, how much — how much…

KEVIN BANKSTON: And so that’s…

JEFFREY BROWN: I’m sorry. I’m sorry for interrupting, but I’m wondering, how much do people really care about all this, about their privacy?

I mean, we — we said that there is a backlash against Facebook. This is sort of in the air right now. On the other hand, hundreds of millions of people are doing it. Are there evolving standards of privacy for people, especially younger people who have grown up doing this? How much do you sense that people really care about this and what they’re willing to do about it?

KEVIN BANKSTON: I think they care a lot.

Certainly, people, as they are being given new tools for publishing online, are becoming more comfortable publishing more online, including to a broad audience. However, that doesn’t mean they don’t also want control over what information is being published to people other than their friends or family.

Indeed, recent research here in the Bay Area at the Berkeley Law School found that younger Internet users are more comfortable sharing more information, but they’re just as uncomfortable as older users when information is shared outside of their control. So, the key really is control.

And we’re pro-free speech, and we think it’s a great thing that more people are able to speak online, but we think they should only have to speak when they want to speak, rather than having Facebook or any other company decide what information to share about them.

JEFFREY BROWN: All right, let me give a last word to Peter Cashmore.

What do you think about this evolving standard of privacy?

PETE CASHMORE: I think that was a really correct summary there.

I think what’s happening on the Web is that, yes, young people are becoming more comfortable with sharing more things. But the way that Facebook has implemented these features has generally been opt-out, or at least it has pushed users beyond what they fully understand.

I think Facebook would do a better job of perhaps highlighting the advantages of sharing. If you share your likes, you might be able to meet up with people in your local area who have similar interests, for instance. And I think Facebook would be better to sell users on that vision, and then encourage them to opt in to that vision.

JEFFREY BROWN: All right.

Peter Cashmore and Kevin Bankston, thank you both very much.