
What’s the future of privacy in a big data world?

January 23, 2014 at 6:47 PM EST
Technologies that track data can make life more efficient, but can they go too far? Jeffrey Brown talks to technology and privacy experts Jules Polonetsky and Adam Thierer for more on why corporations should avoid being "creepy" and why it's important to empower consumers to hold companies and developers to strict standards.

TRANSCRIPT

JEFFREY BROWN: So how do we weigh the appeal of these devices against their potential to intrude into our lives?

We’re joined by Jules Polonetsky, executive director of the Future of Privacy Forum, a think tank that promotes responsible data practices, and Adam Thierer, senior research fellow with the Mercatus Center at George Mason University.

Welcome to both of you.

Mr. Polonetsky, let me start with you.

In a car, on a person, in your home, do you think people understand how much data about our lives is being collected? What else do we not know?

JULES POLONETSKY, Future of Privacy Forum: Clearly, most of us are excited about the latest feature.

We’re excited about the idea that cars could be safer if they’re aware of other cars on the road. We like the idea of having more power over our home environment and being able to automatically save money. But, clearly, every one of these new devices is powered by data. And it’s fair for us to scrutinize and hold the companies to a real strict standard.

As you collect our data to try to serve us better, how do we make sure that what you’re doing is for us and not something that is going to leave us discriminated against or narrowed in?

JEFFREY BROWN: All right, Adam Thierer, how do you think about all this information? Because, clearly, it has benefits for the people who perceive the benefits, but …

ADAM THIERER, George Mason University: Well, big data is the fuel that powers the information economy.

All of the wonderful sites and services and content that we enjoy today, much of it free of charge, is powered by data that’s collected, often to better advertise, but sometimes just to better tailor services to us.

The classic example would be Amazon’s ability to tailor what we might like based on past searches, or the wireless technologies in our phones, which enable various types of mapping or traffic services to better give us a feel for what’s happening out there in the world. These are services or conveniences that we now take for granted, but that are only possible because data is collected.

Of course, it could be true that some of that data can be misused and that some consumers might not be aware of how it is used, and we need to do a better job of educating them about this.

JEFFREY BROWN: Well, so where do they go too far? What worries you in the night?

JULES POLONETSKY: I’m worried about the security of some of these things, when you have got everything connected — your refrigerator. We heard about a refrigerator that was spamming people the other day. I now need an antivirus program for my refrigerator?

I think, when we do connect everything to the Internet and to other devices, we need to make sure that we do the work to lock these things down so that these devices talk to us or to each other, and not to strangers.

But, in addition to security, I think it’s fair to say, great, you’re helping us live a better life. In the U.K. right now, there is a huge debate, almost as big as the NSA debate here, as to whether or not the entire country’s health database can be used by researchers to try to come up with new cures and new treatments for diseases.

So, on one hand, that is exciting. Who knows what great breakthroughs…

JEFFREY BROWN: On the other hand…

JULES POLONETSKY: An entire country’s health information is sitting in one database. What kind of risk is that?

JEFFREY BROWN: Yes. Yes.

So how aware are companies of their responsibility? I mean, are there differences among companies, or among gadgets, in terms of their awareness of that responsibility and what they do with the information?

ADAM THIERER: I think, right now, there are differences.

I mean, bigger companies are starting to realize, because of the public pressure and also pressure from regulators, that they have to be better stewards of the data that they collect. The Federal Trade Commission here in the United States has already slapped a number of large digital companies like Google, Facebook, Twitter, Apple and others with various types of fines and other requirements to take better care of the data they collect.

So there are reputational effects associated with misuse of data. And we do need to do a better job of making sure that companies live up to the promises they make consumers. But, at the end of the day, we should make clear that we don’t want to have a sort of regulatory approach that stops these technologies and slows the sort of ability to innovate with data that powers our digital economy.

JEFFREY BROWN: All right. Well, that leads to this: would you like to see more regulation? Is there more that could be done by regulators?

JULES POLONETSKY: You know, I don’t want to see congressmen editing algorithms or privacy subject to the next budget sequester.

But I do think that government and advocates and media can give some real scrutiny and try to make sure that companies are putting the smart thinking that needs to happen in place. So it might be a little too early to think about laws that could restrict data innovation. But it is fair to ask companies as they go about the Internet of things to not be creepy.

JEFFREY BROWN: The Internet of things is referring to this — all these gadgets.

JULES POLONETSKY: Everything being connected.

JEFFREY BROWN: But to ask them to not be creepy, what does — I mean, what does that mean?

JULES POLONETSKY: Well, the president on Friday after he announced his NSA changes said, you know what, we need to look at big data, and we need to look at the private sector.

And so, over the next 90 days, the White House is going to be leading an effort to really work through some of these challenges: What are the benefits, what are the risks, and how do we decide what risks we want to take for what benefits? I’m optimistic that shining a light on this, some real transparency, is at the end of the day going to show us, and make us face, some hard decisions.

JEFFREY BROWN: And on the other side, what are you afraid of in terms of regulation? What would be lost if we start looking more closely at this data collection and transfer?

ADAM THIERER: If we spend all our time living in fear of hypothetical worst-case scenarios and basing public policy upon them, then best-case scenarios will never come about.

We have to understand that there’s going to be a need for a certain amount of social adaptation and changing privacy expectations around these new devices in our lives, because all these devices will be interconnected, will have sensors and cameras, and will be part of our lives from a very young age.

We do need to talk to people and to developers both about understanding good data practices, good data hygiene, if you will, and proper and improper uses of these technologies. And that is a conversation we are going to need to continue to have.

JEFFREY BROWN: How much do you sense — I will ask both of you, but start with you, Jules — the consumer awareness or even a backlash now, fueled in part by some of the NSA revelations we heard at the beginning of the program? How much do you sense that there is a backlash of concern over privacy?

JULES POLONETSKY: Well, certainly, Target’s sales were down over the holiday because of the big public awareness over their data breach.

Clearly, people have some sense of unease sometimes when they are on the Internet. Should they clear cookies? Who is tracking them? We don’t want that unease when it comes to driving a car, when it comes to your house. I think we need to do better in the Internet of things at making sure people feel empowered by the way that data is being used, that they are sure that it is being used for them, as opposed to companies doing things to them.

So, I don’t think we have seen a backlash yet. I think the NSA revelations have forced a lot of people to think a little bit harder. We have seen the increase in the number of people who use privacy tools or private search engines. But I think the big questions are still in front of us. And if companies are going to want to be intimate with us, they are going to need to be transparent with us.

JEFFREY BROWN: Adam Thierer?

ADAM THIERER: Well, we need to understand that privacy is a very subjective value, and that some people will be very sensitive about it, others not so much.

What we need to do is provide diverse tools to a diverse citizenry. We need to make sure that people who are highly privacy-sensitive have tools at their disposal, and they have many today that can block certain types of tracking technologies or location awareness technologies or whatever else, and that other people, if they are willing to, can trade off their privacy in exchange for more convenience, better services, cheaper goods, whatever it may be.

JEFFREY BROWN: Some of that is going to put more of the burden on the consumer, though, right?

ADAM THIERER: It will, but developers also need to be thinking about this. And we need to talk to them about the ethics in this regard.

JEFFREY BROWN: All right, continuing discussion.

Adam Thierer, Jules Polonetsky, thank you both very much.

ADAM THIERER: Thank you.

JULES POLONETSKY: Thanks for having us.