VPM News Focal Point
Digital Privacy | March 02, 2023
Season 2 Episode 2 | 26m 46s | Video has Closed Captions
Children’s privacy in the digital age, Facial recognition, Protecting intimate privacy
Can public schools protect children’s privacy in the digital age? How is law enforcement using facial recognition technology? A legal crusader combats online abuse and fights to protect intimate privacy.
VPM News Focal Point is a local public television program presented by VPM
ANGIE MILES: How private is your life?
Do you have a right to expect your most personal information belongs only to you?
We're talking about privacy, both online privacy and the potential use of technology to track your everyday life.
From targeted ads to facial recognition, from school data leaks to intimate details weaponized online, let's talk about digital privacy.
This is VPM News "Focal Point."
Production funding for VPM News Focal Point is provided by the estate of Mrs. Ann Lee Saunders Brown.
And by... ♪ ♪ ANGIE MILES: This is VPM News "Focal Point."
I'm Angie Miles.
We're going to talk about expectations of privacy versus what's happening in the world today.
Are you aware of how much of your personal life is available for others to see or to sell?
Do you have privacy rights?
And if so, are they enforced?
These are some of the questions we'll explore in this program.
Beginning with targeted ads.
They impact all Virginians.
What you type in your search bar and on your social media accounts, as well as your online activity, can lead to advertising tailored specifically to you.
So, what are your privacy rights in the digital sphere?
In Virginia, a new law went into effect earlier this year giving consumers some of their power back.
VPM News reporter Keyris Manzanares has more.
KEYRIS MANZANARES: In today's digital world, privacy is virtually non-existent.
Advertisers can track your activity and use your personal data for a range of things from marketing products to pushing political ads.
DAVE MARSDEN: The Consumer Data Protection Act was legislation that I introduced several years ago to start protecting people's data and their data privacy.
KEYRIS MANZANARES: That's Senator Dave Marsden of Fairfax, who championed Virginia's successful effort to pass a data privacy law.
DAVE MARSDEN: Quite simply, if you are getting targeted ads from a particular source, you can notify them, I no longer want them, I no longer desire to have them sent to me and they will purportedly stop.
KEYRIS MANZANARES: Virginia is only the second state to pass a data privacy law.
And what makes it powerful is that it's enforced by the state's attorney general and those protections extend to children, who are increasingly affected by online tracking.
TANIA FERNANDEZ: I believe that targeted ads are an issue for parents first because children shouldn't be profiled at such an early age.
I think that they should be protected by a certain amount of privacy that we're all entitled to but specifically minors.
KEYRIS MANZANARES: Tania Fernandez is a mother of two, ages thirteen and eight.
She says she's vigilant when it comes to their digital footprint.
TANIA FERNANDEZ: And we do need support from our government, from our community to say it's not okay for you to experiment with our children.
ANGIE MILES: Under the Consumer Data Protection Act, Virginians have the right to know what personal data is being collected or maintained.
Companies are now required to provide you with a copy of the information if you ask for it.
ANGIE MILES: We asked people of Virginia several questions related to online privacy.
Most told us they've had the experience of feeling their devices were tracking them.
Most said they don't believe people read or understand data privacy statements, and most said they feel they are required to share too much information online.
DAVID FENCER: One of the statistics I noticed is that, especially among young kids, 50%, one out of two, share personal information such as their schedule, their address, or where they go to school, and a lot of this is being picked up on by human traffickers, people that are looking to molest children, and so on.
KEVIN HARRELL: We have major data centers all around the country that just do nothing but collect information about us.
Our health, health status, and how much we have in the bank, and stuff like that, and they shouldn't know that.
ANGIE MILES: And a very specific form of data collection may be happening without your knowledge.
It's hard to hide your face from the world, and it's become a way to track you.
Virginia police have been using facial recognition technology to solve crimes for years, but now new guardrails have gone into place after privacy experts sounded the alarm.
VPM News producer Adrienne McGibbon takes a look at what these new rules could mean for your privacy.
ADRIENNE MCGIBBON: It is everywhere.
You use it to open your phone and to tag friends on social media.
It's even cropping up as you go through airport security.
Facial recognition technology has been around for decades, and in Virginia, it's become a tool of 21st century policing.
SCOTT SUROVELL: So, it's out there and being used right now commercially in a lot of ways, but when the government wants to use it to lock people up in jail or to create consequences for it, that's when we have to have some really tight standards on it to make sure it's used properly.
ADRIENNE MCGIBBON: In 2022, Virginia passed a law putting guardrails on police use of facial recognition.
The high-tech tool is an algorithm that compares facial images and determines if they're a possible match.
Senator Scott Surovell, a defense attorney who represents Fairfax County, introduced the bill which passed with bipartisan support.
SCOTT SUROVELL: In Virginia, we basically had zero standards on facial recognition technology.
ADRIENNE MCGIBBON: The law sets out 14 instances when police are permitted to use the technology, including identifying a suspect in a crime, a victim, or a missing person.
The law requires that any facial recognition software used in Virginia receive at least a 98% accuracy rate from the National Institute of Standards and Technology.
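To make the comparison step concrete, here is a minimal sketch of how this kind of matching is commonly done: a face-recognition model turns each photo into a numeric embedding vector, and two photos are flagged as a possible match when the vectors are similar enough. This is an illustration only; the toy vectors and the 0.6 threshold below are assumptions, not the software or settings Virginia agencies use, and the NIST 98% figure refers to benchmark accuracy, not to this similarity score.

import numpy as np

def cosine_similarity(a, b):
    # Similarity of two face-embedding vectors; values near 1.0 mean the faces look alike.
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def possible_match(probe_embedding, candidate_embedding, threshold=0.6):
    # Flag a pair of photos as a *possible* match for human review.
    # The 0.6 threshold is purely illustrative.
    return cosine_similarity(probe_embedding, candidate_embedding) >= threshold

# Toy vectors standing in for the output of a face-recognition model.
probe = [0.12, 0.88, 0.45, 0.31]      # e.g., a still from surveillance footage
candidate = [0.10, 0.90, 0.40, 0.35]  # e.g., a mugshot from a database
print(possible_match(probe, candidate))  # True -> an investigative lead, not an identification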
But some critics say these restrictions aren't enough.
ALISON POWERS: As I looked more into it, I became increasingly concerned about its effect on police investigations, and our clients specifically.
ADRIENNE MCGIBBON: Alison Powers is the director of policy at the Virginia Indigent Defense Commission, an agency that oversees the state's public defenders.
Powers says the law's permitted uses are too broad.
ALISON POWERS: It will sweep up clients of color, clients who are indigent, and people who typically are involved in property crimes and low level offenses; the things that are typically captured on video.
ADRIENNE MCGIBBON: Historically, facial recognition technology has disproportionately misidentified people with darker skin pigmentation.
Virginia's law attempts to address this disparity, but Powers says the accuracy testing doesn't hold up in the real world.
ALISON POWERS: Many algorithms get very high marks when you're comparing a mugshot photo to another mugshot photo.
The algorithms can perform very well on those.
When you change one of those photos to a more blurry photo like a kiosk photo, a surveillance photo, the accuracy levels drop significantly.
ADRIENNE MCGIBBON: There are no known cases in Virginia of police misidentifying a suspect with facial recognition.
But in the US, over the past few years, there have been at least four instances in which police improperly charged Black men after using the tool.
TRICIA POWERS: The algorithms have come a long way through the years.
They're much more accurate than they used to be, but they're not there yet.
ADRIENNE MCGIBBON: Virginia State Police has been using facial recognition for more than a decade.
It runs comparison photos through a database compiled entirely of mugshots.
(car cruising by) Colonel Tricia Powers, who oversees VSP's use of facial recognition, says this technology should just be part of an investigation.
TRICIA POWERS: It's not designed to be an exact match like a fingerprint is an exact match for an individual.
ADRIENNE MCGIBBON: Colonel Powers says problems can arise when police don't understand the technology.
TRICIA POWERS: You know, sometimes, enforcement action was taken when it shouldn't have been.
ADRIENNE MCGIBBON: While Attorney Alison Powers says databases like the state police's are the best option, she worries about the implications if the technology becomes more widespread.
ALISON POWERS: We'll gather more data and stories of how the technology itself is flawed because it's human made.
SCOTT SUROVELL: There's pluses and minuses to all these types of technologies, and what's important is that we have clear standards so that, you know, people just can't use 'em whenever and however they feel like using 'em, which is what this bill does.
ANGIE MILES: The new law requires local police departments to provide public notification on their website if they are using facial recognition technology and annual disclosures about how it's being used.
It also prohibits police from tracking people in real time.
VPM News Focal Point is interested in the points of view of Virginians.
To hear more from your Virginia neighbors, and to share your own thoughts and story ideas, find us online at vpm.org/focalpoint.
ANGIE MILES: When the Los Angeles Unified School District announced last September that a data breach had occurred, it was cause for alarm among students and parents.
Since then, we've learned that as many as 2,000 sensitive student records, including mental health files, were leaked on the dark web.
This California calamity may seem a world away for Virginians, but Virginia schools are not immune to data hacking.
And now experts are raising new warnings about just how unprotected school-related information can be.
From data hacking to online tracking, experts are outlining the scope of the problem and are suggesting ways for students, staff, and communities to be safer.
ROYAL GURLEY: I believe at the core of what we are required to do is to provide students with this world-class education, and times have evolved.
ANGIE MILES: The head of Charlottesville City Schools understands how crucial technology is for classroom learning.
ROYAL GURLEY: I grew up in a time where, you know, we instructed with paper and pencil and now, you know, students rarely have access to a book because they get that via technology.
And so the balancing act now is how do you do that and do it safely?
ANGIE MILES: With a degree in technology, Superintendent Royal Gurley also understands the challenges of keeping student data safe.
Charlottesville has managed to avoid data breaches like the one that impacted Los Angeles city schools and leaked sensitive student information onto the dark web.
But many Virginia school divisions have had their technology systems compromised, with data stolen and held for ransom.
Since 2015, cyber criminals have hit Virginia universities and large school divisions like Fairfax and Arlington, as well as smaller districts like Greensville and Smyth counties.
The Federal Bureau of Investigation says to schools- CHRISTOPHER COPE: Understand that you are a target.
Your networks are a target, your school is a target, and that bad actors are trying to access all of that in order to find sensitive information, steal it, and profit.
ANGIE MILES: In an attempt to profit, these bad actors infiltrate vulnerable systems, encrypt the data so schools can't access it without a decryption key, and then blackmail divisions, threatening to release private data such as student ID numbers, social security numbers, birthdays, home addresses, test scores, mental health records, salaries, or other protected employment information unless the division pays a ransom.
CHRISTOPHER COPE: Anybody that has a laptop that connects remotely could potentially be compromised.
The sooner school districts train their students, and the more frequently they train their staff, on cybersecurity risks, the better off they will be.
You know, even if they don't quite have the budget or the ability to lock down those networks.
ANGIE MILES: Profit is the incentive, but experience shows it's not just criminals who are motivated by profit.
Dorothy Rice is a retired Richmond educator with a daughter who has also become a teacher.
DOROTHY RICE: And she's always going into her pocketbook to provide for her students.
ANGIE MILES: The fact that many teachers are so committed to finding what will make learning more fun and more effective makes free resources like free apps enticing.
DOROTHY RICE: And I'm not so sure teachers are really scrutinizing it because as long as they are getting a useful product for their students to use, they probably don't really delve into any sinister notions.
ROYAL GURLEY: Nothing is free.
And so I think we've done a very good job through our professional development and say, hey, if you see something you'd like, let's run it through technology.
Let your principal know.
Let's do it the right way.
Because there's so many unintended consequences.
ANGIE MILES: It's the unintended consequences that concern Amanda Kozak.
The Powhatan mom has homeschooled her children since the start of the pandemic.
She's been studying data privacy issues and she says parents should be paying much more attention.
AMANDA KOZAK: Yeah, so as a parent, there are a lot of concerns that I have from a data perspective.
You know, a lot of this data collection starts when a child is in pre-K. Everything from their attendance to their test scores, their behavioral health.
All of that is collected and stored and shared in the cloud, but we don't know who has access to it.
You know, from a third party perspective, how this information can be used later on down the road.
ANGIE MILES: Lisa LeVasseur says Kozak is correct.
Her nonprofit, Internet Safety Labs, is on a mission.
The technology watchdog organization recently tested apps for more than 600 schools, including 13 in Virginia, monitoring the flow of information that leaves the school networks.
LISA LeVASSEUR: We tested around, I think it was over 1,300 apps, and we looked at the data flow.
We looked at the behavior of the apps.
The result was that 78% of those apps we scored as a "do not use" because they were sending data to entities that monetize data, either advertising or analytics.
The next bucket down, the high risk bucket, 18% of the apps were in there.
So only 4% of the apps were reasonably safe.
ANGIE MILES: By their tally, 96% of education apps that are either required or recommended by schools are passing data directly or indirectly to others, including big tech companies like Google, Apple, and Facebook.
LeVasseur says schools, like the general public, are mostly unaware of how the data is being used, partially because of the vetting process itself, which often relies on self-reporting of safety measures by vendors, who may be looking to monetize data or may not be aware of what is actually happening to it.
LISA LeVASSEUR: The behavior needs to be analyzed.
It's one thing to look at the privacy policy or what vendors are saying that the product is doing.
It's quite another to look at the traffic, to watch the network traffic from the phone into the cloud, and see exactly what is happening and where information is going to.
We think the proof is in the pudding with that, and that's the kind of analysis that we do.
We look at the behavior of the apps themselves.
I think that there are some ed tech vendors who work really hard to do it right and still fail.
And then I think there are some that are maybe not as transparent as they could be.
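As a rough illustration of the kind of behavioral testing described above, watching where an app's network traffic actually goes, the sketch below reads a captured traffic log in HAR format (which browsers and proxy tools can export) and flags requests sent to advertising or analytics hosts. The file name and the short domain list are hypothetical placeholders, not Internet Safety Labs' actual tooling or blocklist.

import json
from urllib.parse import urlparse

# Hypothetical sample of ad/analytics hosts; a real audit would use a maintained blocklist.
TRACKER_DOMAINS = {"doubleclick.net", "google-analytics.com", "graph.facebook.com"}

def flag_tracker_requests(har_path):
    # A HAR file records every HTTP request the app or page made during a session.
    with open(har_path) as f:
        har = json.load(f)
    flagged = []
    for entry in har["log"]["entries"]:
        host = urlparse(entry["request"]["url"]).hostname or ""
        if any(host == d or host.endswith("." + d) for d in TRACKER_DOMAINS):
            flagged.append(entry["request"]["url"])
    return flagged

# Usage with a hypothetical capture of a homework app's traffic:
# for url in flag_tracker_requests("homework_app_session.har"):
#     print("data sent to a monetizing third party:", url)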
DOROTHY RICE: By virtue of calling them vendors, they're capitalists.
They want to make money and they're looking at those students as potential customers, clients, or whatever.
They can mine that information.
ANGIE MILES: Internet Safety Labs says that yes, the data might be used for targeted ads in the future or to predict students' later behaviors or risk factors in ways that might benefit retailers or insurers, or might negatively impact those targeted.
LISA LeVASSEUR: We see direct pipelines between school technology and law enforcement, including emotional and social assessments and things like that, and that's deeply troubling, deeply troubling.
Especially for communities of color because it's just a self-fulfilling machine with all of the implicit bias.
AMANDA KOZAK: This is an issue that impacts everyone.
It doesn't matter how you vote.
It doesn't matter what your background is like.
This is an issue that's impacting every child right now within, not only our public school system, but also private schools and homeschools as well.
ANGIE MILES: Meanwhile, who is ultimately responsible for where the data goes?
LeVasseur says the burden of guaranteeing student data safety is currently with the schools and with the parents, but she says it should not be that way.
LISA LeVASSEUR: It's as if we're asking people when they buy a car to install their own airbags and their own safety belts and their own safety equipment.
That's where we are with software, and connected software today, and that is not right.
Nobody should have to do that.
You should be able to get a reasonably safe product directly from the manufacturer and trust that it is safe.
ANGIE MILES: "Focal Point" reached out to some of the biggest educational technology providers in the United States, and thus far we have no response from any of them.
In May of 2022, the Federal Trade Commission issued a policy statement warning that it would crack down on tech companies that unlawfully surveil children while they do their schoolwork.
Internet Safety Labs plans to make public detailed findings of the data dissemination practices of more than a thousand education apps by May of this year.
And we'd like to disclose that the producers of the story have family members who work in some capacity for Charlottesville Public Schools.
Those relationships have no bearing on this report.
ANGIE MILES: Virginians are getting more control over their personal data.
The state is one of the first in the country to put limits on the type of information a company can collect and store about you.
But some say this new law puts the burden on consumers to protect themselves.
Virginia Tech professor and advocate Irene Leech is joining us to explain how to optimize your privacy and opt out of data collection.
Thanks for being with us, Professor Leech.
IRENE LEECH: Delighted to be here.
Consumers really need to know about this.
ANGIE MILES: You have been an advocate for quite a while, campaigning for protections like the ones espoused in this law.
Why is there such a need and why do you think it's taken such a long time to get rules in place?
IRENE LEECH: Data is being collected every day in every function that we are involved in and so far, the United States has failed to put reasonable guardrails up to protect and make sure that consumers do own our own data and that we're not taken advantage of.
ANGIE MILES: What's wrong with the law as it's written?
IRENE LEECH: The biggest problem is that the onus is completely on consumers, and we are required to go to each entity with which we do business, follow the procedures that they have put in place (there are no standards), and work with them to get what we want.
ANGIE MILES: What are some of those concrete things that people can do even if the companies themselves are still holding a lot of power in what they can collect?
IRENE LEECH: It's possible to start by taping over the camera when you aren't using it, and being sure that you know what all the controls on your devices are and how to turn them on and off.
If we were a little less free with providing information, particularly sensitive information about gender and age and income and that kind of thing, if we question that more often, I think that could make a difference.
You can watch the full interview on our website.
ANGIE MILES: Since 1996, our health records have been protected by a federal privacy law known as HIPAA.
Doctors are generally not allowed to disclose our health data without our permission.
Yet, much of our personal information is available on internet-connected devices, and there's growing concern about who should have access to that.
Last month, Governor Glenn Youngkin helped defeat a bill that would've barred law enforcement from accessing menstrual data from tracking apps.
VPM News senior producer Roberta Oster introduces us to a University of Virginia law professor leading a national movement to protect intimate privacy.
(background street noise) DANIELLE CITRON: Young people actually want privacy, they expect privacy.
LAURA FAAS: I would say that digital privacy is critical today because everything is digital.
DANIELLE CITRON: I'm just trying to think of all the ways that we share information about our innermost fantasies, our sex lives, our activities.
And that includes the Alexa in the kitchen.
It's the period tracking app that you have on your phone that's really helpful to chart out when, you know, you're going to ovulate and have your period.
It's the dating apps that so many people use, young, old.
Dating apps that collect an enormous amount of information.
And you look for diseases and chronic conditions on WebMD.
All of that information, your searches, your browsing, your geolocation collected from your apps on your phone, all of that put together tells a story about your intimate life.
There are three sort of central actors that worry me and that implicate our intimate privacy.
And that's the corporate surveillance of intimate life.
It's individuals surveilling and exploiting intimate privacy and it's governments invading intimate privacy.
When it comes to the corporate surveillance of intimate life, we know that like one third of all women and girls use some type of health app.
And those health apps gather a tremendous amount of intimate data that is then shared and sold and sold onward and then sold to data brokers.
If you read the privacy policies on each and every one of the things I've just talked about, most often they're saying we're sharing that information with third parties and those third parties often include advertisers and marketers.
Companies may say to us, the information is anonymized, but it's often accompanied with your mobile device ID number, which is your name.
In a state in which abortion is criminalized, that state's abortion law might criminalize the actions of doctors, as well as anyone helping a person who gets an abortion.
And it could also include the pregnant person.
And if we've taken our phone to the clinic with us, it will provide circumstantial evidence of an abortion.
So all of those data points help enable law enforcement to get a warrant, and then a warrant to get our phone, which can then be used as a predicate to arrest a physician or a pregnant individual who's terminated a pregnancy.
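A simplified, entirely invented example of the re-identification point Citron makes about device IDs: two datasets that each look "anonymous" on their own can be joined on a shared mobile advertising ID, tying a sensitive record back to a named person. Every name, ID, and record below is hypothetical.

# Hypothetical "anonymized" records from a health app: no names, just an advertising ID.
health_app_records = [
    {"ad_id": "A1B2-C3D4", "event": "period_logged", "date": "2023-02-14"},
]

# A data broker's file linking that same advertising ID to identity details.
broker_records = {
    "A1B2-C3D4": {"name": "Jane Doe", "home_zip": "23220"},
}

# Joining on the shared ID re-identifies the supposedly anonymous record.
for record in health_app_records:
    identity = broker_records.get(record["ad_id"])
    if identity:
        print(identity["name"], "->", record["event"], "on", record["date"])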
LAURA FAAS: As Professor Citron has said, and as I've learned from her, we tend to think of privacy in these silos.
And we have the law HIPAA that protects health data in certain contexts, but in much more limited contexts than a lot of people think.
And we also have some consumer protection privacy laws.
But we don't have a comprehensive piece of legislation that protects privacy in all respects.
And I think that that is the goal.
DANIELLE CITRON: What are the protections that we have in Virginia around intimate privacy?
And the answer is woefully inadequate.
Because Virginia lacks a comprehensive approach to intimate privacy.
We all deserve intimate privacy.
We all want it, we all expect it and we all deserve it.
And so I think we need to bring kind of our moral chops to the table and say, this is for all of us and stop it.
ANGIE MILES: Professor Citron recently published "The Fight for Privacy," a book that explores this issue and your legal rights.
Learn more about our stories at vpm.org/focalpoint.
There, you'll also find links to our full interview with consumer advocate, Irene Leech, and information on how to protect your privacy.
We invite you to share your story ideas with us, and we thank you for watching.
We'll see you next time.
Production funding for VPM News Focal Point is provided by the estate of Mrs. Ann Lee Saunders Brown.
And by... ♪ ♪
Digital Privacy | People of Virginia
Clip: S2 Ep2 | 42s | How much personal information are people required to share online?
Clip: S2 Ep2 | 4m 15s | A new Virginia law puts guardrails on the 21st century tech, but some say it's not enough.
Clip: S2 Ep2 | 7m 44s | Hackers pose threats to students' data, but what about the software that schools trust?
Impacts of Virginia's Consumer Data Protection Act
Clip: S2 Ep2 | 1m 43s | The Virginia Consumer Data Protection Act gives Virginians the power to protect their data.
Clip: S2 Ep2 | 10m 16s | Irene Leech discusses the Virginia Consumer Data Protection Act.
Intimate privacy: the fight for cyber civil rights
Clip: S2 Ep2 | 3m 29s | Intimate privacy: the fight for cyber civil rights.