It is a deep dive into the inner workings of Facebook.
A Wall Street Journal investigation out this week alleges deceit and dangers at the social media giant. Among its conclusions? Facebook's platforms are — quote — "riddled with flaws that cause harm, often in ways only the company fully understands."
John Yang has more.
Judy, the series called "The Facebook Files" is based on The Journal's review of internal company documents.
The stories highlight the ways in which Facebook handles or doesn't handle a range of issues across its vast digital empire, from the negative effects of Instagram on young people to misinformation and violent content.
Jeff Horwitz is the lead reporter on the series and joins us now.
Jeff, thanks for being with us.
You have got four installments that have published so far. You have a fifth coming. Is there a common thread or a common takeaway to all these stories that you're finding?
Jeff Horwitz, The Wall Street Journal:
There are probably a couple.
One is that Facebook has, in the last few years, come to understand, through significant research, what its effects are on society. And they turn out to be pretty grave in some instances. And it is not all good news they're finding. In fact, a lot of it is very bad.
And how the company has done that work and responded, or more often than not failed to respond to it, is, I think, a very important thing. And then there is also an element of this where Facebook just appears to have a very hard time managing itself, keeping attention on problems long enough to fix them, and actually adhering to its own rules internally.
And a lot of these issues, when they arise and are talked about, the — Facebook's leaders, Mark Zuckerberg and others, often, to me, seem surprised that this is an issue.
But you're finding otherwise.
Yes, it is a strange thing, given that the company just doesn't seem to expect misuse of its platform, even though the people who misuse it should have trained the company to expect it by now.
So, one of the stories is Facebook's failure to address human trafficking and cartel violence on its platform. They really turn out to sort of have a hard time focusing on this and putting resources in. And every once in a while, it sort of suddenly pops up, and it is very embarrassing, and they have to run around.
But they don't ever sort of manage to get it done on a day-to-day basis, if that makes sense.
And on that issue of drug cartels, human trafficking, Facebook provided us a statement that says: "In countries at risk for conflict and violence, we have a comprehensive strategy, including relying on global teams with native speakers covering over 50 languages, educational resources and partnerships with local experts and third-party fact-checkers to keep people safe."
But, as I read your story, they — employees may be flagging things, but there isn't necessarily reaction within the company.
So, to start off with, Facebook offers its services in over 100 languages. So the 50 languages might not be as impressive as one would hope there. But, yes, I think there are a whole bunch of really dedicated people who are working for this company. They have been asked to solve really, really horrendous societal issues, or at least to address them on the platform, such as people being sold into indentured servitude in the Gulf states on their platform.
And they sort of recommend the company do things. And then, oftentimes, the follow-through just isn't there.
One of your stories that has gotten a lot of reaction is about Instagram, the photo-sharing app that Facebook owns.
You write that Facebook's and Instagram's own research shows the harmful effects this app has or can have on young people, and especially teenage girls.
Yes, so this is something that, over the last — since basically 2018-2019, they have been researching what they call negative social comparison.
And I think at least, listen, when I started this stuff, I would have assumed that sort of Instagram's kind of like high school or something, right, in the sense that there's going to be social pressure, but we all get through it for the most part.
I think the thing that was really surprising was how heavily it seemed to impact people who Instagram identified as already vulnerable. So, "we make body image issues worse for one in three teen girls" is a conclusion that they drew internally and presented to management. And, in fact, they found that, in some instances, among young women who had thought about self-harm within the last month, 6 percent traced that thought directly back to Instagram.
So we're not talking necessarily huge numbers of people, but it is potentially life-or-death issues here.
Instagram's head of public policy provided us with a statement about that story.
She said: "We take these findings seriously. And we set up a specific effort to respond to this research and change Instagram for the better."
These issues that you have uncovered about Facebook, to what extent do they drive the bottom line for Facebook? Do they drive its revenue, sort of define what Facebook is?
Yes, I think Facebook has really built itself around the idea that the more Facebook is used, the better off the world is in general.

And so they have engineered this system to be as sticky as possible, to keep you coming back as often as possible, and to be as entertaining as possible. Focusing on the quality and the types of content that succeeded wasn't really the focus.
So I think it's becoming more of a focus now. They're thinking about it more, but it never has been the focus. And there still is a very large company attached to this work that wants to just sort of keep on doing the things that made it big and successful.
I think, with the teenage mental health in particular, it's extremely awkward, because they have done this really good research. They have invested in understanding the problems in ways that I'm not convinced other tech giants actually have.
But the problem is that the findings were that some of their products' key features are uniquely problematic, that Instagram focuses attention on the body, unlike competing social media apps, which focus on the face or on performance.
And there's not an easy way to sort of get around that. So I think this is kind of a situation where they have realized the negative side effects of what they do, but I'm not totally clear that there's an easy way to address them, nor are they.
"The Facebook Files," a series in The Wall Street Journal.
Jeff Horwitz, thank you very much.