
When Facebook’s Staff Flagged Criminal Content, Its Response Was Weak – Tech News Briefing

This transcript was prepared by a transcription service. This version may not be in its final form and may be updated.

Zoe Thomas: This is your Tech News Briefing for Friday, September 17th. I’m Zoe Thomas for The Wall Street Journal.
Mexican drug cartels, human traffickers in the Middle East, and armed militia groups in Asia and Africa have all used Facebook platforms to recruit, conduct illegal business, or incite violence. What’s more, scores of documents reviewed by The Wall Street Journal show that Facebook employees have warned company executives about this criminal behavior in developing countries. But in many instances, the company’s response has been inadequate or non-existent.
On today’s show, the latest installment of The Wall Street Journal’s investigative series, The Facebook Files. Reporter Justin Scheck joins us to discuss why Facebook struggles to remove dangerous or criminal content and the catastrophic effect it can have on countries and individuals. That’s after these headlines.
China is trying to bring low-level tech workers onto its side as the government continues to battle big tech. Last week, officials summoned some of the country’s biggest tech companies, including Alibaba, Tencent, Didi, and Meituan, and pressed them to improve conditions for tens of millions of gig workers. China’s ruling Communist Party has said its efforts come from wanting to shrink the country’s wealth gap. Activists and scholars say it’s really a broader effort to tighten control.
The US, UK and Australia agreed to a new security partnership that’ll help Australia build nuclear-powered submarines that are faster and stealthier and can remain underwater longer than traditional submarines. Until now, the US had only shared this technology with the UK. The deal also includes plans for cooperation in cyberspace and the development of AI, as well as quantum technologies and undersea capabilities. US officials declined to say if the effort was intended to counter China’s growing influence in the Indo-Pacific region, but the Biden Administration has made that part of the world a priority.
And the trial of former Theranos CEO Elizabeth Holmes is wrapping up its second week. Prosecutors have been laying out their case that Holmes and former Theranos President Sunny Balwani misled patients and investors about the capabilities of their blood-testing technology. Holmes has pleaded not guilty to 10 counts of wire fraud and two counts of conspiracy to commit wire fraud. WSJ reporter Sara Randazzo has been following the trial all week.

Sara Randazzo: We had two witnesses on the stand. Up first was a woman named Denise Yam, who was the longtime corporate controller at Theranos and effectively the highest financial employee for a number of years. And so she talked about how, ultimately from 2003 to 2015, the company logged $585 million in losses. So it really painted a picture of how not enough revenue was ever coming in the door to support the expenses that they had. And then the second witness we had this week was a woman named Erika Cheung, who was the whistleblower for the company. And so she was a young employee who joined out of UC Berkeley. And so she joined with all this hope and expectation about this company and what it could do, but then pretty quickly realized that the company’s proprietary machine just didn’t work as was promised.

Zoe Thomas: Erika Cheung is expected to take the stand again today and more former employees are expected to be called next week.
All right, coming up, advertisements for sex trafficking, a photo of severed hands, and calls for ethnic cleansing have all been posted on Facebook in developing countries. Why isn’t the social media giant doing more to stop it? We’ll discuss after the break.
Facebook is relying heavily on user growth from developing countries. More than 90% of monthly active users are now outside the US and Canada, but the company has struggled to police how criminal organizations and violent groups in those places use its platforms. It’s hired teams of experts in criminal investigation, human trafficking, and other wrongdoing to search Facebook and Instagram for nefarious activity and root out bad behavior. But when these cases were flagged by employees, documents reviewed by The Wall Street Journal show, the company’s response has often been weak or completely absent. Joining us to discuss this is our reporter Justin Scheck. Hi, Justin.

Justin Scheck: Hi. Thanks for having me.

Zoe Thomas: Let’s talk about some of the instances that were flagged by Facebook’s own investigative teams. In your story, you talk about a Mexican drug cartel that was using Facebook. Can you explain what happened and what Facebook’s response was?

Justin Scheck: Yeah. So there was a team of investigators led by a former police officer that was observing criminal content on Facebook’s platforms. And they found that a famously violent Mexican drug cartel was using Facebook products to recruit and organize the training and payment of hitmen. And so in response to this, Facebook took down a lot of posts by the cartel and they took down the posts of people who they found to be part of this network, but nothing was done that could effectively prevent new cartel material from going onto Facebook. And so what we found reviewing the site is that nine days after this report was circulated within Facebook, a new post went up on Instagram, under an account name for the cartel, and it was a video of a person with a gold pistol shooting someone in the head. That was followed by other violent posts, including a man being beaten and someone holding a trash bag full of severed hands. This content stayed up for at least five months. And we know that because we found it by searching Instagram. It appears to have been taken down now, but it took a while.

Zoe Thomas: Why didn’t Facebook just block the cartel from using its platforms?

Justin Scheck: That’s not clear. The documents say that this cartel had been designated in Facebook’s systems as a dangerous organization and therefore should have been automatically prevented from posting things. It’s not clear why the company didn’t take more effective action to keep their content off. Facebook told us that its employees know that they can improve their anti-cartel efforts. And they said the company’s investing in artificial intelligence to help bolster its enforcement against these groups.

Zoe Thomas: Now, they also had an issue with human trafficking on their site that rose to a point that even Apple was threatening to take Facebook off its App Store. How did that happen and what was Facebook’s response?

Justin Scheck: Well, these documents show that for several years, the company says since 2015, the documents we could see going back to 2018, people in the company were very concerned that people were using Facebook to recruit other people into coercive labor situations, and then in some cases to sell those people to employers in the Middle East under circumstances that even Facebook said were coercive and amounted to trafficking. They pointed this out internally and there was discussion about it, but they didn’t take very effective measures to address it.
And then in 2019, the BBC was working on a documentary about this issue, about people being bought and sold in Gulf countries. And they contacted Facebook and said, “We’re working on this. Can you comment on it?” In response, Facebook took down some pages, but didn’t do anything more than that. The BBC then contacted Apple and said, “We’re going to report that apps in your App Store are being used to sell people.” And so Apple contacted Facebook and said, “If you don’t take measures to prevent this, we are going to take Instagram and Facebook out of our App Store.” In response to that, Facebook took down much more content and took much more action to try to prevent the content from going up, but it still wasn’t terribly effective and there’s still content selling people for labor in the Middle East on their platforms.

Zoe Thomas: And then I want to talk about what happened in Ethiopia, where Facebook was used by violent groups, because I think this highlights a problem that a lot of native English speakers maybe don’t think about.

Justin Scheck: Internally, Facebook researchers have found that violent groups in a bunch of countries are using the platform to incite violence and to spread hate speech. It’s something that’s been, I think, well documented in Myanmar against the Rohingya minority, but the internal documents show that Facebook knows it’s happening in other places, including Ethiopia. And it is much more of an issue in other languages, partially because there’s a lot more violence in many of these non-English-speaking countries, and partially because the company invests a lot less in technology to catch this stuff and they don’t have enough moderators who speak these languages to effectively get rid of it.

Zoe Thomas: So it seems then, given Facebook’s content moderation efforts and kind of where they’re focused, that there’s sort of this two-tiered system when it comes to Facebook safety efforts.

Justin Scheck: What we found in these documents is that when it comes to fighting misinformation, 87% of the man-hours they spend on that are in the US and 13% are in the rest of the world. And part of that, from what former executives tell us, is that it’s much more consequential if you get in trouble in the US, which has much stronger regulators and governments. The consequences for Facebook’s business are probably much more serious if something goes deeply wrong in the US than if something goes deeply wrong in one of these other countries where the user base is growing a lot more, but there isn’t the same kind of regulatory risk.

Zoe Thomas: Justin, anyone who’s watched Facebook over the last few years knows that they’ve made kind of a big deal about their content moderation efforts and how much inappropriate content they’ve taken down. So how has this situation emerged? It seems really surprising that so much dangerous and violent content could be up there.

Justin Scheck: Well, Facebook has this dilemma, I guess, where they say that they’re not a publisher like The Wall Street Journal, which is responsible for the content it disseminates. They say they’re a platform, and this is an issue across the tech industry. It’s a platform that other people use to post things, and Facebook’s not liable for that stuff. On the other hand, they are spending very large amounts of money trying to prevent bad content from being on the site. So they’re in a difficult position where they’re kind of forced to take responsibility for some of the bad behavior, but they also say they’re a platform that’s not necessarily responsible for it.

Zoe Thomas: All right. And ultimately what’s at stake for Facebook users, given these different practices for content moderation around the world?

Justin Scheck: Well, outside of countries like the US, where there’s relative safety, there’s not a lot of violence, there’s not a lot of danger, the consequences for people of bad behavior on Facebook can be much worse. Meaning in the US, if there’s misinformation or even hate speech, that generally doesn’t translate into large-scale interethnic violence. Whereas in other countries that are less wealthy and less stable, content on Facebook that could be harmful is much more likely to translate into real-world harms, like physical violence and like some of the other issues we’ve been discussing, human trafficking and cartel activity, that kind of thing.

Zoe Thomas: All right. That’s our reporter Justin Scheck. Thanks for joining us, Justin.

Justin Scheck: Thank you.

Zoe Thomas: Before we close out the show, we want to give you one more opportunity to send us your questions on Apple. We’ll be featuring them in an upcoming episode. If you’re wondering about Apple’s new line of products, what’s next for the App Store, or if you have any questions about privacy, leave us a voicemail at 314 635 0388. We’re planning to answer listeners’ questions in an upcoming episode. Once again, that number is 314 635 0388. And that’s it for Tech News Briefing this week. Our producer is Julie Chang. Our supervising producer is Chris Zinsli. Our executive producer is Kateri Jochum. And I’m your host, Zoe Thomas. Thanks for listening and have a great weekend.