
Sex, Drugs and TikTok: What the Viral Video App Shows Minors – Tech News Briefing

This transcript was prepared by a transcription service. This version may not be in its final form and may be updated.

Zoe Thomas: This is your Tech News Briefing for Thursday, September 9th. I’m Zoe Thomas for the Wall Street Journal. A lot more of us downloaded TikTok during the pandemic. Its highly addictive algorithm hooked us with viral dance trends and silly pranks. Showing us one video at a time, it learns what we like to watch and feeds us more of it. This approach helped make TikTok particularly popular with younger users. But the algorithm can also lead users to increasingly extreme content. On today’s show, a Wall Street Journal investigation found TikTok’s algorithm can drive its youngest users, aged just 13 to 15, to videos about sex and drugs. Investigative reporter Rob Barry joins us to discuss how young users can fall into this cycle and why it’s difficult for those posting adult content on TikTok to prevent it. That’s after these headlines.
Opening arguments took place yesterday in the criminal case against former Theranos chief executive Elizabeth Holmes. The government alleges Holmes, once a Silicon Valley darling whose blood-testing startup was valued at over $9 billion, lied to patients and investors about the accuracy of her company’s technology. She’s charged with 10 counts of wire fraud and two counts of conspiracy to commit wire fraud. She’s pled not guilty. The Journal’s Heather Somerville is covering the trial for us. She was there as Holmes arrived at the courthouse flanked by family members and was met by a stampede of photographers.

Heather Somerville: Once inside court, government prosecutors laid out their case for the jury. They told the jury about the fraud they alleged Ms. Holmes had committed: deceiving investors, as well as patients, about the accuracy and reliability of her blood-testing technology, and the harm that caused, costing investors hundreds of millions of dollars and giving patients false test results about cancer and pregnancy. Then Ms. Holmes’ attorneys got their turn and told the jury about a very hardworking young woman who had dropped out of college and given 15 years of her life to the quest to improve lab testing, making it more affordable and less painful for patients, and made the argument that Silicon Valley startups often fail and this one was no different.

Zoe Thomas: You can get live updates on the trial at wsj.com.
GameStop reported sales growth of 26% in the latest quarter, three months after overhauling its leadership team. Quarterly revenue rose to nearly $1.2 billion, up from just under $950 million a year ago. But the video game retailer still reported a loss of more than $61 million. GameStop is in the midst of trying to modernize its business as video game buyers shift away from traditional retail and toward downloading titles. In June, the company named Amazon veteran Matt Furlong as its chief executive. And shareholders voted Chewy co-founder Ryan Cohen chairman, cementing the oversight and control he’d been gaining during the months when meme-stock investors sent the company’s share price soaring.
And Australia’s top court ruled that media companies that post articles on Facebook are liable for comments on those posts. The High Court of Australia determined that newspapers and television stations should be considered publishers of comments because posting content on Facebook facilitates and encourages comments from users and therefore they should be responsible for any defamatory content in their comment sections. Media companies involved in the case, including News Corp Australia, a subsidiary of News Corp, which owns the publisher of the Wall Street Journal, criticized the ruling which could prompt traditional publishers to rethink how they engage with social media.
All right, coming up: sex, drugs, and viral dance videos. A Wall Street Journal investigation found TikTok’s algorithm can direct minors to inappropriate content along with harmless trends. We’ll explain how it happens after the break.
TikTok is the fastest-growing social media site. As we’ve talked about on the show before, one thing that makes it so popular is how it seems to get into users’ heads, serving up a continuous stream of content that appeals to their interests. In July, the Wall Street Journal released an investigation into TikTok’s secret weapon for doing this: its powerful algorithm. The analysis revealed the algorithm really needs just one piece of information: how long you watch a video. And it found users can be sent down rabbit holes of increasingly extreme content. But what does that mean for minors on TikTok, who are among the biggest users of the app? The latest findings from the Journal’s investigation show that TikTok’s algorithm can drive minors into endless pools of content about sex and drugs. Joining us to discuss the findings and TikTok’s response is our reporter, Rob Barry. Hi, Rob.
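To make the watch-time idea concrete, here is a minimal, hypothetical sketch of a recommender that learns from dwell time alone. Nothing here is TikTok’s actual code; the class, the field names, and the weighting scheme are illustrative assumptions.

```python
from collections import defaultdict

class WatchTimeRecommender:
    """Toy recommender that learns only from how long a user watches."""

    def __init__(self):
        # Per-user affinity for each content topic, built purely from dwell time.
        self.affinity = defaultdict(lambda: defaultdict(float))

    def record_view(self, user, topics, watched_sec, length_sec):
        # The fraction of the video watched is the only feedback signal;
        # rewatching pushes it above 1.0 and strengthens those topics faster.
        dwell = watched_sec / length_sec
        for topic in topics:
            self.affinity[user][topic] += dwell

    def score(self, user, topics):
        # Candidate videos are ranked by accumulated topic affinity, so a few
        # long watches quickly tilt the whole feed toward those topics.
        return sum(self.affinity[user][t] for t in topics)
```

In a model like this, a handful of fully watched videos on one topic is enough to dominate the ranking, which is the rabbit-hole dynamic the investigation describes.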

Rob Barry: Hi there.

Zoe Thomas: You were part of the investigative team looking into how TikTok’s algorithm works and what type of content it shows users, particularly young users. Can you explain to us briefly how this analysis worked?

Rob Barry: We created a bunch of accounts, 31 in total, and told TikTok their ages were between 13 and 15 years old. We programmed these accounts with various interests, but we didn’t tell TikTok what those interests were. Instead, we had the accounts just start browsing their TikTok feeds, and when they saw videos that matched the interests we’d programmed, whether through text or images, they would spend more time watching those videos and quickly scroll past the videos that didn’t match.
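As a rough illustration of the behavior Rob describes, here is a minimal sketch of such a browsing bot in Python. The keyword patterns, field names, and dwell times are all assumptions; the Journal has not published its bots’ code.

```python
import re

# Hypothetical interest keywords; the Journal programmed each of its 31
# accounts with different interests but did not disclose its matching rules.
INTEREST_PATTERNS = [re.compile(p, re.IGNORECASE) for p in [r"example_topic"]]

def matches_interest(video: dict) -> bool:
    # The Journal's bots matched on text or images; this sketch checks text only.
    text = f"{video.get('caption', '')} {video.get('hashtags', '')}"
    return any(p.search(text) for p in INTEREST_PATTERNS)

def browse(feed: list[dict]) -> list[tuple[str, float]]:
    """Simulate one session: linger on matching videos, skip the rest."""
    session = []
    for video in feed:
        if matches_interest(video):
            # Watch to the end and rewatch, signaling strong interest.
            session.append((video["id"], video["length_sec"] * 2.0))
        else:
            # Scroll past after about a second.
            session.append((video["id"], 1.0))
    return session
```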

Zoe Thomas: Okay. Let’s break that down a little bit. Can you maybe describe how this worked for one account in particular?

Rob Barry: For one of our 13-year-old accounts, we first signed up and created it, and then the first thing we had the account do was run a search for OnlyFans, which is a social media site that specializes in adult entertainment. It watched a couple of videos that came back from that search, four in total, and then started browsing its feed. We had it programmed to rewatch videos whose text involved sexually suggestive content. And so fairly quickly, as it was browsing its feed, it started getting served videos about sexual topics. In those cases, the bot would stop, rewatch those videos, spend more time on them, and then keep scrolling. And what we found was that before long, the bot was just immersed in this type of content.
On the other hand, just really quickly, there was another account we created that didn’t do any kind of searching at all; it just browsed its feed, but it paused on videos related to a couple of topics, including drugs. That account also found itself in a rabbit hole of marijuana, psychedelics, and other drug-related content. It got hundreds and hundreds of videos like that, back to back, in very quick succession.

Zoe Thomas: It seems pretty shocking that accounts for such young users would be served up this type of content. What was TikTok’s response?

Rob Barry: First and foremost, they said that our accounts did not simulate real users of TikTok, in that real users jump around and do a variety of things. They also said that a lot of the videos served to our accounts didn’t violate TikTok’s guidelines. Though we noted in the piece that after we contacted TikTok about the videos, a large number of them were taken down. We can’t say for certain whether they were taken down by TikTok or by their creators, though TikTok did acknowledge that it took some of them down. It wouldn’t tell us how many.

Zoe Thomas: What about the people who post this type of content? What do they say about young people seeing maybe sexualized videos that they’re making?

Rob Barry: One of the things we noticed was that thousands of these videos had little tags on them, in the user’s description or in the video description, saying the video was basically for adults only, 18-plus. We actually reached out to some of those people, and one of them told us they really wished TikTok did a better job of controlling whether minors can see specific content, for instance by giving creators the ability to flag content explicitly as adults-only. But TikTok told us that, as of right now, it doesn’t differentiate between what it shows a 13-year-old and what it shows, say, a 21-year-old.

Zoe Thomas: All right, what types of rules does TikTok have, then, to protect minors?

Rob Barry: TikTok’s terms of service say that users have to be at least 13 years old and that users under 18 need consent from their parents. The company says it has taken industry-first steps to promote a safe and age-appropriate experience for teens, and that the app allows parents to manage screen time and privacy settings for their children’s accounts.

Zoe Thomas: And of course it’s not the only social media platform that has this type of inappropriate content on it. What did the experts you spoke to say about the implications for young people watching these types of videos?

Rob Barry: One psychologist we spoke to, David Anderson, said that teenagers are particularly vulnerable to seeing large amounts of problematic content back to back. He said they can experience what he called a perfect storm, in which social media normalizes and influences the way they view drugs or other topics.

Zoe Thomas: And for many sites, not just TikTok, removing this type of content is kind of like a game of whack-a-mole. Every time they take something down, a new one pops up. Can you tell us what TikTok is doing, how it tries to remove this content?

Rob Barry: Yeah, look, TikTok acknowledges that no algorithm is going to be perfect at identifying problematic content, that there are challenges here that require some degree of manual intervention, and that people will have to understand that; that’s TikTok’s position. The platform has, I think, about 10,000 people reviewing content, and they do take down a huge number of videos; TikTok said recently it took down 89 million videos in the second half of last year. But tens of thousands of videos are uploaded every single minute. Former executives we spoke to said the platform is just having trouble keeping up with the app’s growth. What that leads to is moderators focusing on the most popular content and essentially leaving videos with lower view counts less reviewed.
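To illustrate the triage pattern those former executives describe, here is a tiny, hypothetical sketch: with limited reviewer capacity, a queue sorted by view count means low-view videos rarely reach a human. This is not TikTok’s system; the capacity numbers and field names are assumptions.

```python
def triage(videos: list[dict], reviewer_capacity: int) -> tuple[list, list]:
    """Send the most-viewed videos to human review; the rest wait in a backlog."""
    queue = sorted(videos, key=lambda v: v["views"], reverse=True)
    return queue[:reviewer_capacity], queue[reviewer_capacity:]

reviewed, backlog = triage(
    [{"id": "viral", "views": 2_000_000}, {"id": "niche", "views": 40}],
    reviewer_capacity=1,
)
# "viral" reaches a human moderator; "niche", the low-view video, stays in
# the backlog, matching the pattern the former executives described.
```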

Zoe Thomas: All right. That’s our reporter, Rob Barry. Thanks for joining us, Rob.

Rob Barry: Thanks so much for having me.

Zoe Thomas: And that’s it for today’s Tech News Briefing. You can always find more tech stories on our website, wsj.com. And if you like our show, why not leave us a rating and review? You can do that wherever you get your podcasts, and it really does help. I’m Zoe Thomas for the Wall Street Journal. Thanks for listening.
