Zuckerberg Resisted Fixes for Facebook’s Divisive Algorithm – Tech News Briefing

This transcript was prepared by a transcription service. This version may not be in its final form and may be updated.

Zoe Thomas: This is your Tech News Briefing for Thursday, September 16th. I’m Zoe Thomas for the Wall Street Journal. A few years ago, Facebook made some changes to how it decides what users see when they log in. Those changes unintentionally led to more divisive content rising to the top. And we report that even when concerns were raised about it, CEO Mark Zuckerberg resisted some of the proposed fixes. On today’s show, the latest installment of the Wall Street Journal’s investigative series, the Facebook Files. Reporter Keach Hagey joins us to discuss the hidden reasons behind the company’s decision to change its algorithm and the impact it’s had on users. That’s after these headlines.
DoorDash is suing New York City again. This time over a bill the city approved in the summer that would require food delivery companies to share more information with restaurants. Data, including a customer’s name, phone number, email, and delivery address. In a lawsuit filed Wednesday, DoorDash, the biggest food delivery app in the US, called the law an intrusion on consumers’ privacy, arguing it doesn’t include restrictions on how restaurants can use or store that data. Last week, DoorDash joined forces with rival companies GrubHub and Uber to sue New York City over another law that puts permanent commission caps on what apps can charge restaurants. A city representative didn’t immediately respond to a request for comment on the new suit.
A bipartisan proposal in the Senate aims to curtail law enforcement agencies’ use of commercial data brokers. These brokers typically sell information to marketers and advertisers. But over the last few decades, they’ve created products that cater to law enforcement, who’ve used them to track suspects and missing people. Privacy advocates say it’s the equivalent of warrantless surveillance. Our reporter Byron Tau has more on the debate.

Byron Tau: I think privacy activists would say that most Americans don’t have the faintest idea that things like weather apps or their connected car or their utility company are putting up lots and lots of data for sale about them that they would think is protected and not just purchasable on an open market. And the privacy activists would say that that kind of data reveals a lot of sensitive things about Americans, about where they shop, where they go, what doctors they visit, all sorts of very intimate details, and that should be protected. It should require judicial supervision and probable cause of a crime before a law enforcement entity can access it.

Zoe Thomas: All right, coming up, Facebook says its platforms are forming positive connections among users. So why did its CEO resist some changes to reduce divisiveness? We’ll discuss after the break.
Facebook has always held that its algorithm, the code that decides which posts get shown to users and in what order, is designed to create meaningful connections. Here’s CEO Mark Zuckerberg before Congress in March.

Mark Zuckerberg: Now I know that technology can help bring people together. We see it every day on our platforms. Facebook is successful because people have a deep desire to connect and share, not to stand apart and fight.

Zoe Thomas: But the Wall Street Journal reports that changes Facebook made to its algorithm have unintentionally had the opposite effect, rewarding divisive and angry posts. What’s more, the company knows that and Zuckerberg himself has been reluctant to make changes. Joining us to discuss how this unfolded is our reporter Keach Hagey. Hi Keach.

Keach Hagey: Hey.

Zoe Thomas: Keach, we are going to talk about the impact that changes to Facebook’s algorithm had and the reaction they received. But let’s start with discussing what News Feed is. For people who don’t use Facebook or maybe haven’t been on in a while, what does Facebook look like now and why is it so important to the company?

Keach Hagey: News Feed is Facebook’s central feature. So it’s that scroll of baby pictures and updates about whose dog died that is really the most important part of the core Facebook app and the main way that Facebook as a company makes advertising revenue of which it makes a rather great deal.

Zoe Thomas: So in 2018, Facebook decided to change the algorithm that controls News Feed because it said it wanted to encourage more meaningful interactions between friends and family and stop people spending time just passively scrolling, which research suggests is harmful to mental health. So they developed this new algorithm focused on something called MSI. Can you tell us what MSI is and how the changes to the algorithm worked?

Keach Hagey: So MSI stands for meaningful social interaction and it was a new emphasis of the algorithm that was trying to encourage people to like and comment and interact with each other more. So they used a formula to do this and there was a point system. A like was one point. A reaction, which could be something like an angry emoji, that was five points. And something like a comment could be 15 points or, if it was a significant comment, 30 points. So from this formula, which was one of the initial formulas, you can see that they were really trying to put a lot of emphasis on comments, getting people to talk to each other, and what ended up happening was that that really encouraged arguing.
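The point system Keach describes can be illustrated with a short sketch. The weights (like = 1, reaction = 5, comment = 15, significant comment = 30) come from the reporting; the function name and structure here are hypothetical, not Facebook’s actual implementation:

```python
# Illustrative sketch of the initial MSI point system described in the
# reporting. Only the weights come from the interview; everything else
# (names, structure) is a hypothetical reconstruction.
MSI_WEIGHTS = {
    "like": 1,
    "reaction": 5,             # e.g. an angry emoji
    "comment": 15,
    "significant_comment": 30,
}

def msi_score(interactions):
    """Sum weighted interaction counts for a post under the initial formula."""
    return sum(MSI_WEIGHTS[kind] * count for kind, count in interactions.items())

# A post with 10 likes, 2 angry reactions, and 3 comments:
# 10*1 + 2*5 + 3*15 = 65 points
print(msi_score({"like": 10, "reaction": 2, "comment": 3}))
```

Under weights like these, a single heated comment thread outscores dozens of passive likes, which is consistent with the reporting that the formula ended up rewarding content that made people argue.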

Zoe Thomas: We know what Facebook said publicly about why it wanted these changes, but privately, your reporting has shown, they had some other reasons.

Keach Hagey: Right. So of course publicly, Mark Zuckerberg said that this was to improve users’ mental health. But according to the documents that we’ve seen, during 2017, the year before they made this large algorithm change, people inside the company were starting to panic because they were seeing that key measures of engagement were declining and in some cases going into what executives or employees called freefall. So things like likes and comments were declining and they needed to do something to turn that around, because they worried that if people were engaging less with Facebook and if they were just lying back and passively watching 10-minute videos at a time, they would realize that this was bad for them. They would kind of snap out of it and they would stop using the platform altogether.

Zoe Thomas: All right. Once Facebook made these changes, what were the effects that people started to see? What did users notice?

Keach Hagey: Users of different kinds noticed that things that provoked outrage did really well on Facebook. And that had always been a little bit true, but this new algorithm and its emphasis on engagement made it much worse. So some really expert users of Facebook were publishers. That would be a company like Buzzfeed. In these documents, we see the CEO of Buzzfeed emailing an executive at Facebook saying, “This thing that is called meaningful social interactions, it does not boost meaningful social interactions. In fact, it incentivizes people like us and our competitors to make really divisive sensationalistic content that makes people argue.” And so they saw that the most sensationalistic and divisive content was doing the best.

Zoe Thomas: So we know Buzzfeed raised this as an issue with Facebook. Did other publishers or other users of Facebook raise concerns?

Keach Hagey: Yes. In these documents, we see that political parties around the world, particularly in Europe, told Facebook’s own researchers who were looking into the question of, okay, how did this algorithm change affect political discourse? Political parties told Facebook that it forced them to put more divisive and sensationalistic messages out on the platform. So if you wanted to communicate in order to get distribution, you basically had to attack your opponents and then get those attacks to create negative comments, and then use those negative comments, or fighting, as sort of a tailwind that would propel you throughout the Facebook ecosystem and give you distribution.

Zoe Thomas: Once Facebook started hearing that this was happening, what was its reaction and what was CEO Mark Zuckerberg’s reaction?

Keach Hagey: So inside Facebook, there is a team called the integrity team that is there to help mitigate some of the harms that Facebook causes and especially to increase the quality of content on Facebook. And members of this team proposed a range of fixes to try to tweak MSI, to undo the worst parts of it, which are really the parts that relied so heavily on engagement. And a few of these discussions were proposed to Mark Zuckerberg, and one particular fix was to undo or turn down one particular aspect of this algorithm that basically boosted content in your feed based on Facebook’s own guess about how likely you, and people downstream from you in the chain, would be to pass it on to others. So basically how likely it was that this piece of content would generate a long chain of reshares. It’s called downstream MSI. And the researchers could tell that this thing was really tied to misinformation and tied to toxicity and divisiveness. And so they proposed first to get rid of it or turn it down for civic and health content.
And Facebook did actually turn it down for those categories, and the team then proposed broadening that effort to other categories. And when they brought that idea to Mark Zuckerberg, Mark basically said no. In this memo that we saw, the person who did that said, “Mark doesn’t want to go broad with this fix. Maybe we could test it a little bit, but not interested in going broad and definitely not interested in launching it if it would have an impact on MSI,” meaning he was not willing to suffer any meaningful decline in engagement in order to fix this problem. So you really see two things. You see a company that is interested in trying to solve these problems and has put a lot of resources behind measuring the problems. But then you also see that as they flag their concerns and as they move up the chain, when they get all the way up to the top, it again and again appears to be not worth the trade-off for growth and for profit to make these changes.

Zoe Thomas: Have we heard from executives now about the impact of these changes and what they think of the Wall Street Journal’s reporting on it?

Keach Hagey: Yes. We have spoken to several Facebook executives about it and their main points are, any algorithm is going to have its problems. Any algorithm is going to risk giving some folks who want to game it a leg up. And that’s why we have an integrity team. And the integrity team is in fact the group that made a lot of these reports that we saw, they were flagging the problems. So that’s one thing and their broader point really is, “Look, we didn’t create divisiveness in society. It was there before Facebook. You cannot say that we are responsible for the full state of our divided discourse today.”

Zoe Thomas: And what changes has Facebook made?

Keach Hagey: So actually just recently Facebook made a change that was very similar to the one that was proposed to Mark Zuckerberg almost a year and a half ago and that he gave such a cool reception to. This change was part of a broader push that the company announced in the wake of the January 6th riots, because of course Facebook had come under a lot of criticism for the way that election protesters had been using the platform both to dispute the election and to organize the protest that led to the riots. So in the wake of those, in February of this year, Facebook announced that it was going to try to reduce the amount of political content in News Feed overall, and recently they gave kind of an update to that effort. And as part of that update, they said one of the things we’re going to do is we are going to stop putting so much emphasis on content that we think is going to be reshared or get comments.

Zoe Thomas: All right. That’s our reporter, Keach Hagey. Thanks for joining us, Keach.

Keach Hagey: Thank you.

Zoe Thomas: And that’s it for today’s Tech News Briefing. But before we close out our show, we want to give you another opportunity to send us your questions on Apple. On Tuesday, the company unveiled its latest line of iPhones and a new smartwatch. So are you thinking of upgrading? Do you have any questions about privacy, what to do with your old devices or Apple’s competition with other tech giants? Leave us a voicemail at (314) 635-0388. We’re planning to answer these questions in an upcoming episode. Once again, that number is (314) 635-0388. We’re looking forward to hearing from you. And remember, you can always find more tech stories on our website, wsj.com. And if you like our show, please rate and review it. You can do that wherever you get your podcasts. I’m Zoe Thomas for the Wall Street Journal. Thanks for listening.
