

The Facebook crisis in India might be the worst Facebook crisis of all.

Thanks to whistleblower Frances Haugen’s testimony and the news articles based on documents she leaked, the public has gained an alarming new perspective on how Facebook ignored, downplayed, or failed to adequately address harassment, mis- and disinformation, and incitements to violence on its platform in several major countries. The documents, which CNN claims could be “the biggest crisis” in Facebook’s history, have revealed just how the network became an incubator of hate and terrorism from the U.S. to Ethiopia. Yet the most shocking revelations concern the nation that serves as the app’s biggest user base: India, the world’s largest backsliding democracy.

Reports of social media–fueled horrors within India—attacks on Muslims, lower-caste peoples, women, the poor, and refugees—have been troublingly commonplace for a half-decade now. Yet what the Facebook Papers confirm is not just that the network failed to curb Hindu nationalist hate speech and devoted inadequate resources to monitoring a nation with 340 million users, but that it actively granted impunity to the worst offenders. And according to Haugen, this was one of her foremost concerns when she began to reach out to reporters with the internal information she held.

Let’s start with the first point. According to the Verge, at the end of 2019 Facebook placed India within “tier zero,” meaning it was one of the countries of utmost priority for the network’s harm-reduction efforts. (The timing here is noteworthy: India had then erupted in mass protests against an Islamophobic law, protests that continued until an ugly anti-Muslim pogrom months later effectively deflated them.) Yet Facebook was far more focused on another “tier zero” country: the U.S. Per the New York Times, “Eighty-seven percent of the company’s global budget for time spent on classifying misinformation is earmarked for the United States, while only 13 percent is set aside for the rest of the world.” In other words, the bulk of attention was directed toward a country that doesn’t even have as many people as India has Facebook users. Furthermore, the company’s “misinformation classifiers”—automated systems that use machine learning to detect and take down posts containing harmful falsehoods—were not developed enough to recognize and act on the millions of multilingual disinformation posts that proliferated across Indian feeds.

The result was that Facebook simply missed huge amounts of misinformation, even as it kept touting the ability of its internal tech to detect and take down false news. The company invested in uncovering hate speech written in Hindi and Bengali, two of India’s major languages—even though the country as a whole has 22 constitutionally recognized languages and hundreds more dialects. These limitations also meant that Facebook was ill-equipped to stop the online trafficking of Indian domestic workers. Not to mention, there were spillover effects from inaction in other countries: Myanmar’s ethnic cleansing of Rohingya Muslims was amplified on the network, and thousands of refugees streamed into India, where they’ve faced further persecution.

Facebook also clearly took sides when it came to Indian violence.

When Narendra Modi was elected India’s prime minister in 2014, there were plenty of reasons for any democracy-favoring thought leader to be concerned: his oversight of fatal Hindu-Muslim riots when he was chief minister of the state of Gujarat, his historic ties to the Hindu nationalist Rashtriya Swayamsevak Sangh organization, and his political party’s use of Islamophobia as a campaign tactic—a successful one for the far-right Bharatiya Janata Party, many of whose members came up through the RSS. Yet none of this dissuaded Facebook CEO Mark Zuckerberg from literally embracing Modi during his first years in power. After all, Modi’s campaign made ample use of Facebook and other social networks, and the new PM was eager to work with Silicon Valley firms to modernize India’s internet experience.

But as I noted just earlier this year, reports of BJP-linked cells using Facebook and WhatsApp to spread toxic rhetoric and lies surfaced as early as 2016; more such troll operations proliferated in the subsequent years, both within and outside election contexts, and led directly to lynchings of religious minorities and riots stirred up by aggrieved Hindus. As revealed by the leaked documents, Facebook did carry out thorough probes of the rot in Indian social media content; an employee who spearheaded the research noted that “I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total. … [The feed] is a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.” According to the Wall Street Journal, investigators zeroed in on two BJP-linked Hindu nationalist organizations they pinpointed as key drivers of mass Islamophobia, the RSS and the Bajrang Dal, and recommended that the latter be banned. But it didn’t happen, as the company worried that removing the Bajrang Dal would anger Modi. The Journal revealed last year that the then-head of Facebook India, Ankhi Das, opposed applying hate speech rules to Hindu nationalists and BJP politicians who aimed to spark violence—perhaps because she herself was ideologically sympathetic to the country’s Hindu nationalists. (Das stepped down in October 2020.)

We now know for sure that Facebook was fully aware of the RSS’s anti-Muslim crusade and did nothing to address the root issue. And even with this level of appeasement, Modi and his government kept turning their wrath on Facebook for its belated banning of a few Hindu nationalist figures who advocated for killing Muslims. It didn’t matter that Facebook had also removed fake information pages started by opposition politicians and the Pakistani military; any reprimanding of the BJP crossed the line. This year, Modi’s government has cracked down the hardest it ever has on Facebook and other social networks, forcing them to remove posts unfavorable to the BJP, condemning them for spreading content supposedly offensive to fundamentalist Hindus, and threatening to fully expel them if they don’t follow new, restrictive rules drawn up by the Ministry of Electronics and Information Technology meant to ensure compliance. The message is clear: If Facebook doesn’t follow the BJP’s Hindu nationalist dogma to a T, it can kiss its largest market goodbye.

Ironically, the end result of Facebook’s willingness to tolerate political leaders’ religious violence and whittling away of democratic institutions may be a further emboldened BJP—which was catapulted to power in part by the influence of Facebook—that could decide it no longer has a need for the world’s most powerful social network. That’d be one hell of a way to end this partnership.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
