NEW DELHI – 93 percent of all hate speech posts reported to Facebook by monitoring group Equality Labs remain on the platform — including content advocating violence, bullying and use of offensive slurs, according to a new report from the South Asian advocacy group, which is dedicated to ending caste-based discrimination, Islamophobia and religious intolerance.
Facebook’s inability to curb hate speech is disproportionately harming India’s Muslim minorities and at times spilling over into real-world violence, according to the report, which draws worrying comparisons between the situation in India and the platform’s failures in Myanmar, where it was used to fuel violence against the Rohingya Muslim minority.
The report notes that without timely intervention, these conditions could trigger large-scale violence.
“With an estimated 350 million+ Indian caste, religious, gender, and queer minorities currently at risk from this hate speech in India, this report provides timely and expert analysis and solutions. Informed partly by actual affected users, the insight and answers in the report provide a road map for stakeholders from multiple vantage points to help counteract a looming human rights disaster. The authors warn that without urgent intervention, such hate speech is likely to be weaponized as a trigger for large-scale communal violence in India,” the authors said.
OVERRUN BY ISLAMOPHOBIA
Facebook has faced near ceaseless criticism at home and abroad for the often-unchecked megaphone it provides to hate mongers and merchants of disinformation. In India, those flaws appear super-charged and directed primarily at one community: Muslims. According to the report:
- Islamophobic content was the biggest source of hate speech on Facebook in India, accounting for 37 percent of the content reported by Equality Labs. Fake News (16 percent), casteism (13 percent) and gender/sexuality hate speech (13 percent) were the next biggest groups.
- 43 percent of the hate speech Facebook initially removed was restored within 90 days, and 100 percent of these restored posts were Islamophobic in nature.
- Facebook repeatedly states it responds to the majority of reports in under 24 hours, but Equality Labs found that the median response time in India was 48 hours.
Facebook said it has removed some of the content Equality Labs flagged as breaching its Community Standards, though the company said it has not seen the full report. But Facebook did not respond to a question about why so much of the removed content later reappeared on the platform.
Overall, the researchers pinned the blame squarely on Facebook, which they described as ill-equipped and unprepared to deal with the torrent of hate speech on its platform. With almost 300 million active accounts and potentially hundreds of millions more still to join, India is Facebook’s biggest market and its most challenging, with unique obstacles to overcome. “Indian religious and socio-political contexts are complex enough to require their own review and co-design process to adequately address safety,” the report said.
But instead of tailoring a solution to India’s specific challenges, Equality Labs says, the company continues to rely on community standards and practices designed for Western markets that don’t track with conditions in India.
The problem is two-fold.
First, Facebook’s moderators have not been trained to properly understand the nuance and cultural context of posts in dozens of languages, Equality Labs said.
Second, Facebook only supports eight of India’s 22 official languages, so community standards and reporting mechanisms are often available only in English — meaning users don’t even know how to flag hate speech. To paper over the cracks, Facebook continues to rely on an army of volunteer translators to handle issues in the languages it doesn’t support.
“If they have enough money to enter the market, shouldn’t they have enough money to protect the users in those markets, particularly as they make money off the violence they face?” said Thenmozhi Soundararajan, executive director of Equality Labs.
The rise of Islamophobic hate speech on Facebook has coincided with a rise in real-world violence against Muslims in India, which has been fomented in part by increasingly divisive national politics. According to a recent study, Muslims were the victims of 59 percent of cases of religiously motivated violence — even though they make up less than 15 percent of the population.
Considering the current environment in India, Facebook has no excuse not to have had a better response plan in place to address Islamophobia, Soundararajan said, nor should the company have been surprised, particularly in the wake of the atrocities in Myanmar.
“As early as 2013, Facebook knew the content on its platform could lead to large-scale communal riots,” Soundararajan said. She points to Facebook’s role in helping to instigate the Muzaffarnagar riots, which left more than 50 people dead and displaced more than 75,000 from their homes. “Many say these riots were sparked by videos which were spread in part on Facebook.”
PEPE THE FROG TRAVELS TO UTTAR PRADESH
The report highlights a range of hate speech that circulates on Facebook in India. Among the most surprising was the proliferation of Pepe the Frog, the image favored among American white supremacists. In India, the internet meme was used to glorify the 1992 desecration of the Babri Masjid mosque in the Ayodhya district of Uttar Pradesh state by Hindu nationalist mobs, an act that triggered riots across India and the killing of hundreds of innocent Muslims.
The use of Pepe the Frog, considered an anti-Semitic hate symbol by the Anti-Defamation League, shows the common language of hate speech across the globe. Facebook knows this too. Documents uncovered by Motherboard a year ago show the company has a specific policy for Pepe that doesn’t ban the image outright but deletes it when shown “in the context of hate, endorsed by hate groups to convey hateful messages.”
The report also reveals a worrying crossover with the hate speech problems Facebook encountered in Myanmar. According to Equality Labs, 6 percent of all Islamophobic posts researchers examined were anti-Rohingya posts. Facebook users labeled Rohingya “cockroaches” and posted screenshots from a debunked video claiming to show Rohingya slaughtering and cannibalizing Hindus.
“Clearly something is wrong with Facebook moderation when it comes to Rohingya centered hate speech and given the precarious conditions Rohingya face in India and across South Asia, this issue must be dealt with immediately,” the report says.
Ultimately, the problems facing Facebook in India stem from its failure to engage with activists and groups in the country, Equality Labs said. And simply hiring more staff won’t solve them.
“Facebook staff lacks the cultural competency needed to recognize, respect, and serve caste, religious, gender, and queer minorities,” the report says. “The hiring of Indian staff alone does not ensure cultural competence across India’s multitude of marginalized communities.”
Facebook did engage to some extent with activists in India, and at the company’s South Asian Safety Summit held in Delhi last fall, Equality Labs presented an early draft of its findings — but the process was “slow and oftentimes did not address the structural problems our report outlines,” Soundararajan said.
The activists are now calling on Facebook to conduct an independent, third-party human rights audit on the problems in India, similar to the civil rights audit it is conducting in the U.S.
“Facebook is complicit with the extremism that is pulling apart Indian society and it must act before it is too late,” Soundararajan said.
Based on the study, the report offers recommendations to tackle the growing menace of hate speech and fake news on Facebook.
The report suggested that Facebook must prioritize an aggressive and thorough human rights audit of the following:
• The human rights impact of Facebook policy and programs and how the platform has been used by hate groups, political entities, and other public figures to stoke casteist and religious animosity or violence
• What risk assessments, if any, were conducted to improve understanding of the threats faced by Indian minorities on the platform
• Facebook’s hiring practices, especially with respect to safety, policy, and content moderation teams.
• Content Moderation, which should include hiring practices, contractor demographics, and slur lists. These lists should be open and transparent to the public.
• User Privacy
• Targeted Advertisements
• Security Policies
• Facebook’s elections and government unit’s work from the 2014 elections through the 2019 elections.
• Empowering an independent audit team that is approved and monitored by civil society and Internet freedom advocates as well as by Facebook.
• This audit team must have clear competencies in caste, religious, and gender/queer minority issues and include members of Indian minorities in its composition.
• Research comparing impacts across India’s discrete language markets for analysis of implementations.
• Determinations regarding risk prevention, mitigation, and remediation plans for vulnerable communities.
• Revision of policies and practices to address the human rights risks identified.
• We recommend a regularly convened working group of Indian Internet freedom and civil society groups that work on the issues of caste, religious, and gender/queer minorities. This group could work actively to counter casteist and religious bigotry while also helping provide input into Facebook’s policies and processes.