This article exists as part of the online archive for HuffPost India, which closed in 2020. Some features are no longer enabled. If you have questions or concerns about this article, please contact indiasupport@huffpost.com.

Facebook Profits From Hate Speech Aimed At The Marginalised, Activists Say

Thenmozhi Soundararajan of Equality Labs on how Facebook’s business model makes hate lucrative and why there needs to be a human rights audit of the online platform in India.

HYDERABAD, Telangana — Social media giant Facebook has failed users who hail from oppressed communities, said Thenmozhi Soundararajan, the executive director of Equality Labs, a South Asian-American organisation that works on issues of technology and human rights.

Soundararajan, a US-based activist, spoke to HuffPost India ahead of the release of a report by Equality Labs at the RightsCon conference in Tunisia on Wednesday.

The report concentrates on what Soundararajan describes as Facebook India’s “failure to follow their own community standards” to protect the rights of marginalised castes and religious minorities on its platform.

The social network has “used its business model to make hate lucrative to profit off the normalisation of violent hate speech”, said Soundararajan, adding that Equality Labs discovered in the course of its work that the company lacked a cultural understanding of the problems faced by marginalised users in India.


“There are so many ways by which caste, gender, and religious discrimination becomes normal to Facebook India that we need an independent audit of the platform’s operations. From its hiring, to its content moderation pipelines, advertising, and its work related to elections, the company must allow audit of its operations in the Indian market. This can lead to an honest dialogue on the harm done to our communities in this market,” she said.

Edited excerpts from an interview:

The report says that “Facebook India has become a critical platform for building community and seeking new audiences”. The community referred to here is the majority of people in India who are marginalised—Dalit, Bahujan and Adivasi people, religious minorities and other oppressed categories. How vulnerable is this community on a platform like Facebook, which has close to 294 million accounts in India?

Facebook has failed its caste, gender, and religious minority users. Going by its own community standards, it has failed to prevent the normalisation of hate speech and disinformation on its platform. In fact, it has done the opposite: it has used its business model to make hate lucrative to profit off the normalisation of violent hate speech.

Dalit women were the canaries in the coal mine, some of the first people to be targeted by the disinformation apparatus. Many in our communities were the first to face account bans and doxxing campaigns. As we built an advocacy relationship with Facebook to better understand how so many problems could be endemic to the platform, we also uncovered a basic lack of cultural understanding of our issues.

We also realised that we cannot even begin to address the harm until there is an audit of it, because the problem is so large. Our findings were akin to those of the social media accountability campaigns led by civil rights groups in the US.

There are so many ways by which caste, gender, and religious discrimination becomes normal to Facebook India that we need an independent audit of the platform’s operations. From its hiring, to its content moderation pipelines, advertising, and its work related to elections, the company must allow audit of its operations in the Indian market. This can lead to an honest dialogue on the harm done to our communities in this market.

Facebook is now a platform which creates “powerful opportunities for dialogue, engagement and global connection” for vulnerable sections in India, says the report. On the other hand, it is also a global corporate giant which has market interests in India. How difficult is it for marginalised communities to navigate and negotiate terms of online safety when there is a conflict of interest between Facebook’s interests and the aspirations of vulnerable groups?

This is the contradiction. For many, Facebook is the de facto internet and it is their place for news and community. People use it like a public square. While it might feel like a communal and collective platform, the reality is that it is a space under corporate surveillance where WE are the product.

Our use of this platform—even our experiences of violence on the platform—helps Facebook make money. As we are both the users and the product, we have every right to demand that Facebook fulfil its basic community standards. Indian users have good leverage to make this demand because we are the largest market (and still growing) for both Facebook and WhatsApp.

Everywhere in the global north (developed countries), communities are demanding that social media platforms act against disinformation and hate speech. Indians have this right too, particularly since Facebook contributed to the problem of polarisation. In 2013, Facebook had proof that content on their platform could lead to large-scale communal violence. At that point, they should have paused and conducted a human rights assessment, as recommended by the UN Guiding Principles on Business and Human Rights. Instead, going by a pay-to-play model, with little insight into the volatile nature of Indian politics, they supported one party, the Bharatiya Janata Party. How strange of them to think this would not have ramifications. Would they have supported one party over another in any other global market? If they had, there would have been outrage. The damage done by this engagement is felt to this day. Without a proper assessment, we will not know the scope of the harm done.

Indians deserve to know the scope and scale of Facebook’s engagement in the country.

Equality Labs has been advocating a “human rights audit” of Facebook India so that civil society gets access to “effectively track and contribute to mitigating hate speech”. The report suggests the same. When your work started at Equality Labs, did you have enough resources to take on a platform like Facebook, which has opaque community standards and hiring practices? What were the challenges you faced while trying to drive home the point that Facebook should make the platform safe for marginalised communities?

The human rights audit is the bare minimum Facebook must do to address the harm done to our communities on the platform. Already, our communities have faced physical and online violence.


This has not been an easy battle, because Facebook is a large corporation that has continually minimised its engagement with civil society, particularly with Indian journalists and with institutions run by minority communities. An early challenge was simply getting a seat at the table so that dialogue could begin. Beyond that, as an American company, Facebook prioritises the safety of its American and European markets because they generate more advertising revenue. This is so even though the future of Facebook is in the global south, and the Indian market is a critical component of that future.

Nevertheless, because many of our counterparts in the global north and the global south (developing nations) stood by us in holding Facebook accountable, we were finally able to build a compelling advocacy pipeline and research methodology which could document what many of our communities already know through our experience on the platform.

Working with colleagues around the world, we were able to use the methodology earlier used by the Next Billion Network to document the failures of moderation on the platform. This not only gave us data for India but also allowed us to compare our findings with those from other countries. That report is forthcoming. But suffice it to say, this is not just an Indian problem but a global one.
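To make this kind of documentation concrete, here is a minimal sketch, in Python, of how reported posts and their moderation outcomes might be tabulated to compare removal rates across countries and targeted communities. The records, field names, and figures below are hypothetical illustrations, not data from the Equality Labs or Next Billion Network studies.

```python
from collections import defaultdict

# Hypothetical records: (country, targeted community, removed after report?)
reports = [
    ("India", "caste", False),
    ("India", "caste", False),
    ("India", "religion", True),
    ("Myanmar", "religion", False),
    ("US", "race", True),
]

def removal_rates(records):
    """Share of reported posts that were removed, per (country, community)."""
    totals = defaultdict(int)
    removed = defaultdict(int)
    for country, community, was_removed in records:
        totals[(country, community)] += 1
        if was_removed:
            removed[(country, community)] += 1
    return {key: removed[key] / totals[key] for key in totals}

for (country, community), rate in sorted(removal_rates(reports).items()):
    print(f"{country:10s} {community:10s} removal rate: {rate:.0%}")
```

Comparing such rates across markets is one simple way civil society groups can turn individual experiences of failed moderation into auditable evidence.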

Recently, you became a target of hate speech when a photograph of Twitter CEO Jack Dorsey holding a poster with the line “Smash Brahmanical Patriarchy” went viral. How different are Facebook and Twitter as platforms when it comes to networking among Dalit-Bahujan-Adivasi-religious minority communities? Are the concerns raised by the report applicable to other social media platforms like Twitter?

All platforms could use more cultural competency when it comes to caste, religious minorities, and gender minorities, as they all host record levels of disinformation and hate speech. The problem across the board is the Silicon Valley ethos, exemplified by Mark Zuckerberg’s motto: ‘move fast and break things’.

This attitude does not work when entering volatile democracies that their engineering teams do not understand, particularly when their development teams are also made up of Savarna engineers who are eager to downplay the problem. Things will not change unless we take them to task for making money off the violence that has been polarising democracies around the world.

But we have seen some wins across the board. YouTube, for example, has, for the first time, included caste as a protected category in their hate speech guidelines. Twitter has been very open to working on these issues as well.

The report recommends an audit of “Facebook’s election and government unit’s work from elections 2014 to 2019”. This was the period when the BJP came to power with a thumping majority in India. Did the rise of a Hindu nationalist party adversely affect advocacy and human rights initiatives for marginalised sections on Facebook?

The Indian public deserves to know about Facebook’s operations during the 2014 elections. Facebook’s Sheryl Sandberg has waxed eloquent about Indian Prime Minister Narendra Modi. I also have a healthy level of scepticism when it comes to Facebook’s Katie Harbath, the company’s Public Policy Director for Global Elections. She was a campaigner for Rudy Giuliani in New York City’s mayoral elections, and later served as chief digital strategist at the National Republican Senatorial Committee. When you hire someone so partisan to work on elections globally, how will her values shape that engagement? I’m concerned about how her track record has influenced her work on elections around the world.

In India, the company helped develop the online presence of Prime Minister Narendra Modi, who now has more Facebook followers than any other world leader. In the Philippines, it trained the campaign of Rodrigo Duterte, known for encouraging extrajudicial killings, in how to most effectively use the platform. According to campaign staff in Germany, it helped the anti-immigrant Alternative for Germany party (AfD) win its first Bundestag seats.

An audit that explains exactly how they were embedded in the 2014 Lok Sabha election campaign in India—what services they offered, how much money was spent, and which accounts were created during the campaign—would be a starting point and a commitment to transparency.

Finally, I also think we need to push all platforms on their position on ‘notability’, under which a politician who openly uses hate speech is allowed to retain their content online because they are notable. This may sound good in the abstract, but let us place it in the context of Nazi Germany. Is Facebook saying that in Hitler’s time, it would have allowed Hitler’s antisemitic content? In that context, at what point would ‘notability’ fail? Before or after the Jews were sent to the gas chambers? These are questions we need to ask, because the platforms’ refusal to limit hate speech by ‘notable’ figures has consequences for us. And as users, we have a right to demand a response, as our safety and our democracies are at risk.

As social media platforms and successive governments pose challenges to vulnerable groups whose online presence is growing, do you think research in the area of hate speech is lacking? Why did you decide to spend resources on this report?

Unfortunately, our anecdotal experiences of violence do not mean anything to Facebook shareholders. We felt that the lack of data, and the lack of awareness among Indian users about their rights on the platform as users and citizens, is a crucial reason this problem persists. We felt that if more people knew about these problems, they could work together to hold Facebook accountable.

A report like this definitely helps to send a message that Indian users want a more responsible engagement from Facebook. At the bare minimum, Facebook has to maintain its commitment to its user guidelines. If it does not, then it is negligent and we have a case to raise around its impact on our communities.

Is it difficult to statistically prove that verbal violence which targets historically marginalised groups, gender queer people and religious minorities exists on social media platforms? For an intersectional platform like Equality Labs, is it difficult to translate the lived experiences of marginalisation to social media giants?

I don’t believe it is difficult. While ours was a qualitative study, a Facebook search for any slur against Dalits, Bahujans, Adivasis, Muslims, Christians or Ravidassias will yield so much content that you won’t know what to do with it. At least in the global north, extremists find cryptic ways to use the N-word or antisemitic slurs. Casteist and religious-minority slurs are out in the open because of the lack of competency and diversity in Facebook’s content moderation pipelines.

This should be rectified immediately, and with transparency. Facebook cannot solve this with a bunker mindset. It needs to work with civil society groups from minority communities to co-design solutions to this problem, as it is clear it cannot solve it on its own.
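One concrete way the competency gap described above shows up is in keyword-based moderation: a filter can only flag terms its wordlist anticipates. The sketch below is a minimal illustration, using placeholder tokens rather than actual slurs and an invented lexicon name; it is not Facebook’s actual moderation system.

```python
# A minimal illustration (placeholder tokens, hypothetical lexicon) of why
# wordlist coverage matters: a lexicon built without caste competency
# silently passes casteist abuse it was never taught to recognise.
ENGLISH_CENTRIC_LEXICON = {"<racial-slur>", "<antisemitic-slur>"}

def is_flagged(post: str, lexicon: set) -> bool:
    """Flag a post if any lexicon term appears in its lowercased text."""
    text = post.lower()
    return any(term in text for term in lexicon)

posts = [
    "example post containing <racial-slur>",    # flagged: term is in the lexicon
    "example post containing <casteist-slur>",  # passes: term was never added
]

for post in posts:
    print(is_flagged(post, ENGLISH_CENTRIC_LEXICON), "-", post)
```

Production systems use trained classifiers rather than plain wordlists, but the same coverage problem applies: a model whose training data and moderator guidelines omit casteist abuse cannot recognise it.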

What is the way ahead for marginalised communities who want to build solidarity networks on platforms which are not yet diversified or equipped to include them in knowledge creation and dissemination?

Caste, religious, and gender queer minorities who are being attacked on platforms should first and foremost believe in themselves. They should understand that disinformation and casteist and extremist trolling are structural failures, not the consequences of individuals’ actions. The fact that the platforms did not ensure our safety is not our fault. It is theirs.

So many people take the attacks personally, and the violence triggers a cycle of trauma. We have to continue to support each other and understand that when platforms deny the problem, they gaslight not just us but millions of users.

We need to, as Ambedkar said, educate ourselves about the problem, agitate, and organise for accountability. We should also continue to build our caste and tech equity power. Let us invest in developers, innovators and creators who help us build our own platforms, ones that are rooted in our power, where we can create new models for moderation, expression, and assembly.

There are many ways forward; we just need to hold fast and not accept violence as the only option.
