Meta is deflecting accountability in India, and perpetuating human rights harms, by refusing to disclose findings from the independent Human Rights Impact Assessment (HRIA).
Meta’s first-ever human rights report devotes a mere four pages to Facebook’s impact on India, most of which are descriptive rather than substantive. The report shifts blame to third parties and emphasizes that Meta does not agree with the assessment’s findings. It omits the assessor’s analysis, conclusions, and recommendations, and Meta makes no commitments on next steps. In private briefings, Meta has made clear that it will not release any further information, wholly rejecting civil society’s calls to release the full report.
HRIAs are supposed to show that a company is seriously committed to understanding and addressing its impacts on human rights. Investors also look to HRIAs to evaluate whether the companies in their portfolios are meeting their responsibility to respect human rights. Under the U.N. Guiding Principles on Business and Human Rights, Meta has a duty to “provide information that is sufficient to evaluate the adequacy of an enterprise’s response to the particular human rights impact involved” – one that it is failing to fulfill.
In late 2019, Meta commissioned Foley Hoag LLP to study the potential human rights risks in India; 40 civil society stakeholders, academics, and journalists were interviewed in the process.
“Meta’s refusal to disclose the full findings of the India HRIA demonstrates its disregard for the effect its platforms have on people’s fundamental rights and shows contempt for the stakeholders who have consistently raised these concerns and participated in the HRIA process,” said Raman Jit Singh Chima, Asia Pacific Policy Director and Senior International Counsel at Access Now. “Unless Meta comes forward with the findings, the human rights report will rightly be interpreted as an effort to evade responsibility by appearing to do something, without doing anything meaningful at all.”
The decision to hide the assessment from the public is at odds with Meta’s own approach. It previously released a 62-page report on its HRIA on Myanmar, and summaries of HRIAs on other countries in Asia that include analysis and recommendations. Earlier this year, it also released a detailed report, running over 100 pages, on the HRIA linked to Meta’s expansion of end-to-end encryption.
“The very purpose of conducting an HRIA – which is to identify harms, enable scrutiny, bolster transparency and accountability, and commit to change – is defeated by Meta’s decision to suppress findings,” said Namrata Maheshwari, Asia Pacific Policy Counsel at Access Now. “This will erode trust in Meta’s platforms, especially in the context of whistleblowers coming forward with revelations that Meta is aware of fake accounts and the capacity of algorithms to manipulate discourse and contribute to incitement of violence, but is failing to take necessary and timely actions to stop it.”
Conducting the HRIA was the beginning of the process of bringing about meaningful change to protect human rights, not the end. Meta must follow up the human rights report with full disclosure on India and concrete action to address its impacts.
In addition to hiding the results of the India HRIA, Meta’s first human rights report does not demonstrate support among senior-level management for human rights commitments. “We have previously asked Meta leadership to show their commitment to human rights, such as by heeding shareholders’ call to appoint a board member with human rights expertise, but we are not seeing that kind of action,” said Isedua Oribhabor, Business and Human Rights Lead at Access Now. “Respect for human rights needs to start at the top.”