Update: September 16 — Amazon responded to our letter, describing their approach to ongoing human rights evaluations.
September 6: Like it or not, Big Tech, regulation is coming! For better and for worse, more and more governments are establishing rules to govern how tech companies behave. As the tech sector responds to global geopolitical crises, from the war in Ukraine to the escalating situation in Myanmar, it is crucial for powerful companies like Amazon, Google, and Meta not only to respect human rights, but also to tighten their internal controls on how they respond to human rights issues. However, judging from the latest Big Tech Scorecard from Ranking Digital Rights (RDR), tech companies still have a long way to go to fulfill their human rights obligations.
The sixth edition of RDR’s Scorecard evaluated the performance of 14 global platform companies, examining how their policies and practices affect users’ rights to free expression and privacy. The key findings show that, while there have been some improvements, all the companies are falling short, with none of them earning a passing grade.
As we’ve done before, Access Now reached out to these companies to reinforce key recommendations from the Scorecard. Here are the recommendations we focused on for each company:
Publish more information about its processes for responding to government demands and private requests to censor content and to hand over user information.
Establish strong human rights governance and oversight across global operations. Amazon should disclose more information about its human rights due diligence, including whether it conducts human rights impact assessments of the risks to freedom of expression and privacy associated with the development and use of algorithms.
Be transparent about rules enforcement. Apple should publish more detailed data about how it enforces its own rules. The company’s transparency reports should include the number of apps removed from its App Store and specify which rules the apps violated. It should also include the number of ads restricted for violating its ad content and ad-targeting rules.
Be transparent about demands for content censorship and user data. While China’s political environment discourages companies from disclosing detailed information about government demands, Baidu could and should publish information about its compliance with private requests for content censorship and user data.
Publish more data on actions taken to restrict content and accounts. Google should disclose data on actions it takes to enforce its rules on its search, email, cloud, and digital assistant services. It should also publish additional data on how it enforces its ad content and targeting rules.
Publish more detailed data on content policy enforcement. Kakao should begin reporting service-by-service data about its content moderation. At present, the company aggregates these numbers across its many services, leaving the public with an overly general representation of its work in this area.
Be more transparent about government censorship demands. Meta should clarify its process for responding to government censorship demands targeting content and accounts of WhatsApp and Facebook Messenger users. It should expand the data it publishes about these types of demands by including a breakdown of the total quantity of demands per country, listing the number of accounts affected, and identifying the subject matter associated with those demands.
Improve policies for LinkedIn and Skype. Microsoft offers stronger, more human rights-protective policies for Bing, Outlook, and OneDrive than it does for LinkedIn and Skype. It should apply those stronger policies, particularly regarding third-party requests, to both platforms.
Commit to human rights due diligence across operations. Samsung should publish more information about its privacy impact assessments. It should expand them to consider freedom of expression and conduct similar assessments on its policy enforcement, targeted advertising practices, and algorithmic systems.
Publish data on content rule enforcement. Tencent should publish data about the volume of content it restricts for violating company rules, to complement the data it publishes about account restrictions.
Conduct human rights due diligence. Twitter should conduct human rights impact assessments to identify risks that its business operations and services may pose to freedom of expression, privacy, and the right to non-discrimination. The scope of these assessments should include targeted advertising policies and the development and use of algorithmic systems.
Be transparent about government demands to block content or hand over user information. VK should disclose its process for handling government demands to remove content and report data on the volume of such demands.
Make human rights due diligence procedures more thorough. Yahoo should reinstate its previous disclosures about its process for conducting long-form human rights impact assessments on laws, regulations, and its own policies when risks are detected.
Be transparent about content removals and account suspensions. Yandex should publish data showing how it responds to government requests to censor content and accounts and how it enforces its own rules to restrict content and accounts. This will be especially important as the Russian government becomes more inclined to censor online speech.
We commend U.S.-based company Yahoo for its timely response describing how it puts its human rights commitments into action. Ranked second by RDR this year, Yahoo also earned the top score for privacy, although it must continue to strengthen its human rights practices following its acquisition by Apollo Global Management.
VK, the Russia-based company formerly known as Mail.Ru, provided a brief response that pointed to its ESG reporting for information on how it handles government requests. However, VK needs to renew its human rights commitment, which appears to be less explicit than it was previously.
We also heard from Kakao, the South Korea-based mobile messaging platform, which provided some information on its internal processes for handling human rights issues, including its newly formed Tech for Good Committee and its Human Rights and Tech Ethics Team. Kakao earned the highest score for non-U.S.-based platforms, and has improved over time, though it still has work to do to fully address human rights risks.
The U.S.-based giant, Meta, addressed specific aspects of its performance, noting that it is following up on some of the evaluation’s criteria. At the same time, Meta must continue to increase transparency around government requests for censorship. You can read RDR’s response to Meta here.
Finally, we heard from the U.S.-based company, Microsoft, outlining its policies on privacy and free expression. For the first time, RDR evaluated LinkedIn and found that, although Microsoft is a leader in the tech field on some human rights issues, it has room to improve, particularly when it comes to its LinkedIn and Skype policies.
We ask the remaining nine companies to engage with civil society – do not ignore us. Increasing transparency and engaging with stakeholders are core requirements of the United Nations Guiding Principles on Business and Human Rights (UNGPs). Doing so can also help companies deliver on human rights and build trust among users and investors. Investors depend on transparent public disclosures to verify that companies are respecting human rights, and they will use their voting power to ensure it. This is how you can show people who depend on your services that you take your duty to respect their rights seriously, and demonstrate the action behind your words.