In the wake of news that Facebook disabled more than 60 accounts of Tunisian users, we and our partners wrote Facebook a private letter on June 4, 2020 demanding transparency regarding what happened. We wanted to know: Why did Facebook permanently disable the accounts? How many accounts did they disable in total? Why didn’t they notify the affected users? And how do we make sure that whatever happened in this case doesn’t happen again?
Facebook has not answered our questions fully and publicly, and today we are publishing that letter, both to be transparent and to underscore the seriousness of this issue for our communities. Millions of Tunisians rely on Facebook as a platform for democratic discourse, and increasing transparency is crucial to protect free expression and the health of Tunisia’s democracy.
As we explained in our previous post, while we were speculating about the reasons behind Facebook’s mass disabling of accounts, news emerged from the Atlantic Council’s Digital Forensic Research Lab — and simultaneously from Facebook itself — that a Tunisia-based company, UReputation, had been carrying out “Operation Carthage,” a shady and sophisticated disinformation campaign designed to influence elections, including Tunisia’s 2019 presidential election and elections in a number of other African countries.
Facebook reported that as a result of the disinformation campaign, it has removed hundreds of Facebook pages, accounts, and groups, as well as accounts on Instagram, for violating the company’s policy against foreign interference, which the company defines as coordinated inauthentic behavior on behalf of a foreign entity.
What we want to know is whether the accounts Facebook disabled were connected to the “Operation Carthage” clean-up, how many profiles were affected by the “technical error” the company admitted to in a statement to The Guardian, and what criteria it used when it disabled these accounts and removed the pages.
In asking for additional information, we want to remind Facebook of its responsibilities and the commitments it has made to transparency and freedom of expression in the Middle East and North Africa region, as we outlined in our letter. Once again, we urge the company to:
- Provide adequate notice to all users as to what rule they violated and offer remedy to users with the opportunity to appeal decisions;
- Provide transparency on election campaigns and political ads for countries in MENA, as per other regions; and
- Provide information on its process for users to appeal a decision to disable an account.
Read on below for our joint letter to Facebook, to which we await a response:
June 4, 2020
Re: Disabled Facebook accounts of Tunisia-based users
On 29 May 2020, around 60 Facebook accounts belonging to users based in Tunisia were disabled without any prior notice from the platform. The users were informed only that they “are not eligible” to use Facebook and that the decision was final, without any further details or explanation.
A trusted Facebook partner has so far been able to recover some of the accounts, and we are still waiting for more to be recovered. Less than 24 hours after the removal, 20 accounts reappeared, without any message or notification from Facebook. Some users reported to us that even though their personal accounts were recovered, the pages they administered were not.
Some of the affected accounts belonged to activists, bloggers, influencers, and journalists, but most were ordinary personal accounts with no large following. Some accounts used pseudonyms, but most were under real names. Some users reported that their Instagram accounts were disabled as well. Most of the affected accounts are based in Tunisia.
While Mark Zuckerberg has insisted that Facebook’s top priority is free expression, removing accounts with no explanation is hardly in line with that purported commitment. In fact, it reinforces the widely-shared perception that Facebook is only committed to honoring the rights of users in the United States and Europe. Furthermore, myriad experts, including the UN Special Rapporteur on Freedom of Expression, have made it clear that Facebook must provide due process when removing content and accounts, including an explanation of why the content was removed or accounts were disabled.
In 2018, after a global advocacy campaign, Facebook made a public commitment to provide notice to every user, as well as to provide users with the ability to appeal content moderation decisions in most instances. Specifically, the company stated that they would notify users that “the content in question has violated our Community Standards” and said: “We also identify for users the specific piece of content that violates our standards.” By failing to do so now, Facebook is not living up to its own commitments.
We, the undersigned organizations, ask that Facebook make a commitment to transparency and clarity, especially in regions like MENA where it has consistently chosen to under-resource its operations. Specifically, we demand that Facebook respect its existing commitments to:
- Provide adequate notice to users as to what rule they violated
- Provide remedy to users to appeal decisions
Furthermore, we request that Facebook communicate the following by 12 June 2020:
1. Clarify the reason for disabling these accounts.
2. Specify the exact number of Facebook accounts affected by this action.
3. Explain when the remaining accounts will be restored or, barring restoration, what rules they violated.
4. Indicate when Facebook will communicate publicly about what happened and send an explanation to the affected users.
5. Assess the risk that this could happen again.
6. Describe any process that would help prevent this incident from recurring, and that we can help users implement.
We urge Facebook to consider taking the aforementioned steps to reinforce its commitment towards transparency and to strengthen its user community in the MENA region.
Dangerous Speech Project
Digital Citizenship Organization
Electronic Frontier Foundation
I Watch Organization
Ranking Digital Rights
Wikimedia/Yale Law School Initiative on Intermediaries and Information