
Transparency required: is Facebook’s effort to clean up “Operation Carthage” damaging free expression in Tunisia?


Facebook may be a global company, but it doesn’t treat all of its users the same way. In Tunisia, where Facebook is the dominant social media platform, at least 60 accounts were mysteriously disabled at the end of May. Now, in the shadow of “Operation Carthage,” a Cambridge Analytica-style scandal that has revealed attempts to leverage the platform to influence presidential elections in Tunisia and countries in Africa, Facebook is remaining tight-lipped about what happened. We and our partners are demanding answers.

Accounts are abruptly disabled with no notice or explanation

On 29 May 2020, a number of Tunisian Facebook users reported that their accounts were disabled without warning or explanation. All they got was a message informing them that they “are not eligible” to use Facebook and that the decision was final.

Local Tunisian NGOs compiled a list of over 60 disabled accounts. Some belonged to Tunisian influencers, journalists, and activists, including Haythem El Mekki, Bayrem Kilani (Bendirman), and Sarah Ben Hamadi. But many of the accounts were private, belonging to people who did not have a large following. Some people reported that their Instagram accounts and the Facebook pages they administered were disabled as well.

After days of silence despite the complaints, Facebook commented to The Guardian, stating that “due to a technical error we recently removed a small number of profiles, which have now been restored.” Some accounts were indeed restored: thanks to the efforts of IWatch, a local trusted Facebook partner, and Access Now’s Digital Security Helpline, 28 accounts were reinstated. Yet to date, Facebook has not explained its decision any further, nor has it communicated directly with any of the users impacted, including those whose accounts have been restored.

“Operation Carthage” and collateral damage 

While we continue to speculate about why Facebook made its sudden and opaque decisions, an investigation by the Atlantic Council’s Digital Forensic Research Lab, published on 5 June, revealed that a Tunisia-based digital communications firm called UReputation has been conducting sophisticated disinformation campaigns on social media platforms designed to influence presidential elections. These campaigns include efforts to influence Tunisia’s 2019 presidential election, support the reelection of Togo President Faure Gnassingbé in February 2020, and boost former President Henri Konan Bédié’s campaign for the upcoming October 2020 election in Côte d’Ivoire.

On the same day that investigation was published, Facebook announced its own. The company said that it removed more than 900 “assets” affiliated with UReputation, including 182 user accounts, 446 pages, and 96 groups on Facebook, and 209 accounts on Instagram. They were disabled for violating the company’s policy against foreign interference, which the company defines as coordinated inauthentic behavior on behalf of a foreign entity.

We want to know: were any of the 60 disabled accounts identified by civil society linked to Operation Carthage? 

In the absence of any direct explanation from Facebook, we can only speculate that these accounts may in fact be collateral damage from the Operation Carthage clean-up, since Facebook’s report on its investigation makes clear that the company routinely uses “automated systems” to detect and disable “fake accounts.”

Lack of transparency harms free expression and puts democracies at risk

It’s still not clear what happened. What is certain, however, is that opaque decisions harm people’s right to freedom of expression and access to information. In some cases, a decision about content can also impact livelihoods. It is crucial to highlight that what Facebook characterizes as a simple “technical error” has real consequences for real people, some of whom reported loss of access to their work, art, or association’s pages. Others lost their personal photos, such as photos of their children and deceased loved ones. Many reported that removal of their accounts was so damaging that they are now reluctant to start over, as they have lost trust in the platform.

One Tunisian visual artist and illustrator told Access Now: “I’m using my Facebook and my Instagram to share my art and keep my followers updated about my art project. I’ve been active as an artist since 2000 and I always use social media to promote my work and to keep in touch with my followers. Disabling my Facebook and IG accounts as an artist is not acceptable, especially after reading the investigation that made me understand that I have no relation to what happened!”

Yet another Facebook failure: insufficient transparency or accountability in MENA

This incident is another reminder of the problems associated with relying on potentially biased algorithms and other forms of automation for content moderation, especially in places where Facebook has not invested sufficiently in localization or staffing. In September 2019, Facebook’s operation against “coordinated inauthentic behavior” in Egypt resulted in disabling the accounts of activist Hend Nafea, an outspoken artist known as Ganzeer, and journalist Ahmad Hasan al-Sharqawi.

Just in the past few days, Syrian activists launched a campaign, #FBFightsSyrianRevolution or in Arabic #فيسبوك_يحارب_الثورة_السورية, to denounce Facebook’s decision to take down or disable, without providing notice, thousands of anti-Assad accounts and pages that have documented war crimes since 2011. Once again, Facebook has not told the public how many accounts were disabled or why they were removed. According to the activists behind the campaign, Facebook appears to have banned mention of specific terms associated with the Syrian revolution, such as the names of militia groups, on the pretext of removing terrorist content. The terms Facebook bans are often drawn from lists compiled by the U.S. government or other international entities, and the rules are applied without evident regard for the linguistic, cultural, and political contexts of the content.

In 2018, Facebook responded to large-scale global advocacy by committing to provide notice to all users before disabling their accounts. By disabling accounts without notice, Facebook is violating its existing commitment to provide adequate notice, as outlined in the Santa Clara Principles.

All of this shows that the company has fallen short of its commitments to improve transparency and accountability in the MENA region. There are particular problems with the way Facebook is handling Operation Carthage and what we presume was an associated disabling of accounts, and they expose the damaging disparity in how Facebook treats some users compared to others.

For one, Facebook recently announced that it will combat inauthentic behavior on its platform in the U.S. by verifying the identities of people and page administrators with large followings. To our knowledge, that did not happen in Tunisia or other countries in MENA. Not only did Facebook disable accounts without warning, but several Tunisian users who used Facebook’s support page to complain and provided their IDs to verify their identities were nevertheless denied their appeals.

Another example of this disparity is Facebook’s Ad Library, which gives the public data about who is spending money on ads about social issues, elections, or politics in a number of countries. The company has not included any MENA country except Israel.

It’s time for Facebook to step up

It was precisely in anticipation of shady campaigns like Operation Carthage that Access Now, together with 14 Tunisian civil society organizations, asked Facebook in an open letter to implement effective transparency and accountability measures ahead of the 2019 Tunisian presidential elections. We specifically asked Facebook to make public the identity and location of political ad sponsors, as well as the amount spent on campaign ads.

Facebook did not respond to us, nor did it implement any of our requests. The news breaking about Operation Carthage, and the collateral damage the clean-up has presumably caused ordinary users, is another opportunity for us to remind Facebook of its responsibilities. The company’s commitment to transparency and freedom of expression cannot be only a public relations exercise, with a bare minimum explanation due only to readers of The Guardian. It must be meaningful and extend to the MENA region and Africa, which the company has consistently chosen to overlook and under-resource.