Le Monde
14 Nov 2023

At the beginning of November, the major social media companies that hadn't already done so submitted their transparency reports for 2023, required for the first time by the European Union (EU) under the Digital Services Act (DSA), which came into effect at the end of August. In addition to general statistics on their services, such as the number of users, these documents give unprecedented insight into the resources dedicated by Facebook, Snapchat, TikTok, and others to the moderation of illegal, hateful, or fraudulent content.

These reports should be read, and compared with one another, with caution: methodologies differ from one company to the next, the observation periods do not always match the six months required by the EU, and, above all, a great deal of information is missing, with Snapchat's and YouTube's reports containing the most omissions.

Moderator headcount can double or even triple

These files do, however, shed some light on the number of moderators responsible for deleting problematic posts. A comparison of the figures shows that TikTok, whose parent company is Chinese, claims the largest moderation team: Four times more moderators than Meta, the parent company of Facebook and Instagram, even though Meta has the most users in Europe.

How can this discrepancy be explained? In its transparency report, the American company Meta specifies that its workforce outside the EU includes additional moderators who, in the event of a spike in activity in the region, can step in to moderate content in English, French, or Spanish.

Meta also highlights the efficiency of its artificial intelligence (AI) moderation tools, which TikTok does not emphasize. The rate of moderation automation is very high at Meta: At Facebook and Instagram, respectively, 94% and 98% of decisions are made by machines – far more than the 45% reported by TikTok.

However, AI does not always match the effectiveness of human moderation, as demonstrated by the revelations of the "Facebook Files," internal Meta documents released in 2021 by whistleblower Frances Haugen. In its transparency report, LinkedIn is one of the few platforms to publish the error rate of its automatic content-deletion tools: an estimated 10% for English-language publications, 30% for German, 37% for French, and 80% for Spanish.

The failings of X

For their part, Snapchat and X have chosen not to disclose their moderation staffing levels for Europe alone, giving global figures instead. Yet even at that scale, their teams are skeletal.
