Here is an unexpected mea culpa. Meta, parent company of Facebook and Instagram, admits to having been overzealous in its content moderation. Nick Clegg, the group's president of global affairs, acknowledged that error rates in content removal remain “much too high”, to the point of hindering the freedom of expression the company claims to defend.

“Too often, harmless content is removed or restricted, and too many people are unfairly penalized,” Clegg said at a press conference. This self-criticism comes as users of the group’s platforms, notably Threads, increasingly complain about arbitrary deletions of their posts.

The Covid-19 pandemic: an acknowledged excess of zeal

The most striking example of this over-moderation concerns the management of the Covid-19 crisis. Mark Zuckerberg, CEO of the group, recently admitted to the House Judiciary Committee that this strict policy had been influenced by pressure from the Biden administration.

“We had very strict rules removing huge volumes of content during the pandemic,” explains Nick Clegg. “No one knew how the pandemic would evolve, so it’s easy to judge with hindsight. But with this hindsight, we believe that we went a bit too far.”

Despite investments amounting to billions of dollars in moderation, Meta’s automated systems seem to err on the side of caution. Complaints about “moderation failures” recently trended on Threads, the group’s new platform. The company even had to issue a public apology after its systems deleted photos of President-elect Donald Trump surviving an assassination attempt. Meta’s Oversight Board has also sounded the alarm in the run-up to the US presidential election, warning against “excessive suppression of political speech”.

Faced with these criticisms, Meta seems ready to rethink its approach. Nick Clegg called the moderation rules a “living, breathing document”, suggesting that major changes could be coming. This shift comes as Mark Zuckerberg increases his contacts with political figures. The latest: a surprising dinner with Donald Trump, newly elected.

  • Meta publicly admits to having been too strict in its content moderation, particularly during the Covid-19 pandemic
  • The company’s automated systems too often remove innocuous content, affecting free speech
  • Major changes to moderation rules could be announced soon