
Meta’s oversight board calls on the company to lift the prohibition on the Arabic term ‘shaheed’

Meta’s Oversight Board has advised the company to lift its blanket ban on the Arabic term “shaheed,” which means “martyr” in English, citing concerns about freedom of expression. After a year-long review, the board found that Meta’s approach was overly broad and unfairly restricted the speech of numerous users.

Despite being funded by Meta, the Oversight Board operates independently, and it has proposed a more nuanced approach: Meta should remove posts containing the term “shaheed” only when they explicitly endorse violence or violate its other community standards.

This recommendation comes amid ongoing criticism of Meta’s content moderation policies, particularly those involving the Middle East. A study commissioned by Meta itself in 2021 highlighted the negative impact of its moderation practices on Palestinians and other Arabic-speaking users.

Criticism of Meta’s content moderation escalated after the Israel-Hamas war began last October. Rights organizations accused Meta of suppressing content sympathetic to Palestinians on Facebook and Instagram during a conflict that, following Hamas’ attacks on Israel, has resulted in thousands of deaths in Gaza.

The Oversight Board’s report echoed these concerns, finding that Meta’s rules on the term “shaheed” swept too broadly and frequently led to the removal of unrelated content.

In a statement, Helle Thorning-Schmidt, co-chair of the Oversight Board, criticized Meta’s reliance on censorship as a safety measure. She argued that censorship could marginalize entire populations without improving safety.

Under its current policy, Meta removes any post in which “shaheed” refers to a person or group covered by its “dangerous organizations and individuals” designation, which includes members of Islamist militant groups, drug cartels, and white supremacist organizations.

Hamas, which Meta designates as a dangerous organization, is a central focus of the debate over content moderation.

Meta began reevaluating its “shaheed” policy in 2020 but failed to reach an internal consensus. Seeking guidance, the company referred the question to the Oversight Board last year, noting that “shaheed” accounted for more content removals on its platforms than any other single term.

In response to the board’s recommendations, a Meta spokesperson said the company would review the feedback and respond within 60 days.
