
Meta's oversight board has called for an end to the company's blanket ban on the Arabic term "shaheed," meaning "martyr" in English, citing concerns over freedom of expression. After a year-long review, the board found Meta's approach "overbroad," concluding it had unjustly curtailed the speech of countless users.
Operating independently despite its funding ties to Meta, the oversight board proposed a more nuanced approach. It recommended that the social media giant remove posts featuring the term "shaheed" only when they are directly linked to indications of violence or when they violate other Meta community standards.
This decision comes amidst longstanding criticism of Meta's content moderation policies, particularly regarding issues related to the Middle East. A 2021 study commissioned by Meta itself highlighted the adverse human rights impact of its practices on Palestinians and other Arabic-speaking users.
Criticism intensified after the conflict between Israel and Hamas escalated last October. Rights organisations accused Meta of stifling content sympathetic to Palestinians on Facebook and Instagram during a war that has claimed thousands of lives in Gaza, following Hamas' incursions into Israel.
Echoing these concerns, the Meta Oversight Board's report underscored the inadequacy of Meta's rules regarding the term "shaheed." It noted that the broad application of the ban often led to the removal of content unrelated to glorifying violent actions.
In a statement, Helle Thorning-Schmidt, co-chair of the Oversight Board, criticised Meta's reliance on censorship as a safety measure. "Meta has been operating under the assumption that censorship can and will improve safety, but the evidence suggests that censorship can marginalise whole populations while not improving safety at all," she remarked.
Currently, Meta removes any post featuring the term "shaheed" when it refers to people on its list of "dangerous organisations and individuals," which includes members of Islamist militant groups, drug cartels and white supremacist organisations. Hamas is among the entities Meta designates as "dangerous."
Meta began reassessing its "shaheed" policy in 2020 but failed to reach an internal consensus. Seeking guidance, the company turned to the Oversight Board last year, noting that "shaheed" was the single largest cause of content removals on its platforms.
Responding to the board's recommendations, a Meta spokesperson said the company would review the feedback and reply within 60 days.