Meta’s Oversight Board ruled that the company was right to leave a manipulated video on Facebook. The video misrepresented a protest in Serbia as a rally in the Netherlands held in support of former Philippine President Rodrigo Duterte, and it surfaced days after Duterte’s March 2025 extradition to the International Criminal Court in the Netherlands.
The manipulated video was built from footage of a protest in Serbia. Editors added audio, including chants of “Duterte” and the Tagalog song Bayan Ko, a common anthem at Filipino anti-martial-law protests in the 1980s. They also inserted captions reinforcing the altered narrative, suggesting the event endorsed Duterte amid his legal proceedings.
The video reached approximately 100,000 users and was shared by hundreds of accounts. Meta’s automated detection systems flagged it as potential misinformation, and the platform reduced its visibility for non-US users. Although the video was placed in the fact-checking queue, it was never reviewed because of the volume of similar posts awaiting attention.
Fact-checkers in the Philippines had previously examined comparable viral videos, rated them false, and labeled them accordingly. That precedent reflected ongoing efforts to counter deceptive content circulating in the region, particularly content involving political figures such as Duterte.
A separate Facebook user reported the manipulated video and appealed Meta’s decision to keep it online, bringing the case before the Oversight Board, an independent body that reviews Meta’s content-moderation decisions. The Board’s review weighed free expression against the prevention of misinformation.
The Oversight Board concurred with Meta’s decision to keep the video publicly available. However, it said Meta should have applied a “High-Risk” label, because the video contained digitally altered, photorealistic elements with a high risk of deceiving the public, especially during a significant public event tied to Duterte’s extradition.
The Board also identified shortcomings in how Meta prioritizes content for fact-checking. Videos of this kind, combining digital manipulation with political sensitivity, warranted expedited review. To close the gap, the Board recommended a dedicated fact-checking queue for content resembling posts already debunked in specific markets.
Further recommendations included equipping fact-checkers with enhanced tools. These tools would enable quicker identification of misleading viral media. Additionally, the Board advised Meta to provide more detailed descriptions of its manipulated-media labels. Clearer criteria would help users comprehend the reasoning behind visibility adjustments and risk assessments.
In January, Meta discontinued its fact-checking program in the United States. The company introduced Community Notes as a replacement, drawing from user-generated annotations to verify information. Meta is currently evaluating the expansion of Community Notes to additional countries and has solicited input from the Oversight Board on appropriate locations for implementation.