Amnesty International has warned that Bangladesh could face heightened human rights risks unless Meta takes prompt and effective action against harmful content on its Facebook platform. Ahead of the 12 February parliamentary elections, Amnesty and other observers documented a surge in misleading and inflammatory content online, some of it originating from outside Bangladesh, particularly India. This content targeted political parties and minority communities, amplifying sectarian narratives and exaggerating divisions, and heightening the risk of tensions, discrimination, and violence in the country.
Incidents in the lead-up to the elections, including attacks on media outlets, highlight the real-world impact of online incitement. On 18 December 2025, mobs attacked the offices of The Daily Star and Prothom Alo after months of threats circulated on social media portraying the outlets as “Indian agents” and “anti-national forces.” Investigations by The Daily Star and the local fact-checking group Dismislab found a direct link between the online incitement and the violence, and Bangladeshi authorities warned Meta about its delays in addressing content that called for attacks, expressing concern over public security and the safety of minorities.
Amnesty International warns that such online harms are not isolated and can rapidly spill into real-world violence. Divisive content, including misinformation about sectarian conflicts, has previously fueled fear and tension among minority communities, often originating from cross-border sources. Alia Al Ghussain, Head of Big Tech Accountability at Amnesty, emphasized that social media companies wield significant power over public perception and must act preventively to avert escalation. Bangladesh stands at a critical juncture, and timely intervention by Meta could help prevent further unrest.
The organization noted that Meta’s surveillance-based business model, which prioritizes engagement, incentivizes the amplification of sensational and polarizing content. Even lawful content can pose human rights risks when it is algorithmically promoted, since promotion increases its reach and its potential for real-world harm. Amnesty highlighted that emergency mitigation measures, sometimes called “break the glass” protocols, are necessary in high-risk contexts such as Bangladesh to prevent such harm.
Ahead of the elections, Amnesty asked Meta to explain what steps it would take to protect vulnerable groups, particularly minorities, from cross-border harmful content. Meta was unable to respond within the two-week timeframe provided. Amnesty stressed that, under international human rights standards, companies have a responsibility to prevent and mitigate human rights harms linked to their operations, which requires proactive assessment, transparency, and effective mitigation measures.
Amnesty has also requested data from Meta on harmful content targeting minority groups, enforcement actions, Bangla-language moderation capacity, and the implementation of emergency safeguards ahead of elections. These concerns arise against the backdrop of political instability, including the July 2024 student-led protests that forced former Prime Minister Sheikh Hasina to flee to India, her subsequent death sentence in absentia for crimes against humanity, and strained diplomatic relations between Bangladesh and India.