Artificial Intelligence (AI) is rapidly reshaping social, economic, and political life, but its benefits and risks are unevenly distributed. The 2025 UNDP Human Development Report emphasizes the importance of understanding diverse perspectives, particularly those of populations often underrepresented in global AI discussions. In Bangladesh, the UNDP Accelerator Lab conducted a qualitative study to explore how different communities perceive, use, and experience AI, with a focus on the divide between digitally informed and uninformed users.
The study engaged 240 participants through 32 focus groups and 20 in-depth interviews, including social media users, content creators, mobile financial service users, people with disabilities, marginalized communities, farmers, artists, entrepreneurs, and women. Findings revealed wide gaps in AI awareness and literacy. Younger, tech-savvy participants used AI for productivity, creative work, and livelihood enhancement, while older or marginalized groups experienced AI mostly passively, which limited their access and increased their vulnerability to misinformation, fraud, and exclusion.
Participants commonly framed AI through platforms such as Facebook, YouTube, TikTok, and bKash, voicing frustration with opaque algorithms, biased content moderation, and automated systems that undermined their trust and sense of control. Concerns about privacy, surveillance, and digital rights were prevalent, especially among younger users, while older or less digitally literate participants were often unaware of privacy risks. Gendered impacts were significant: women were disproportionately affected by AI-enabled harassment and deepfakes, leading many to self-censor online.
Despite these challenges, participants expressed optimism when AI tools were presented in practical, relatable contexts such as farming, translation, or voice navigation. Marginalized and disabled groups showed curiosity and willingness to engage when provided with accessible, culturally relevant tools and guidance. The study highlights the need for inclusive AI literacy programs, clear privacy policies in local languages, culturally sensitive content moderation, and participatory approaches to AI development.
Ultimately, the research underscores that AI’s potential to empower all segments of society in Bangladesh depends on equitable, transparent, and culturally grounded design. Policymakers, developers, and civil society actors must prioritize lived experiences and foster trust to ensure that AI serves as a tool for inclusive development rather than a driver of further inequality.