Governments have introduced bans barring users under 16 from popular social media platforms such as TikTok, Instagram, and YouTube, aiming to protect children from online abuse, cyberbullying, exploitation, and exposure to harmful content. These measures are intended to safeguard young people’s mental health and well-being, but UNICEF has warned that restrictions alone are not enough to ensure safety.
UNICEF cautions that such bans could backfire, particularly for isolated or marginalized children who rely on social media for learning, connection, play, and self-expression. Many children may still access platforms through workarounds, shared devices, or less regulated apps, which could expose them to even greater risks. Age restrictions must therefore be part of a broader strategy that protects children from harm while respecting their rights to privacy, participation, and access to safe online spaces.
The UN human rights chief, Volker Türk, highlighted that social media platforms were launched without adequate human rights impact assessments and emphasized that regulations should not replace companies’ responsibilities to improve platform design and content moderation. UNICEF urges governments, regulators, and tech companies to collaborate with children and families to create a digital environment that is safe, inclusive, and centered on child well-being.
The agency also recommends supporting parents and caregivers in building digital literacy, as they are often expected to monitor complex platforms and algorithms without the necessary tools or knowledge. Similar measures are being considered or implemented globally, including legislation in the US state of California and proposals in the European Union. Both UNICEF and UN human rights authorities stress the need to continually monitor the effectiveness of age restrictions while ensuring that the best interests of the child remain the guiding principle in online safety policies.