A Landmark Step for Children’s Protection Online
- Aug 15
The children’s safety provisions of the Online Safety Act have come into effect, imposing significant new duties on social media platforms and search engines operating in the United Kingdom. Platforms accessible by children must now take appropriate measures to protect them from harmful content, including implementing highly effective age checks and adjusting systems that recommend content.
These protections were hard-won and will play a crucial role in keeping children safe. The Act is the culmination of over a decade of campaigning by parents, children, civil society, academics, and parliamentarians. It ensures children are protected from:
Pornography: By age 11, 27% of children have seen pornography, with the average age of first exposure at just 13. Early exposure can shape harmful attitudes about consent and violence. Platforms are now required to prevent children from accessing such material.
Predatory contact: In 2023, 31% of children aged 9-16 reported that strangers had tried to contact them online. The Act now mandates platforms to block unknown adults from contacting children via direct messages, stop recommending children’s accounts to strangers, and turn off location sharing by default for minors.
Self-harm, pro-suicide, and eating disorder content: Research shows that children’s accounts are often exposed to harmful content within minutes of signing up. The Act requires platforms to filter such content out of children’s feeds and recommendations.
Family support and accountability: Families now have clearer pathways to obtain information from platforms when children are harmed, and tech companies are required to cooperate with coroners investigating child deaths linked to online harms.
These protections are critical. They address a broad range of harms and establish accountability, ensuring that children’s right to be safe online is recognised and enforced. The Act also requires that age verification methods are privacy-preserving, proportionate, and compliant with UK GDPR, so children’s personal information is safeguarded in the process.
However, if we are serious about tackling the crisis in young people’s mental health, we must go further and address the algorithms that make social media so addictive.
