For over 25 years, we have worked with schools, families, industry and government to protect young people online and to ensure the internet is a place where children cannot be harmed. That is why we have joined other leading children’s organisations and experts, including the Molly Rose Foundation, NSPCC and 5Rights Foundation, to urge the Government not to pursue a blanket ban on social media for under-16s.
We agree on the problem. Too many children are exposed to preventable harm online. Parents are right to demand decisive action. For too long, technology companies have been allowed to externalise risk and responsibility while children and families suffer the consequences. Government must stand firmly on the side of young people and finally call time on this. But we do not believe a blanket social media ban is the solution.
An Easy but Flawed Solution
Bans are a blunt instrument; they risk creating the illusion of safety without addressing the root causes of harm: unsafe design, addictive algorithms, and a lack of accountability from platforms. Children would not disappear from the online world; they would simply migrate elsewhere. The risks would follow them, often into spaces that are less visible, less regulated and harder for parents and professionals to support.
A ban would also create a dangerous “cliff-edge”. At 16, young people would suddenly be thrust into high-risk environments with little preparation. Removing opportunities for gradual, supported engagement does not build resilience; it simply defers risk.
Crucially, social media is not just a source of harm. For many young people (especially those who are LGBTQ+, neurodiverse, isolated or marginalised) online spaces provide connection, identity, peer support and access to trusted help, including support services. We must be careful not to remove lifelines for those who may struggle to find them elsewhere. At the same time, we recognise the concern: young people can be exposed to harm across social media platforms, and it is often the most marginalised groups who suffer most. We therefore need online environments that preserve the benefits while reducing the specific risks these children face.
Do Not Remove Responsibility
Just as films and video games carry age ratings based on the risks they pose, social media platforms should be required to set minimum ages that reflect their design and impact. For young people over 13, platforms should be required to deliver effective age-appropriate experiences. This does not mean identity checks, but processes that can verify a user’s age whilst protecting their privacy. This means:
- Highly effective age assurance to enforce minimum age limits.
- Risk-based age thresholds, reflecting the different levels of harm posed by different platforms and features.
- Blocking or modifying high-risk functionalities for younger users.
- Decisions grounded in evidence, not convenience.
We also urgently need transparency and evidence to inform decisions. The Government commissioned an important review into online harm last year but has yet to publish its findings. Those findings should be released without delay so that policy decisions are informed by high-quality research rather than media headlines.
The recently announced consultation on wider views about a social media ban, alongside news of Ministers visiting Australia (which already has a ban in place) to assess its impact, should also be conducted transparently. From our discussions with experts in the field, we are not alone in thinking this is the wrong direction.
Prioritise Strengthening What We Already Have
Above all, we need a fundamental reset in our expectations of technology companies. Safety cannot be an afterthought, and ethical responsibilities and accountability should not be forgotten. Platforms must move away from addictive, exploitative design and instead build child-centred products that promote agency, healthy interactions and exposure to high-quality content.
As well as this, we must take responsibility for how we engage with our own children about staying safe online. Parents must continue to play an active role, bolstered by clear guidance, resources and support so their families can make good choices around online behaviour.
David Wright CBE, CEO of SWGfL, said: “The Online Safety Act was a significant step forward, but it must be strengthened. Enforcing a social media ban at this stage is a significant step backwards. How can we give children the confidence to become upstanding digital citizens when we remove them from the conversation entirely? This is not a solution, it is a quick fix, and if passed, will highlight the Government’s dispassion to make the online world safer, fairer and appropriate for them to grow up in.”