Social Media Restrictions and Emerging Safeguarding Issues

Recent changes in social media regulations, particularly under the UK’s Online Safety Act, aim to make online spaces safer.
These regulations impose stricter responsibilities on platforms to monitor and remove harmful content.
While these changes represent progress, they also introduce new safeguarding concerns that need urgent attention.
1. Inadequate Age Verification
Despite regulatory efforts, many platforms still lack robust age verification mechanisms, allowing minors to access harmful or inappropriate content. A recent Ofcom investigation into OnlyFans found that underage users were able to access explicit material due to weak age checks. Although the probe was closed, it raised concerns about how effective these verification measures really are. Unless platforms implement stronger checks, young users will remain exposed to harmful content.
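To make the gap concrete, here is a minimal sketch of what a server-side age gate might look like, assuming a hypothetical identity provider that returns an independently verified date of birth. It illustrates the pattern only; it is not any platform's actual implementation.

```python
from dataclasses import dataclass
from datetime import date

MINIMUM_AGE = 18  # threshold for age-restricted material

@dataclass
class VerifiedIdentity:
    # Hypothetical record from an external identity-verification provider;
    # purely illustrative, not any platform's real data model.
    date_of_birth: date
    verified: bool

def age_in_years(dob: date, today: date) -> int:
    years = today.year - dob.year
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1  # birthday not yet reached this year
    return years

def may_access_restricted_content(identity: VerifiedIdentity) -> bool:
    # A self-declared checkbox fails here: access requires an
    # independently verified date of birth, not a claimed one.
    if not identity.verified:
        return False
    return age_in_years(identity.date_of_birth, date.today()) >= MINIMUM_AGE
```

The point of the sketch is that the check depends on a verified attribute from an independent source; a tick-box asking "Are you over 18?" never reaches the age calculation at all.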
2. Weakened Content Moderation Policies
Changes in how social media platforms moderate content can also create risks. Meta recently replaced professional fact-checkers with “community notes”, which rely on context written and rated by users rather than trained moderators. It has also relaxed some of its rules around derogatory content. Advocacy groups, such as the Molly Rose Foundation, have warned that these changes could increase exposure to self-harm and suicide-related content. Without strong content moderation policies, harmful material may become more accessible to vulnerable users.
3. AI-Generated Harmful Content
Artificial Intelligence (AI) is rapidly advancing, but it has also been misused to create harmful content. Home Secretary Yvette Cooper recently criticised tech companies for not doing enough to prevent AI-generated child abuse material. As AI tools become more sophisticated, they can be exploited to create realistic, disturbing, and illegal content, posing severe risks to young users. Social media platforms must take urgent action to detect and remove AI-generated abuse material before it spreads.
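One established detection technique is hash matching against databases of known illegal imagery, the approach behind tools such as PhotoDNA. The sketch below shows the general shape using an ordinary cryptographic hash and a hypothetical, empty hash list; real systems use perceptual hashes that survive resizing and re-encoding, and AI-generated material additionally requires classifiers, because a newly generated image has no pre-existing hash to match.

```python
import hashlib

# Hypothetical set of hashes of known illegal images, of the kind bodies
# such as the Internet Watch Foundation supply to platforms. Empty here;
# purely illustrative.
KNOWN_ABUSE_HASHES: set[str] = set()

def should_block_upload(image_bytes: bytes) -> bool:
    """Reject an upload whose hash matches a known-abuse entry.

    Production systems use perceptual hashes (e.g. PhotoDNA) so that
    crops and re-encodes still match; wholly new AI-generated images
    have no prior hash, so classifiers run alongside hash matching.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES
```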
4. Delays in Implementing Safety Measures
While the Online Safety Act introduces tougher rules, there are concerns about whether platforms will implement these changes quickly enough. Ofcom has warned tech companies that they must assess and mitigate risks by March 2025 or face penalties. If platforms fail to act swiftly, harmful content could continue circulating unchecked, leaving users at risk.
5. Global Discrepancies in Regulations
Different countries enforce different social media regulations, which can create inconsistencies in safeguarding standards. For example, Australia has legislated a ban on social media accounts for children under 16, whereas the UK relies on platform regulation rather than an outright ban. These variations can cause confusion and make it harder to protect young users globally. Without uniform safety standards, social media platforms might exploit loopholes, leaving children exposed to harmful content.
How CURA Can Help
In light of these emerging safeguarding challenges, organisations must adopt comprehensive solutions to protect vulnerable individuals effectively. CURA, developed by TASC Software, is a cloud-based safeguarding, pastoral care, and well-being solution designed to reduce administration and paperwork while improving communication and care.
Key Features of CURA
- Secure Access – Permission-based clearance levels allow fast and secure access to confidential images, statements, and evidence.
- User-Friendly Interface – Intuitive software provides quick and simple logging of safeguarding concerns, making it accessible to all staff.
- Real-Time Notifications – Incident logging is efficient, and notifications can be sent automatically to key safeguarding leads, ensuring swift action (a generic sketch of this pattern appears after this list).
- Comprehensive Tracking – Staff can be linked to vulnerable individuals, enabling tracking and proactive intervention for those at risk.
- Detailed Reporting – One-click reports provide timely and accurate safeguarding data for meetings, governing bodies, Ofsted, and DSLs (Designated Safeguarding Leads).
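To illustrate how permission-based access and automatic lead notification typically fit together, here is a generic, hypothetical sketch of that pattern. It is not CURA's actual code or data model; every name below is invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class StaffMember:
    name: str
    clearance: int  # higher value = broader access

@dataclass
class Concern:
    subject: str
    details: str
    required_clearance: int = 2  # e.g. confidential evidence

@dataclass
class SafeguardingLog:
    leads: list  # designated safeguarding leads to notify
    concerns: list = field(default_factory=list)

    def log_concern(self, concern: Concern) -> None:
        self.concerns.append(concern)
        # Notify every lead as soon as a concern is logged; a real
        # system would use email or push rather than printing.
        for lead in self.leads:
            print(f"Notify {lead.name}: new concern about {concern.subject}")

    def view_details(self, staff: StaffMember, concern: Concern) -> str:
        # Permission-based clearance: details stay hidden below the threshold.
        if staff.clearance < concern.required_clearance:
            return "[restricted]"
        return concern.details

# Minimal usage: logging a concern notifies the DSL automatically.
dsl = StaffMember("DSL", clearance=3)
log = SafeguardingLog(leads=[dsl])
log.log_concern(Concern("online harm", "Details of the incident."))
```

In this shape, the clearance threshold corresponds to the Secure Access bullet and the notification loop to Real-Time Notifications; tracking and reporting would sit on top of the stored list of concerns.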
By implementing CURA, organisations can establish strong safeguarding systems that align with the six principles of safeguarding: empowerment, prevention, proportionality, protection, partnership, and accountability. This ensures concerns are quickly identified, monitored, and acted upon, creating a safer digital and physical environment for all individuals.
As social media regulations evolve, tools like CURA are essential in helping organisations stay compliant while actively safeguarding vulnerable children and adults. By integrating CURA into safeguarding strategies, organisations can effectively navigate modern challenges and enhance child protection measures.
For more information on how CURA can support safeguarding efforts, visit TASC Software’s CURA page.