The Online Safety Act: Can We Protect Children Without Sacrificing Privacy?

Apr 8, 2025

“We need to call time on the Wild West online. What’s illegal offline should be regulated online.”
Damian Collins MP, Chair of the Joint Committee on the draft Online Safety Bill

The internet has brought people closer than ever—but it’s also exposed its darkest corners. For too long, harmful and illegal content has circulated with little oversight. Vulnerable users, especially children, have been left without the protection they deserve.

Enter the Online Safety Act (OSA)—a landmark piece of UK legislation designed to turn the tide.

What is the Online Safety Act?

The OSA is a sweeping reform of how tech platforms are expected to handle illegal and harmful content. Enforced by Ofcom, the UK’s communications regulator, the Act compels online platforms to:

  • Identify and reduce risks from content such as child abuse imagery, terrorism, hate speech, and fraud.

  • Implement proactive safety measures or face fines of up to £18 million or 10% of global annual turnover, whichever is greater.

With 286 pages and 241 sections, the OSA isn’t just another regulation—it’s a reset.

Why it matters

The OSA fundamentally shifts the burden of responsibility from individuals to platforms. For the first time, tech companies are legally accountable for the content they host and recommend.

It signals a clear message: when it comes to online safety, especially for children, self-regulation is no longer enough.

This matters for every parent, educator, policymaker—and for every platform operating in the UK.

Where we are now

The Act is rolling out in stages.

By 31 March 2025, all in-scope providers must have completed their illegal content risk assessments for Ofcom. These assessments will show how effectively platforms are mitigating risk—and mark the point at which the Act gains real regulatory teeth.

From there, enforcement begins.

Who’s on the hook?

Platforms under scrutiny include:

  • Social platforms: Facebook, Instagram, TikTok, Snapchat, Reddit, X

  • Messaging apps: WhatsApp, Telegram, Signal

  • Search and storage: Google, Yahoo, Dropbox, Google Drive

  • Adult and high-risk niche forums

In short, if your platform connects users or hosts user-generated content, it's likely in scope.

Built for this moment: YEO Messaging

At YEO Messaging, we anticipated this shift years ago.

From the start, we’ve built our platform with one core belief:
Online safety should never come at the cost of personal privacy.

Our approach is simple—security by design:

  • Continuous Facial Recognition means only the verified recipient can view a message—and only while they’re present.

  • No screenshots. No message forwarding. No impersonation.

  • Geofencing ensures messages can only be accessed in pre-approved locations.

  • Audit trails and admin control support full transparency and governance.

Our platform supports child safety without weakening encryption or building backdoors.

This is how we deliver accountable privacy—and why we believe platforms don’t need to trade security for surveillance.

Final thought

The Online Safety Act is a critical turning point for the internet.

It’s also a test: can tech companies rise to the challenge without compromising user rights?

At YEO Messaging, we’re proud to say: yes. You can protect your most vulnerable users and still uphold their right to privacy.

And now, every platform has the responsibility—and the tools—to do the same.
