Reddit, porn sites fined by UK regulators over children’s safety and privacy

February 24, 2026

The UK’s online safety and privacy regulators are targeting companies that violate new age verification laws at both ends: porn sites that did not keep children out, and mainstream platforms that profited from children coming in.

On February 23, media regulator Ofcom fined porn operators that failed to put “highly effective” age checks in place. Within the same 24-hour period, the Information Commissioner’s Office (ICO), a separate, independent regulator, hit Reddit with an $18.2 million fine for unlawfully using children’s personal data for targeted advertising and recommendation systems.

Together, the cases show how quickly the UK is moving from guidance and “codes” to very real enforcement for services that don’t take children’s rights seriously.

Porn sites punished for weak age checks

Under the UK’s Online Safety Act 2023, services that publish or host pornography must use “highly effective” age assurance to stop children from accessing pornographic content. That means the classic splash-page warning or an “I am over 18” tick box is no longer acceptable.

Porn companies have reacted in different ways as the rules took hold: some big players embraced more intrusive checks; others, like Pornhub, geo‑restricted or partially withdrew from the UK; and a minority effectively called the regulator’s bluff and carried on with token measures. Ofcom is now going after that last group.

On Monday, Ofcom fined US‑based porn operator 8579 LLC around $1.8 million for failing to implement proper age verification on its sites and for dragging its heels on compliance. The company has also been ordered to hand over a complete list of all websites it operates, with an extra daily penalty if it fails to comply.

Ofcom said it has opened investigations into dozens of adult sites and will impose fines and daily penalties until proper age checks are in place; it also reports that more than 6,000 porn sites have now moved to “highly effective” age checks. For stubborn operators that continue to violate the law, it can deploy business‑disruption and blocking powers.

Reddit fined for using children’s data

Meanwhile, the Information Commissioner’s Office (ICO) took aim at something different, but very much related: how mainstream online platforms treat the children they already have, rather than how they keep children out.

The Office’s latest decision imposes a substantial fine on Reddit for “children’s privacy failures,” after the regulator concluded the platform unlawfully used UK children’s personal data for targeted advertising and profiling. The decision follows a multi‑year push by the ICO to enforce its Age Appropriate Design Code (also known as the Children’s code) and to crack down on platforms that treat under‑18s like just another audience segment.

The ICO said its investigation found that Reddit:

  • Failed to effectively identify and protect under‑18s on the platform, despite knowing that children were present in large numbers.
  • Used children’s personal information in recommender systems and targeted advertising, without adequate safeguards or a lawful basis under UK data protection law.
  • Did not ensure that the “best interests of the child” were a primary consideration in its design and data‑use decisions, as required by the Children’s code.

The regulator’s concerns about Reddit are not new. In 2025 it announced investigations into TikTok, Reddit, and Imgur that focused on how the companies use UK children’s data and what age‑assurance tools they rely on. By late 2025, the ICO had issued a notice of intent to fine Reddit, signaling that it believed serious breaches had occurred. The new $18.2 million fine is the outcome of that process.

The message from Information Commissioner John Edwards is blunt: Social media and video‑sharing platforms are welcome in the UK, but “this cannot be at the expense of children’s privacy,” and the responsibility to keep children safe “lies firmly at the door of the companies offering these services.”

Two regulators, one child‑safety push

Although Ofcom and the ICO have different remits, their actions line up neatly.

  • Ofcom enforces the Online Safety Act, which focuses on harmful content and requires robust age assurance for porn and other high‑risk services.
  • The ICO enforces UK GDPR and the Children’s code, which focus on how children’s personal data is collected, used, and shared.

The regulators have explicitly said they will work closely together to coordinate their efforts on children’s safety. In practical terms, that means:

  • A porn site that implements “highly effective” age checks to satisfy Ofcom may also find itself under ICO scrutiny if its identity checks or data sharing do not respect data protection law.
  • A social platform that complies with the Children’s code by turning off profiling for children and tightening privacy defaults may still need Ofcom‑compliant age assurance if it hosts adult or otherwise high‑risk content.

The overlap is already visible. The ICO investigated how Reddit and other platforms use age assurance and recommender systems, while Ofcom set out specific guidance on acceptable age‑verification methods under the Online Safety Act.

How age checks actually work

Regulators such as Ofcom publish lists of acceptable age‑verification methods, each with its own privacy and usability trade‑offs. None are perfect, and many shift risk from governments and platforms onto users’ most sensitive personal data.

  • Facial age estimation: Users upload a selfie or short video so an algorithm can guess whether they look over 18, which avoids storing documents but relies on sensitive biometrics and imperfect accuracy.
  • Open banking: An age‑check service queries your bank for a simple “adult or not” answer. It may be convenient on paper but it’s a hard sell when the relying site is an adult platform.
  • Digital identity services: Digital ID wallets can assert “over 18” without exposing full credentials, but they add yet another app and infrastructure layer that must be trusted and widely adopted.
  • Credit card checks: Using a valid payment card as a proxy for adulthood is simple and familiar, but it excludes adults without cards and does not cover lower age thresholds like “over 13.”
  • Email‑based estimation: Systems infer age from where an email address has been used (such as banks or utilities), effectively encouraging cross‑service profiling and “digital snooping.”
  • Mobile network checks: Providers indicate whether an account has age‑related restrictions. This can be fast, but is unreliable for pay‑as‑you‑go accounts, burner SIMs, or poorly maintained records.
  • Photo‑ID matching: Users upload an ID document plus a selfie so systems can match faces and ages. This is effective, but concentrates highly sensitive identity data in yet another attractive target for attackers.

My personal preference would be double‑blind verification: a third‑party provider verifies your age, then issues a simple token like “18+” to sites without revealing your identity or learning which site you visit, offering stronger privacy than most current approaches.
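To make the token flow concrete, here is a minimal sketch in Python. It is purely illustrative: a real double‑blind scheme would use an asymmetric or blind signature so that sites only hold a public key and the issuer cannot link tokens to sites; this sketch substitutes a symmetric HMAC for the signature, and every name in it (`issue_age_token`, `verify_age_token`, the payload fields) is an assumption, not any deployed API. The key idea survives the simplification: the token carries only the claim, a one‑time nonce, and an expiry; no identity ever reaches the relying site.

```python
import base64
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical issuer key. A real scheme would use an asymmetric
# (ideally blind) signature so relying sites never hold a secret.
ISSUER_KEY = secrets.token_bytes(32)

def issue_age_token(is_over_18: bool) -> str:
    """Issuer verifies the user's age out-of-band, then mints a token
    carrying only the claim -- no name, no document, no target site."""
    if not is_over_18:
        raise ValueError("age check failed")
    payload = {
        "claim": "18+",
        "nonce": secrets.token_hex(8),   # one-time use, limits replay/linking
        "exp": int(time.time()) + 300,   # short-lived: five minutes
    }
    body = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    # base64url body + "." + hex signature; neither part contains "."
    return base64.urlsafe_b64encode(body).decode() + "." + sig

def verify_age_token(token: str) -> bool:
    """Relying site checks the signature and expiry. It learns that
    *someone* is over 18 -- and nothing about who that someone is."""
    try:
        body_b64, sig = token.rsplit(".", 1)
        body = base64.urlsafe_b64decode(body_b64.encode())
    except (ValueError, Exception):
        return False
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False
    payload = json.loads(body)
    return payload["claim"] == "18+" and payload["exp"] > time.time()
```

A site calling `verify_age_token` gets a yes/no answer and nothing else, which is the whole point: the proof of age is decoupled from the proof of identity.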

In almost every case, users must surrender personal information or documents to prove their age, increasing the risk that sensitive data ends up in the wrong hands.


We don’t just report on privacy—we offer you the option to use it.

Privacy risks should never spread beyond a headline. Keep your online privacy yours by using Malwarebytes Privacy VPN.

About the author

Pieter Arntz

Malware Intelligence Researcher

Was a Microsoft MVP in consumer security for 12 years running. Can speak four languages. Smells of rich mahogany and leather-bound books.