UK ICO fines online platform £14.47m and warns that age self-declaration is not enough to protect children
Published on 25th February 2026
Regulator makes clear that age self-declaration is legally insufficient where children face risk online
At a glance:
- The ICO has fined Reddit £14.47m after finding the company failed to use children's personal information lawfully.
- Key failures include failing to apply any robust age assurance measures and failing to carry out a data protection impact assessment to assess and mitigate risks to children before January 2025.
- The penalty forms part of a wider ICO intervention aimed at improving the protection of children's personal information online.
On 24 February 2026, the Information Commissioner's Office (ICO) fined the social media platform Reddit £14.47 million, following an investigation that found the company used children's personal information unlawfully. Although the ICO has published a press release, it has not yet published the full monetary penalty notice. Once available, it is expected to provide useful insight into the ICO's approach to the infringements. It has been reported that Reddit plans to appeal the fine.
This decision is the latest in a series of enforcement actions targeting social media and video-sharing platforms and has direct implications for all businesses operating online services that may be accessed by children.
Two principal failings
The ICO identified two principal failings:
- Failure to apply robust age assurance measures. Reddit had not applied any robust age assurance measures and therefore did not have a lawful basis for processing the personal information of children under the age of 13. While its terms of service prohibited under-13s from using the platform, Reddit had no measures in place to check users' ages until July 2025. The ICO's estimates indicated that a large number of children under 13 were on the platform.
- Failure to carry out a data protection impact assessment. Reddit had not carried out a data protection impact assessment (DPIA) focusing on the risks of using children's personal information before January 2025, even though children between 13 and 18 were allowed to use the platform. By using under-13s' personal information without a lawful basis, and without having properly considered the risks to children more generally, Reddit exposed children to the risk of encountering inappropriate and harmful content on the platform.
In setting the penalty amount, the ICO took into consideration the number of children affected by this infringement, the degree of potential harm caused, the duration of the failings, and Reddit's global turnover.
Self-declaration is insufficient
In July 2025, Reddit introduced age assurance measures that included age verification to access mature content and asked users to declare their age when opening an account.
The ICO has previously made clear that self-declaration of age by children is insufficient where children may be exposed to risk because of the way their data is used. Simply asking users to declare their age when creating an account presents real risks to children because it is easy to bypass.
The Information Commissioner, John Edwards, was unequivocal on this point: "Relying on users to declare their age themselves is not enough when children may be at risk and we are focusing now on companies that are primarily using this method. I therefore strongly encourage industry to take note, reflect on their practices and urgently make any necessary improvements to their platforms."
The ICO's broader enforcement strategy
The ICO has said that its focus now is on companies that primarily rely on self-declaration.
UK data protection law requires that children are given special treatment when it comes to their personal information. The ICO's Age Appropriate Design Code (also known as the Children's Code) translates these legal requirements into design standards for online services likely to be accessed by under-18s, covering areas such as children's best interests and high privacy levels by default.
In its "Children's Code Strategy progress update" published in December 2025, the ICO reported strong progress on its strategy, including a proactive monitoring programme to drive the adoption of more robust and proportionate age assurance methods on high-risk platforms. It has also said it will be prioritising high-risk services for further regulatory engagement and that it will work closely with Ofcom (which has responsibility for enforcing the Online Safety Act) to ensure efforts are coordinated.
Osborne Clarke comment
This decision is a clear signal that the ICO continues to actively and rigorously enforce the protection of children's personal data online. Robust age assurance is a legal requirement, not merely a matter of best practice.
Companies operating online services likely to be accessed by children have a responsibility to protect them by ensuring they are not exposed to risks through the way their data is used. They must be confident that they know the age of their users and have appropriate, effective age assurance measures in place.
Businesses should consider several practical steps in light of this decision:
- Review age assurance measures. Where children under a certain age are not allowed to use a service, organisations should focus on preventing access and enforce their minimum age requirements using robust age assurance methods. Self-declaration alone will not suffice where children may be at risk.
- Conduct a DPIA. Businesses should proactively assess and mitigate the risks that their processing activities pose to children before those activities commence.
- Align age assurance methods to risk. Organisations should match the age assurance method they use to the level of risk on their platform. Organisations can either apply the full protections of the Children's Code to all users or use proportionate age assurance tools to tailor safeguards by age.
- Monitor ICO guidance. Businesses should consult and monitor the ICO's Children's Code Strategy progress updates for the latest regulatory expectations.
The ICO has made clear that it will continue to scrutinise platforms that rely primarily on self-declaration. With Ofcom's enforcement of the Online Safety Act running in parallel, businesses face a dual regulatory environment where the consequences of getting it wrong are substantial.
If you would like to discuss what this decision means for your organisation, or if you require support in reviewing your age assurance or broader children's data protection measures, please get in touch with one of our experts listed below, or your usual Osborne Clarke contact.