As part of its commitment to ensuring regulatory coherence between the UK data protection regime and the new online safety regime, the ICO has published new draft guidance on the use of profiling tools for online safety. The guidance aims to remind organisations offering user-to-user services who wish to use profiling to meet their Online Safety Act 2023 (OSA) obligations of the relevant UK GDPR rules. It applies to all organisations carrying out profiling, as defined in the UK GDPR (as well as organisations providing profiling products and services). The ICO is seeking feedback on the guidance, with the consultation period closing on 31 October 2025.
With the OSA imposing new obligations on online service providers to protect users (especially children) from harmful and illegal content, it is anticipated that many service providers will rely on automated profiling tools to ensure they can meet those obligations. However, when deploying these tools for OSA compliance purposes, service providers must also ensure compliance with UK data protection law.
The ICO is clear that the guidance is not intended to cover: (i) profiling carried out to personalise or tailor a user's experience; (ii) data protection issues arising from the training and development of AI-based profiling tools; or (iii) profiling used to estimate a user's age, although it does apply to tools that use information about a user's age as input data to assess other characteristics or behaviours. Service providers must also separately address how they ensure compliance with the OSA itself, as compliance with this new guidance will not automatically ensure OSA compliance.
The ICO highlights the importance of considering the impact of profiling on users' privacy and whether the processing undertaken using profiling tools is necessary and proportionate to achieve a particular aim, given how intrusive such tools can be. This may have consequences for the transparency information that accompanies a tool: service providers may need to revisit their privacy notices and assess whether more information should be provided about the processing the tool undertakes.
Care must also be taken if a profiling tool is likely to process any special category information. The guidance highlights the situation where an inference could be made about someone that amounts to special category information (e.g. a human moderator who reviews the outcome of a tool's behaviour analysis and is able to infer a user's religious group from that user's activity on the service). This may be the case even if neither the tool nor the moderator is seeking to make an inference about the user's religion. If there is an intention to make an inference linked to a special category, or to treat users differently on the basis of such an inference, there will be processing of special category information.
Many uses of profiling tools are likely to be high risk, meaning a data protection impact assessment (DPIA) will be required. High-risk uses include:
- profiling people on a large scale;
- processing involving new technologies, such as AI;
- processing that involves tracking a person's behaviour;
- making decisions about a person's access to a product or service based on automated decision-making (including profiling); and
- using children's personal information as part of offering a service directly to them.
As part of the DPIA, the customary test of necessity and proportionality must be met: service providers must consider whether the purpose for deploying the tool is sufficiently important to justify the interference with users' privacy, the risks and impact of using the tool, and whether users would reasonably expect their personal information to be profiled for trust and safety purposes. The ICO sets out some options to reduce the risks and mandates consultation with the ICO if the DPIA identifies a high risk that cannot be reduced to an acceptable level.
The profiling of children is a particular concern, for obvious reasons. The ICO reminds service providers that a DPIA must be conducted if children are profiled as part of offering an online service, and that entities processing children's personal data must comply with the Children's Code. Profiling of children should be switched off by default unless there is a compelling reason for it to be on by default, which might include profiling to meet a legal or regulatory requirement (such as safeguarding), to prevent child sexual exploitation or abuse online, or for age assurance. Whilst meeting OSA compliance obligations could be a compelling reason for profiling by default, not every form of profiling will meet that threshold.
When assessing whether processing using a profiling tool is fair, service providers must not process people's personal information in ways they might find unexpected or misleading. Where tools are used to predict or infer users' characteristics, behaviours or other attributes, service providers must ensure the tools are sufficiently statistically accurate and avoid discrimination: tools are less likely to be fair if they consistently make the wrong judgments about people or lead to unjust discrimination. This means undertaking regular reviews to minimise the risk of unfair outcomes, as well as providing a means of redress for users who are concerned that a profiling decision is incorrect or that the information generated or used by the tools is inaccurate.
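To make the statistical accuracy point concrete, the sketch below shows the kind of regular review a provider might run: comparing a tool's judgments against human-reviewed labels and breaking error rates down by user group. It is a minimal illustration only; the field names, groups and data are hypothetical and do not come from the ICO guidance.

```python
from collections import defaultdict

# Hypothetical evaluation records: each pairs a profiling tool's judgment
# with a human-reviewed ground-truth label and the user's group.
records = [
    {"group": "under_18", "predicted": True,  "actual": True},
    {"group": "under_18", "predicted": True,  "actual": False},
    {"group": "adult",    "predicted": False, "actual": False},
    {"group": "adult",    "predicted": True,  "actual": True},
    # ... in practice, a representative labelled sample
]

def per_group_rates(records):
    """Compute accuracy and false-positive rate for each user group.

    Consistently lower accuracy, or a higher false-positive rate, for one
    group is the kind of systematic error the guidance flags as unfair.
    """
    stats = defaultdict(lambda: {"correct": 0, "fp": 0, "neg": 0, "total": 0})
    for r in records:
        s = stats[r["group"]]
        s["total"] += 1
        s["correct"] += r["predicted"] == r["actual"]
        if not r["actual"]:          # only true negatives can yield false positives
            s["neg"] += 1
            s["fp"] += r["predicted"]
    return {
        group: {
            "accuracy": s["correct"] / s["total"],
            "false_positive_rate": s["fp"] / s["neg"] if s["neg"] else None,
        }
        for group, s in stats.items()
    }

print(per_group_rates(records))
```

A review of this kind would feed both the DPIA (evidencing proportionality) and the redress process, by identifying the groups for whom the tool's judgments are least reliable.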
The scope of the information to be provided in response to a data subject access request could be significant, with the guidance noting that a provider may need to give a user:
- confirmation that personal information is used in the tools;
- copies of the personal information used;
- copies of the personal information generated (i.e. the outputs of the tools); and
- copies of the information about moderation decisions taken as a result of the use of a profiling tool.
The guidance also covers the impact of Article 22 UK GDPR where a tool is intended to take 'solely automated decisions' (i.e. those without any meaningful human involvement).
The final version of the guidance is expected to be published in Spring 2026 and will also reflect any changes required as a result of the Data (Use and Access) Act 2025 coming into force.