The European Commission has published detailed guidance on how online platforms should protect children’s privacy and safety under the Digital Services Act (DSA).
Here’s a look at what the Commission expects in terms of age assurance, children’s privacy, and children’s online safety in general:
- Scope of the guidance
- What is an ‘online platform accessible to minors’?
- Do online platforms have to put age assurance systems in place?
- What child privacy measures does the Commission recommend?
- What other measures should online platforms take to ensure children’s online safety?
- When does the guidance take effect?
Scope of the guidance
The Commission’s 65-page guidance relates to Article 28 of the DSA, which comprises just the following four paragraphs (one sentence each):
- Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service
- Providers of online platform (sic) shall not present advertisements on their interface based on profiling as defined in Article 4, point (4), of Regulation (EU) 2016/679 using personal data of the recipient of the service when they are aware with reasonable certainty that the recipient of the service is a minor
- Compliance with the obligations set out in this Article shall not oblige providers of online platforms to process additional personal data in order to assess whether the recipient of the service is a minor
- The Commission, after consulting the Board, may issue guidelines to assist providers of online platforms in the application of paragraph 1.
Paragraphs 1 and 2 impose obligations on providers of online platforms accessible to minors. Paragraph 3 clarifies that an online platform does not have to ask a user for extra personal data to determine whether they are a minor. Paragraph 4 entitles the Commission to issue this guidance.
What is an ‘online platform accessible to minors’?
Under the DSA, an “online platform” is a type of “hosting service”: one that “stores and disseminates information to the public”. Examples of online platforms include publicly accessible social media networks, video-sharing websites, and online marketplaces.
A “minor” is a person under 18. Recital 71 of the DSA provides some insight into when an online platform might be deemed “accessible to minors”:
- Its terms and conditions allow minors to use the service,
- Its service is directed at or predominantly used by minors, or
- The provider is otherwise aware that some users are minors (e.g., if the personal data it already processes reveals users’ ages).
The guidance further clarifies:
- A provider cannot rely solely on a clause in its terms and conditions prohibiting minors if it has not implemented effective measures to prevent minors’ access
- A platform is also considered accessible to minors if it is known to appeal to minors, is marketed to minors, or independent research shows that minors use it
- If the provider processes personal data revealing age for other purposes, it is deemed aware of minors using its service
- Adult content platforms without effective age-gating measures will be treated as accessible to minors even if their terms and conditions prohibit access to the service.
Do online platforms have to put age assurance systems in place?
As noted, the Commission states that adult-oriented services (such as pornography websites) that exclude children via their terms and conditions will be treated as “accessible to minors” if they do not implement effective age assurance measures.
This creates a tension with Article 28(3) of the DSA, which states that online platforms do not need to collect additional personal data to assess the age of their users.
To address this tension, the Commission provides some general principles for age assurance processes:
- Accuracy: Methods must be measured against clear, public metrics and reviewed regularly to ensure state‑of‑the‑art accuracy
- Reliability: Methods must work consistently in real-world conditions and rely on trustworthy data sources
- Robustness: Methods must not be easy to bypass and must protect the integrity of age data
- Non-intrusiveness: Methods must minimize impact on privacy and other rights, process only necessary data, and avoid additional tracking or profiling (see the sketch after this list)
- Non-discrimination: Methods must work for all minors equally, without excluding or disadvantaging any group.
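To make the non-intrusiveness and data-minimization principles concrete, here is a minimal sketch of one possible approach: the platform accepts a signed “over 18” attestation from an independent verifier and acts only on the boolean result, never receiving a birthdate or identity document. The guidance does not prescribe this design; all names here (`AgeAttestation`, `TRUSTED_ISSUERS`, `admit_to_age_restricted_content`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AgeAttestation:
    subject_token: str   # opaque, platform-specific user reference
    is_over_18: bool     # the only age-related fact the platform learns
    issuer: str          # independent age-verification provider
    signature: bytes     # issuer's signature over the payload

TRUSTED_ISSUERS = {"example-verifier.eu"}  # hypothetical allow-list

def signature_is_valid(attestation: AgeAttestation) -> bool:
    """Placeholder for real cryptographic verification of the issuer's signature."""
    return len(attestation.signature) > 0  # illustration only

def admit_to_age_restricted_content(attestation: AgeAttestation) -> bool:
    """Gate access on a verified attestation without storing age evidence."""
    if attestation.issuer not in TRUSTED_ISSUERS:
        return False
    if not signature_is_valid(attestation):
        return False
    # Data minimization: only the boolean outcome is used; the underlying
    # evidence (birthdate, ID scan, facial estimate) never reaches the platform.
    return attestation.is_over_18
```

A design along these lines also speaks to Article 28(3): the platform itself processes no additional personal data beyond the yes/no outcome.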
What child privacy measures does the Commission recommend?
The Commission recommends the following types of measures to ensure children’s privacy on online platforms:
- Set minors’ accounts to the highest privacy level by default (e.g., private profiles, limited contact and visibility), as sketched after this list
- Apply data minimization: Process only the minimum data needed for features like age assurance or content moderation
- Avoid using age assurance data or other sensitive data for additional purposes (e.g., profiling, tracking, or marketing)
- Do not collect more data than necessary to verify age or personalize services
- Explain how personal data is used, including privacy impacts of age assurance, recommender systems, and advertising
- Allow minors to easily manage and delete accounts and data
- Provide options to disable profiling-based recommendations and advertisements
- Design interfaces to avoid dark patterns that encourage oversharing or weaker privacy settings.
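As an illustration of what “highest privacy level by default” might look like in practice, here is a short, hypothetical sketch of a settings object whose most protective values are applied automatically to minors’ accounts. The field names and defaults are assumptions for illustration, not taken from the guidance.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_public: bool
    discoverable_in_search: bool
    accepts_messages_from_strangers: bool
    profiling_based_recommendations: bool
    profiling_based_ads: bool

# Hypothetical most-protective configuration, applied automatically to minors.
MINOR_DEFAULTS = PrivacySettings(
    profile_public=False,
    discoverable_in_search=False,
    accepts_messages_from_strangers=False,
    profiling_based_recommendations=False,
    # Article 28(2) bans profiling-based ads for minors outright, so this
    # is a hard prohibition rather than a default the user can switch on.
    profiling_based_ads=False,
)

def initial_settings(is_minor: bool) -> PrivacySettings:
    """Privacy by default: minors start from the most protective settings."""
    if is_minor:
        return MINOR_DEFAULTS
    return PrivacySettings(True, True, True, True, True)
```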
What other measures should online platforms take to ensure children’s online safety?
Besides age assurance and privacy measures, the Commission’s guidance focuses on the following key areas:
- Online interface design: Avoid persuasive design features encouraging overuse, provide time management tools, ensure child‑friendly, accessible tools and settings, and offer clear warnings about AI interactions
- Recommender systems: Regularly test and adapt recommender systems to reduce exposure to harmful content, allow minors to reset feeds, offer non-profiling options (see the sketch after this list), and ensure child-friendly explanations of recommendations
- Commercial practices: Protect minors from economically exploitative practices, including targeted advertising and in‑app purchases, and avoid manipulative prompts during registration or use.
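To illustrate the recommender-system points, here is a rough sketch of a non-profiling feed option and a one-click personalization reset. The data structures (`Item`, `profile_store`) and function names are assumptions for illustration; the guidance does not specify an implementation.

```python
from datetime import datetime
from typing import NamedTuple

class Item(NamedTuple):
    item_id: str
    published: datetime
    from_followed_source: bool  # True if the minor explicitly follows the creator

def non_profiling_feed(items: list[Item]) -> list[Item]:
    """Non-profiling option: a chronological feed of followed sources only,
    built without behavioural or engagement signals."""
    followed = [item for item in items if item.from_followed_source]
    return sorted(followed, key=lambda item: item.published, reverse=True)

def reset_feed(profile_store: dict[str, dict], user_id: str) -> None:
    """One-click feed reset: discard the accumulated interaction signals
    that drive profiling-based recommendations."""
    profile_store.pop(user_id, None)
```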
When does the guidance take effect?
The finalized guidance was published on 14 July 2025. While it does not have binding legal effect, it provides insight into how the Commission will enforce the DSA.
The DSA has been fully applicable since 17 February 2024, so online platforms within the law’s scope should already have taken steps to ensure children’s safety and privacy.
The Commission has already launched investigations into four adult websites and TikTok for allegedly violating Article 28 of the DSA. Adherence to this guidance could help other online platforms avoid similar actions.