EUROPEAN COMMISSION

Brussels, 14.7.2025
C(2025) 4764 final

ANNEX

to the Communication to the Commission

Approval of the content of a draft Communication from the Commission - Guidelines on measures to ensure a high level of privacy, safety and security for minors online, pursuant to Article 28(4) of Regulation (EU) 2022/2065
1. INTRODUCTION

1. Online platforms are increasingly accessed by minors (1) and can provide several
benefits to them. For example, online platforms may provide access to a wealth of
educational resources, helping minors to learn new skills and expand their knowledge.
Online platforms may also offer minors opportunities to express their views and
connect with others who share similar interests, helping minors to build social skills,
confidence and a sense of community. By playing on and exploring the online
environment, minors can also foster their natural curiosity, engaging in activities that
encourage creativity, problem solving, critical thinking, agency and entertainment.
2. There is, however, wide consensus among policy makers, regulatory authorities, civil society, researchers, educators and guardians (2) that the current level of privacy, safety and security online of minors is often inadequate. The design and features of the wide variety of online platforms and the services offered by providers of online platforms accessible to minors may create risks to minors’ privacy, safety and security and exacerbate existing risks. These risks include, for example, exposure to illegal content (3) and harmful content that undermines minors’ privacy, safety and security
or that may impair the physical or mental development of minors. They also include
cyberbullying or contact from individuals seeking to harm minors, such as those
seeking to sexually abuse or extort minors, human traffickers and those seeking to
recruit minors into criminal gangs or promote violence, radicalisation, violent
extremism and terrorism. Minors may also face risks as consumers as well as risks
related to extensive use or overuse of online platforms and exposure to inappropriate
or exploitative practices, including in relation to gambling and gaming. The
increasing integration of artificial intelligence (“AI”) chatbots and companions into
online platforms as well as AI driven deep fakes may also affect how minors interact
with online platforms, exacerbate existing risks, and pose new ones that can
(1) In the present guidelines, ‘child’, ‘children’ and ‘minor’ refer to a person under the age of 18.
(2) In the present guidelines, ‘guardians’ refers to persons holding parental responsibilities.
(3) Illegal content includes but is not limited to content depicting illicit drug trafficking, terrorist and violent extremist content and child sexual abuse material. What constitutes illegal content is not defined by Regulation (EU) 2022/2065 (the Digital Services Act) but by other laws either at EU level or at national level.
negatively affect a minor’s privacy, safety and security (4). These risks can originate
from the direct experience of the minor with the platform and/or from the actions of
other users on the platform.
3. These guidelines aim to support providers of online platforms in addressing these risks by providing a set of measures that the Commission considers will help these providers to ensure a high level of privacy, safety and security of minors on their platforms, thereby contributing to the protection of minors, which is an important policy objective of the Union. These guidelines also aim at helping the Digital Services Coordinators (DSCs) and competent national authorities when applying and interpreting Article 28 of Regulation (EU) 2022/2065. For instance, making minors’ accounts more private will, among others, help providers of online platforms reduce the risk of unwanted or unsolicited contact. Implementing age assurance measures (5)
may, among others, help providers reduce the risk of minors being exposed to
services, content, conduct, contacts or commercial practices that undermine their
privacy, safety and security. Adopting these and other measures – on matters ranging
from recommender systems and governance to user support and reporting – may help
providers of online platforms make online platforms safer, more secure and more
privacy preserving for minors.
2. SCOPE OF THE GUIDELINES

4. It is in the light of the aforementioned risks that the Union legislature enacted Article 28 of Regulation (EU) 2022/2065 of the European Parliament and of the Council (6).
Paragraph 1 of this provision obliges providers of online platforms accessible to minors to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service. Paragraph 2 of Article 28 of Regulation (EU) 2022/2065 prohibits providers of online platforms from presenting advertisements on their interface based on profiling, as defined in Article 4, point (4),
(4) A typology of risks to which minors are exposed when accessing online platforms, based on a framework developed by the OECD, is included in Annex I to these guidelines.
(5) See Section 6.1 on age assurance.
(6) Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277, 27.10.2022, p. 1.
of Regulation (EU) 2016/679 (7), using personal data of the recipient of the service
when they are aware with reasonable certainty that the recipient of the service is a
minor. Paragraph 3 of Article 28 of Regulation (EU) 2022/2065 specifies that
compliance with the obligations set out in this Article shall not oblige providers of
online platforms accessible to minors to process additional personal data in order to
assess whether the recipient of the service is a minor. Paragraph 4 of Article 28 of
Regulation (EU) 2022/2065 provides that the Commission, after consulting the
European Board for Digital Services (‘the Board’), may issue guidelines to assist
providers of online platforms in the application of paragraph 1.
5. These guidelines describe the measures that the Commission considers that providers of online platforms accessible to minors should take to ensure a high level of privacy, safety and security for minors online, in accordance with Article 28(1) of Regulation (EU) 2022/2065. The obligation laid down in that provision is addressed to providers of online platforms whose services are accessible to minors. (8) Recital 71 of that Regulation further clarifies that “[a]n online platform can be considered accessible to
minors when its terms and conditions permit minors to use the service, when its
service is directed at or predominantly used by minors, or where the provider is
otherwise aware that some of the recipients of its service are minors”.
6. As regards the first scenario described in that recital, the Commission considers that a provider of an online platform cannot solely rely on a statement in its terms and conditions prohibiting access to minors to argue that the platform is not accessible to
them. If the provider of the online platform does not implement effective measures to
prevent minors from accessing its service, it cannot claim that its online platform falls
outside the scope of Article 28(1) of Regulation (EU) 2022/2065 based on that
declaration. For example, providers of online platforms that host and disseminate
adult content, such as online platforms disseminating pornographic content, and
therefore restrict, in their terms and conditions, the use of their service to users over
(7) Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (GDPR), OJ L 119, 4.5.2016, p. 1.
(8) Article 3 of Regulation (EU) 2022/2065 defines ‘online platform’ as a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation.
the age of 18, will be considered accessible to minors within the meaning
of Article 28(1) of Regulation (EU) 2022/2065 when no effective measures have been
put in place to prevent minors from accessing their service.
7. As regards the third scenario, recital 71 of Regulation (EU) 2022/2065 clarifies that
one example of a situation in which a provider of an online platform should be aware
that some of the recipients of its service are minors is where that provider already
processes the personal data of those recipients revealing their age for other purposes,
such as during registration in the relevant service, and this reveals that some of those
recipients are minors. Other examples of situations in which a provider can reasonably
be expected to be aware that minors are amongst the recipients of its service include
those in which the online platform is known to appeal to minors; the provider of the
online platform offers similar services to those used by minors; the online platform is
promoted to minors; the provider of the online platform has conducted or
commissioned research that identifies minors as recipients of the services or where
such identification results from independent research.
8. Pursuant to Article 19 of Regulation (EU) 2022/2065, the obligation laid down in
Article 28(1) of Regulation (EU) 2022/2065 does not apply to providers of online
platforms that qualify as micro or small enterprises, except where their online
platform has been designated by the Commission as a very large online platform in
accordance with Article 33(4) of that Regulation (9).
9. Other provisions of Regulation (EU) 2022/2065 also aim at ensuring the protection of minors online (10). These include, among others, several provisions in Section 5 of
Chapter III of Regulation (EU) 2022/2065, which imposes additional obligations on
providers of very large online platforms (‘VLOPs’) and very large online search
(9) Recommendation 2003/361/EC defines a small enterprise as an enterprise which employs fewer than 50 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 10 million. A microenterprise is defined as an enterprise which employs fewer than 10 persons and whose annual turnover and/or annual balance sheet total does not exceed EUR 2 million. The Commission recalls here Recital 10 of Regulation (EU) 2022/2065, which states that Regulation (EU) 2022/2065 is without prejudice to Directive 2010/13/EU. The aforementioned Directive requires all video-sharing platform (VSP) providers, regardless of their qualification as micro or small enterprises, to establish and operate age verification systems for users of video-sharing platforms with respect to content which may impair the physical or mental development of minors.
(10) This includes the obligations contained in the following provisions of Regulation (EU) 2022/2065: Article 14 on Terms and Conditions, Articles 16 and 22 on Notice and action mechanisms and Statement of Reasons, Article 25 on Online interface design and organisation, Articles 15 and 24 on Transparency, Article 26 on Advertisements, Article 27 on Recommender systems and Article 44 on Standards.
engines (‘VLOSEs’) (11). These guidelines do not aim to interpret those provisions
and providers of VLOPs and VLOSEs should not expect that adopting the measures
described below, either partially or in full, suffices to ensure compliance with their
obligations under Section 5 of Chapter III of Regulation (EU) 2022/2065, as those
providers may need to put in place additional measures which are not set out in these
guidelines and which are necessary for them to comply with the obligations stemming
from those provisions (12).
10. Article 28(1) of Regulation (EU) 2022/2065 should also be seen in the light of other
Union legislation and non-binding instruments which aim to address the risks to
which minors are exposed online (13). Those instruments also contribute to achieving
the objective of ensuring a high level of privacy, safety and security of minors online,
and thus complement the application of Article 28(1) of Regulation (EU) 2022/2065.
These guidelines should not be understood as interpreting or pre-empting any
obligations arising under those instruments or under Member State legislation.
Supervision and enforcement of those instruments remain the sole responsibility of
the competent authorities under those legal frameworks. In particular, as clarified in
recital 10 of Regulation (EU) 2022/2065, that Regulation is without prejudice to other
acts of Union law regulating the provision of information society services in general,
regulating other aspects of the provision of intermediary services in the internal
market or specifying and complementing the harmonised rules set out in Regulation
(EU) 2022/2065, such as Directive 2010/13/EU, as well as Union law on consumer
(11) This includes, but is not limited to, the following provisions of Regulation (EU) 2022/2065: Articles 34 and 35 on Risk assessment and Mitigation of risks, Article 38 on Recommender systems, Article 40 on Data access and scrutiny and Article 44(j) on standards for targeted measures to protect minors online.
(12) This includes, but is not limited to, Articles 34 and 35 on Risk assessment and Mitigation of risks, Article 38 on Recommender systems and Article 40 on Data access and scrutiny.
(13) This approach includes the Better Internet for Kids strategy (BIK+), Directive 2010/13/EU (“the Audiovisual Media Services Directive”), Regulation (EU) 2024/1689 (“the AI Act”), Regulation (EU) 2016/679 (“GDPR”), Directive 2011/93/EU on combating the sexual abuse and sexual exploitation of children, Directive 2005/29/EC on unfair commercial practices (the “UCPD”), the EU Digital Identity Wallet and the short-term age verification solution, the forthcoming action plan against cyberbullying, the EU-wide inquiry on the broader impacts of social media on well-being, the ProtectEU Strategy, the EU Roadmap to fight drug trafficking and organised crime, the EU Internet Forum, the EU Strategy for a more effective fight against child sexual abuse, and the EU Strategy combating trafficking in human beings 2021-2025. Further, Regulation (EU) 2022/2065 is without prejudice to Union law on consumer protection and product safety, including Regulations (EU) 2017/2394 and (EU) 2019/1020 and Directives 2001/95/EC and 2013/11/EU. Directive 2005/29/EC on unfair commercial practices, notably Articles 5 to 9, also protects minors and, e.g., point 28 of Annex I prohibits, in an advertisement, a direct exhortation to children to buy advertised products or persuade their parents or other adults to buy advertised products for them. The Commission also recalls the European Commission Fitness Check of EU consumer law on digital fairness.
protection and on the protection of personal data, in particular Regulation (EU)
2016/679.
11. While these guidelines set out measures that aim at ensuring a high level of privacy,
safety and security for minors online, providers of online platforms are encouraged to
adopt those measures for the purposes of protecting all users, and not just minors.
Creating a privacy-preserving, safe and secure online environment for all users will inherently result in more privacy, safety and security for minors online, alongside measures ensuring respect for their specific rights and needs in line with Article 28 of Regulation (EU) 2022/2065.
12. By adopting these guidelines, the Commission declares that it will apply these
guidelines to the cases described therein and thus impose a limit on the exercise of its
discretion whenever applying Article 28(1) of Regulation (EU) 2022/2065. These guidelines may therefore be considered a significant and meaningful benchmark on which the Commission will base itself when applying Article 28(1) of Regulation (EU) 2022/2065 and determining the compliance of providers of online platforms accessible to minors with that provision (14). The Digital Services Coordinators and competent national authorities may also draw inspiration from these guidelines when applying and interpreting Article 28(1) of Regulation (EU) 2022/2065. Nevertheless,
adopting and implementing the measures set out in these guidelines, either partially
or in full, shall not automatically entail compliance with that provision.
13. Any authoritative interpretation of Article 28(1) of Regulation (EU) 2022/2065 may
only be given by the Court of Justice of the European Union, which amongst others
has jurisdiction to give preliminary rulings concerning the validity and interpretation
of EU acts, including Article 28(1) of Regulation (EU) 2022/2065.
14. Throughout the development of the guidelines, the Commission has consulted with stakeholders (15), including with the Board and its working group on protection of
(14) Adopting and implementing any of the measures set out in these guidelines does not entail compliance with the GDPR or any other applicable data protection law. In determining compliance with Article 28(1) of Regulation (EU) 2022/2065, responsible authorities are therefore encouraged to cooperate with data protection authorities.
(15) The Commission has developed the guidelines by conducting thorough desk research, gathering stakeholder feedback through a call for evidence, workshops and targeted public consultation. The Commission also relied on the expertise of the European Centre for Algorithmic Transparency throughout the process. Moreover, the Commission consulted with young people, including Better Internet for Kids youth ambassadors, and organised focus groups with children in seven Member States, with the support of the Safer Internet Centres.
minors. In accordance with Article 28(4) of Regulation (EU) 2022/2065, the
Commission consulted the Board on a draft of these guidelines prior to their adoption
on 2 July 2025.
15. The measures described in Sections 5 to 8 of these guidelines are not exhaustive.
Other measures may also be deemed appropriate and proportionate to ensure a high
level of privacy, safety and security for minors in accordance with Article 28(1) of
Regulation (EU) 2022/2065, such as measures resulting from compliance with other
pieces of Union legislation (
16
) or adherence to national guidance on the protection of
minors or technical standards (
17
). In addition, new measures may be identified in the
future that enable providers of online platforms accessible to minors to better comply
with their obligation to ensure a high level of privacy, safety and security of minors
on their service.
3. STRUCTURE
16. Section 4 of these guidelines sets out the general principles which should govern all
measures that providers of online platforms accessible to minors put in place to ensure
a high level of privacy, safety, and security of minors on their service. Sections 5 to
8 of these guidelines set out the main measures that the Commission considers that
such providers should put in place to ensure such a high level of privacy, safety and
(16) This includes, for example, the Directives and Regulations cited in footnote 13 and the forthcoming guidelines by the European Data Protection Board (EDPB) on the processing of minors’ personal data in accordance with Regulation (EU) 2016/679 (GDPR).
(17) An Coimisiún um Chosaint Sonraí (2021). Fundamentals for a child-oriented approach to data processing. Available: https://www.dataprotection.ie/sites/default/files/uploads/2021-12/Fundamentals%20for%20a%20Child-Oriented%20Approach%20to%20Data%20Processing_FINAL_EN.pdf; Coimisiún na Meán (2024). Online safety code. Available: https://www.cnam.ie/app/uploads/2024/11/Coimisiun-na-Mean-Online-Safety-Code.pdf; IMY (Swedish Authority for Privacy Protection) (2021). The rights of children and young people on digital platforms. Available: https://www.imy.se/en/publications/the-rights-of-children-and-young-people-on-digital-platforms/; Dutch Ministry of the Interior and Kingdom Relations (2022). Code for children's rights. Available: https://codevoorkinderrechten.nl/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf; CNIL (2021). CNIL publishes 8 recommendations to enhance protection of children online. Available: https://www.cnil.fr/en/cnil-publishes-8-recommendations-enhance-protection-children-online; Unabhängiger Beauftragter für Fragen des sexuellen Kindesmissbrauchs (n.d.). Rechtsfragen Digitales. Available: https://beauftragte-missbrauch.de/themen/recht/rechtsfragen-digitales; CEN-CENELEC (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework; OECD (2021). Children in the digital environment – Revised typology of risks. Available: https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
security. These include Risk review (Section 5), Service design (Section 6),
Reporting, user support and tools for guardians (Section 7) and Governance (Section
8).
4. GENERAL PRINCIPLES
17. The present guidelines are based on the following general principles, which are
interrelated and should be considered holistically in all the activities by providers of
online platforms that are in scope of these guidelines. The Commission considers that
any measure that a provider of an online platform accessible to minors puts in place
to comply with Article 28(1) of Regulation (EU) 2022/2065 should adhere to the
following general principles.
a. Proportionality and appropriateness: Article 28(1) of Regulation (EU)
2022/2065 requires any measure taken to comply with that provision to be
appropriate and proportionate to ensure a high level of privacy, safety, and
security of minors. Since different online platforms may pose different types of
risks for minors, it will not always be proportionate or appropriate for all
providers of online platforms to apply all, or only some, of the measures described
in these guidelines. Determining whether a particular measure is proportionate
and appropriate, in particular where it entails an interference with individuals’
fundamental rights, will require a case-by-case review by each provider (i) of the
risks to minors’ privacy, safety and security stemming from its online platform
or parts of it, considering among others the size, reach and type of the service it
provides and its nature, its intended or current use, its specific features and the
user base of the service, (ii) of the impact of the measure on children’s rights and
other rights and freedoms enshrined in the Charter of Fundamental Rights of the
European Union (“the Charter”); and (iii) of the need to base such measures on
the highest available standards and existing good practices, as well as the
perspective and rights of children (see Section 5 on Risk review).
b. Protection of children’s rights: These rights are enshrined in the Charter, which
guarantees the protection of children’s rights in implementing Union law, and
the United Nations Convention on the Rights of the Child (“the UNCRC”), which
all Member States have ratified (18). Children’s rights form an integral part of
human rights, and all those rights are interrelated, interdependent and indivisible.
In line with Article 24 of the Charter, in all actions relating to children, whether
taken by public authorities or private institutions, the best interests of the child
must be a primary consideration. Therefore, to ensure that measures to achieve a
high level of privacy, safety and security for minors on an online platform are
appropriate and proportionate, all children’s rights should be considered and their
best interests taken as a primary consideration. Any discrimination based on any
ground such as sex, race, colour, ethnic or social origin, genetic features,
language, religion or belief, political or any other opinion, membership of a
national minority, property, birth, disability, age or sexual orientation shall be
prohibited, in line with Article 21 of the Charter. Children’s rights include, for instance, their right to protection (19), non-discrimination, inclusion, privacy, access to information and education, freedom of expression as well as participation (20) and to have their views taken into account in all matters that concern them (21).
c. Privacy-, safety- and security-by-design: providers of online platforms accessible to minors should integrate high standards of privacy, safety and security in the design, development and operation of their services (22). By-design concepts aim to harness the influence of providers of online platforms,
(18) These rights are elaborated by the United Nations Committee on the Rights of the Child as regards the digital environment in their General Comment No. 25. Office of the High Commissioner for Human Rights (2021). General Comment No. 25 (2021) on children’s rights in relation to the digital environment. Available: https://www.ohchr.org/en/documents/general-comments-and-recommendations/general-comment-no-25-2021-childrens-rights-relation.
(19) Children shall have the right to such protection and care as is necessary for their well-being (Article 24 of the Charter).
(20) They may express their views freely. Such views shall be taken into consideration on matters which concern them in accordance with their age and maturity (Article 24 of the Charter).
(21) In this regard, the Commission recalls the importance of accessibility, including as regulated in Directive (EU) 2016/2102 of the European Parliament and of the Council of 26 October 2016 on the accessibility of the websites and mobile applications of public sector bodies (“Web Accessibility Directive”), as well as child participation throughout the design, implementation, and evaluation of all safety, security and privacy measures concerning children online.
(22) According to Article 25 GDPR, operators processing minors’ personal data must already implement appropriate organisational and technical measures to protect the rights of data subjects (data protection by design and by default). This obligation is enforced by the competent data protection authorities in line with Article 51 GDPR. See EDPB Guidelines 4/2019 on Article 25 Data Protection by Design and by Default. Available: https://www.edpb.europa.eu/our-work-tools/our-documents/guidelines/guidelines-42019-article-25-data-protection-design-and_en.
designers and policymakers to shape product and service development in ways
that prioritise values that promote human well-being. They refer to embedding
privacy, safety and security protections by default into the design, operation and
management of organisations, as well as in products and services from the
start (23).
d. Age-appropriate design: providers of online platforms accessible to minors
should design their services to align with the developmental, cognitive and
emotional needs of minors, while ensuring their safety, privacy, and security.
Age-appropriate designs are suitable for children considering their rights and
well-being as well as their diversity and specific age or stage of development and
take account of the evolving capacities of children (24).
5. RISK REVIEW
18. The heterogeneous nature of online platforms and diversity of contexts may require
distinct approaches, with certain measures being better suited to some platforms over
others. Where a provider of an online platform accessible to minors is deciding how
to ensure a high level of safety, privacy and security to minors on its platform, and
determining the appropriate and proportionate measures for that purpose, the
Commission considers that that provider should, at a minimum, identify and take into
account:
a. How likely it is that minors will access its service, notably in view of its nature,
purpose, intended use as well as criteria relevant to determine whether the service
is accessible to minors.
b. The actual or potential impact on the privacy, safety and security of minors that the online platform may pose or give rise to, based on the 5Cs typology of online
(23) OECD (2024), Towards Digital Safety by Design for Children. Available: https://doi.org/10.1787/c167b650-en.
(24) This requires prioritising features, functionality, content or models that are compatible with children’s evolving capacities, as well as taking into consideration socio-cultural differences. Age-appropriate design is crucial for the privacy, safety and security of children: e.g. without age-appropriate information about it, children may be unable to understand, use or enjoy privacy or safety features, settings or other tools. CEN-CENELEC (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf; Ages and developmental stages are available, among others, as an Annex to the Dutch Children’s Code. Available: https://codevoorkinderrechten.waag.org/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf
risks to children (Annex). This includes an examination of how different aspects
of the platform may give rise to these risks, their likelihood and severity, as well
as consideration of their positive impact on children’s rights and well-being,
taking into consideration the age and evolving capacities of children. For
example, aspects such as the purpose of the platform, its design, interface, value
proposition, marketing, features, functionalities, number and type of users and
uses (actual and expected) may all be relevant. This should include an indication
of the level of risk for minors on the platform (e.g. low, medium or high), based
on clear criteria, in accordance with existing standards and best practices, for
example for child rights impact assessment as mentioned in paragraph 19.
c. The measures that the provider is already taking to prevent and mitigate these
risks.
d. Any additional measures that are identified in the review as appropriate and
proportionate to ensure a high level of privacy, safety and security for minors on
their service. The measures that providers may need to take should address the
risks to privacy, safety, and security that originate from the experience of minors
with the service, including those risks that originate from the actions of other
users of the service.
e. How measures uphold the general principles of Section 4.
f. Metrics that allow the provider to monitor over time the effectiveness of the measures they have in place to address certain risks.
g. The potential positive and negative effects on children’s or other users’ rights of
any measure that the provider currently has in place and any additional measures,
ensuring that these rights are not disproportionately or unduly restricted and
positive effects can be maximised. Children’s or other users’ rights that may be
adversely affected by some measures include, for example, children’s rights to
participation, privacy, protection of personal data, freedom of expression and
information. This is relevant when determining the proportionality of measures.
19. When conducting this review, providers of online platforms accessible to minors
should take into consideration the best interests of the child as a primary
consideration (25) in line with the Charter and other UNCRC principles (26), as well as other relevant Union guidance on the matter (27). They should include the
). They should include the
perspectives of children by seeking their participation, as well as that of guardians,
representatives of other potentially impacted groups and other relevant experts and
stakeholders.
20. Providers should consider the most up-to-date available information and insight from
scientific and academic sources, including by leveraging other relevant assessments
conducted by the provider. They should adhere to the precautionary principle when
there is reasonable indication that a particular practice, feature or design choice poses
risks to children, taking measures to prevent or mitigate such risks until there is
evidence that its effects are not harmful to children.
21. Providers should carry out the review periodically, and at least on an annual basis or
whenever they make significant changes to the platform’s design (28) or become
aware of other circumstances that affect the platform’s design and operation relevant
for ensuring a high level of privacy, safety and security of minors on their online
platform. Providers should make the risk review available to the relevant supervisory authorities and publish its outcomes, without disclosing sensitive operational or security-related information, at the latest before the following review is performed, as
well as consider submitting it to the review of independent experts or relevant
stakeholders.
(25) Article 3 of the UNCRC; Article 24 of the Charter: The right of the child to have his or her best interests assessed and taken as a primary consideration when different interests are being considered, in order to reach a decision on the issue at stake concerning a child, a group of identified or unidentified children or children in general. Best interests determinations, when necessary, should not be conducted by the companies, but based on competent authorities’ action. LSE Digital Futures for Children (2024), The Best interests of the child in the digital environment. Available: https://www.digital-futures-for-children.net/digitalfutures-assets/digitalfutures-documents/Best-Interests-of-the-Child-FINAL.pdf.
(26) Non-discrimination: Children’s rights apply to any child, without any discrimination, as per Article 21 of the Charter.
(27) The Commission recalls in particular the EDPB Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679.
(28) Examples of significant changes are the introduction of new features affecting user interaction, modifications to recommender systems, account settings, moderation, reporting or other design features that would appreciably change children’s experience on the platform, changes in data collection practices, expansion to new user groups, integration of generative AI tools, or changes related to age assurance measures or their providers.
22. Existing standards and tools to carry out child rights impact assessments can support
providers in carrying out this review. These include, for example, the templates, forms
and other guidance provided by UNICEF (29), the Dutch Ministry of the Interior and Kingdom Relations (BZK) (30), or the European standardisation body CEN-CENELEC (31). The Commission may issue additional guidance or tools to support
providers in carrying out the review, including through specific tools for child rights
impact assessments. Until the publication of this guidance, providers can use existing
tools and best practices for these assessments.
23. For providers of VLOPs and VLOSEs, this risk review can also be carried out as part
of the general assessment of systemic risks under Article 34 of Regulation (EU)
2022/2065, which will complement and go beyond the risk review pursued in
accordance with the present guidelines.
6. SERVICE DESIGN

6.1 Age assurance

6.1.1 Introduction and terminology
24. In recent years, technology has seen fast developments allowing providers of online platforms to assure themselves in more or less accurate, reliable and robust ways of the age of their users. These measures are commonly referred to as “age assurance” (32).
(29) UNICEF (2024). Children’s rights impact assessment: A tool to support the design of AI and digital technology that respects children’s rights. Available: https://www.unicef.org/childrightsandbusiness/workstreams/responsible-technology/D-CRIA; (2021) MO-CRIA: Child Rights Impact Self-Assessment Tool for Mobile Operators. Available: https://www.unicef.org/reports/mo-cria-child-rights-impact-self-assessment-tool-mobile-operators
(30) Dutch Ministry of the Interior and Kingdom Relations (BZK) (2024). Child Rights Impact Assessment (Fillable Form). Available: https://www.nldigitalgovernment.nl/document/childrens-rights-impact-assessment-fill-in-document/.
(31) See in particular chapter 14 of CEN-CENELEC (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework. Available: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf.
(32) European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024). Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338
25. The Commission considers measures restricting access based on age to be an effective
means to ensure a high level of privacy, safety and security for minors on online
platforms. For this purpose, age assurance tools can help providers to enforce access
restrictions for users below a certain age, in order to protect minors from accessing
age-inappropriate content online, such as gambling or pornography, or from being
exposed to other risks such as grooming.
26. Age assurance tools can also help providers to prevent adults from accessing certain platforms that are designed for minors, except when doing so for legitimate parental, educational, or supervisory purposes, thus reducing the risk of adults posing
as minors and/or seeking to harm minors.
27. Finally, age assurance tools can be used to underpin the age-appropriate design of the
service itself, thereby fostering safer and more child-suitable online spaces. In these
instances, the tools can be used to ensure that children only have access to certain
content, features or activities that are appropriate for their consumption, taking into
account their age and evolving capacities.
28. It is important to distinguish between, on the one hand, the age restriction that limits
access to the platform or to parts thereof to users below or above a certain age, and,
on the other hand, the age assurance methods that are used to determine a user’s age.
29. The most common age assurance measures currently available and applied by online
platforms fall into three broad categories: self-declaration, age estimation, and age
verification.
a. Self-declaration consists of methods that rely on the individual to supply their
age or confirm their age range, either by voluntarily providing their date of birth
or age, or by declaring themselves to be above a certain age, typically by clicking
on a button online.
b. Age estimation consists of methods which allow a provider to establish that a user is likely to be of a certain age, to fall within a certain age range, or to be over or under a certain age (33).
(33) Ibid.; CEN-CENELEC (2023). Workshop Agreement 18016 Age Appropriate Digital Services Framework: https://www.cencenelec.eu/media/CEN-CENELEC/CWAs/ICT/cwa18016_2023.pdf.
c. Age verification is a system that relies on physical identifiers or verified sources
of identification that provide a high degree of certainty in determining the age of
a user.
30. The main difference between age estimation and age verification measures is the level
of accuracy. Whereas age verification provides certainty about the age of the user,
age estimation provides an approximation of the user’s age. The accuracy of age
estimation technologies may vary and improve as technology progresses.
6.1.2 Determining whether to put in place access restrictions supported by age
assurance measures
31. Before deciding whether to put in place any access restrictions based on age,
supported by age assurance methods, providers of online platforms accessible to
minors should always conduct an assessment to determine whether such a measure is
appropriate to ensure a high level of privacy, safety and security for minors on their
service and whether it is proportionate, or whether such a high level may be achieved
already by relying on other less far-reaching measures (34). In this regard, the
Commission is of the view that providers should consider access restrictions based on age, supported by age assurance measures, as a complementary tool to measures set
out in other sections of these guidelines. In other words, access restrictions and age
assurance alone cannot be substitutes for measures recommended elsewhere in these
guidelines.
32. Such an assessment should ensure that any restriction to the exercise of fundamental
rights and freedoms of the recipients, especially minors, is proportionate.
Consequently, the Commission considers that providers of online platforms should make the result of such an assessment publicly available on the online interface of their service, both where the assessment concludes that no access restriction supported by age assurance is required and where it concludes that such a restriction would be an appropriate and proportionate measure.
33. The Commission notes that a lower accuracy of age estimation solutions does not
automatically equate to a lower impact on the fundamental rights and freedoms of
recipients, as less accurate solutions may process more personal data than more
(34) The review of risks and child rights impact assessment tools outlined in Section 5 on Risk review can help providers of online platforms to conduct this assessment.
accurate ones. They may also prevent some children from accessing online platforms
that they may otherwise be able to access due to the lower level of accuracy.
Therefore, when considering age estimation methods that require the processing of
personal data, providers of online platforms accessible to minors should ensure that
data protection principles, especially data minimisation, are properly implemented
and remain robust over time and take into account the European Data Protection
Board (EDPB) statement on Age Assurance (35).
34. The Commission is of the view that, in order to ensure a high level of privacy, safety
and security of minors on their services, providers of online platforms accessible to
minors that consider access restrictions based on age assurance methods necessary
and proportionate should provide information about any age assurance solutions they
identified and their adequacy and effectiveness. They should also provide an overview
of the performance metrics used to measure this, such as false positive and false
negative rates, and accuracy and recall rates.
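To make the performance metrics mentioned above concrete, the following minimal sketch (in Python, with purely hypothetical counts and illustrative names that are not part of the guidelines) shows how false positive and false negative rates, accuracy and recall can be derived from the outcomes of an age assurance check, treating “user is a minor” as the positive class.

```python
from dataclasses import dataclass

@dataclass
class AgeCheckOutcomes:
    """Confusion-matrix counts for an age assurance check.

    Positive class = "user is a minor". A false negative is a minor
    wrongly treated as an adult (the most safety-critical error);
    a false positive is an adult wrongly flagged as a minor.
    """
    true_positives: int   # minors correctly flagged as minors
    false_positives: int  # adults wrongly flagged as minors
    true_negatives: int   # adults correctly recognised as adults
    false_negatives: int  # minors wrongly let through as adults

def performance_metrics(o: AgeCheckOutcomes) -> dict[str, float]:
    total = o.true_positives + o.false_positives + o.true_negatives + o.false_negatives
    return {
        "false_positive_rate": o.false_positives / (o.false_positives + o.true_negatives),
        "false_negative_rate": o.false_negatives / (o.false_negatives + o.true_positives),
        "accuracy": (o.true_positives + o.true_negatives) / total,
        "recall": o.true_positives / (o.true_positives + o.false_negatives),
    }

# Hypothetical counts, for illustration only.
print(performance_metrics(AgeCheckOutcomes(
    true_positives=950, false_positives=40, true_negatives=960, false_negatives=50)))
```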
35. Participation of children in the design, implementation, and evaluation of age
restrictions and age assurance methods should be foreseen.
36. Online platforms accessible to minors might have only some content, sections, or
functions that pose a risk to minors or may have parts of their platform where the risk
can be mitigated by other measures and/or parts where it cannot. In these cases,
instead of age-restricting the service as a whole, providers of such online platforms
should assess which content, sections or functions on their platform carry risks for
minors and implement access restrictions supported by age assurance methods to
reduce these risks for minors in proportionate and appropriate ways. For example,
parts of social media services with content, sections or functions that may pose a risk
to minors, such as adult-restricted sections of a social media service, or sections with adult-restricted commercial communications or adult-restricted product placements by influencers, should only be made available to adult users whose age has been verified
accordingly.
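By way of illustration only, the sketch below (Python; all section names, age thresholds and method labels are hypothetical assumptions, not drawn from the guidelines) shows how a provider might record, per section of its platform, the access restriction and age assurance method selected following the risk review, so that only the parts carrying risks for minors are age-gated rather than the service as a whole.

```python
from dataclasses import dataclass
from enum import Enum

class AgeAssurance(Enum):
    NONE = "none"                     # no age gate for this section
    AGE_ESTIMATION = "estimation"
    AGE_VERIFICATION = "verification"

@dataclass(frozen=True)
class SectionPolicy:
    minimum_age: int | None           # None = no age-based restriction
    method: AgeAssurance

# Hypothetical outcome of a risk review: only some sections are age-gated.
SECTION_POLICIES: dict[str, SectionPolicy] = {
    "general_feed":        SectionPolicy(minimum_age=None, method=AgeAssurance.NONE),
    "live_chat":           SectionPolicy(minimum_age=16,   method=AgeAssurance.AGE_ESTIMATION),
    "adult_content":       SectionPolicy(minimum_age=18,   method=AgeAssurance.AGE_VERIFICATION),
    "gambling_promotions": SectionPolicy(minimum_age=18,   method=AgeAssurance.AGE_VERIFICATION),
}

def may_access(section: str, assured_age: int | None) -> bool:
    """Grant access only if the user's assured age meets the section's minimum age."""
    policy = SECTION_POLICIES[section]
    if policy.minimum_age is None:
        return True
    return assured_age is not None and assured_age >= policy.minimum_age

print(may_access("adult_content", assured_age=None))  # False: no proof of age presented
print(may_access("general_feed", assured_age=None))   # True: section is not age-gated
```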
(35) See EDPB statement 1/2025 on Age Assurance. Available: https://www.edpb.europa.eu/system/files/2025-04/edpb_statement_20250211ageassurance_v1-2_en.pdf
6.1.3 Determining which age assurance methods to use
6.1.3.1 Age verification
37. In the following circumstances, in view of the fact that the protection of minors constitutes an important policy objective of the Union to which Regulation (EU) 2022/2065 gives expression, as reflected in its Recital 71, the Commission considers the use of access restrictions supported by age verification methods an
appropriate and proportionate measure to ensure a high level of privacy, safety, and
security of minors:
a. Where certain products or services pose a high risk to minors and those risks cannot be mitigated by less restrictive measures, considering applicable Union and national laws, such as by way of example:
i. the sale of alcohol, tobacco or nicotine-related products, drugs,
ii. access to any type of pornographic content,
iii. access to gambling content.
b. Where, due to identified risks to minors, the terms and conditions or any other
contractual obligations of the service require a user to be 18 years or older to
access the service even if there is no formal age requirement established by law.
c. Any other circumstances in which the provider of an online platform accessible
to minors has identified risks to minors' privacy, safety, or security, including
content, conduct and consumer risks as well as contact risks (e.g., arising from
features such as live chat, image/video sharing, anonymous messaging), where
these risks cannot be mitigated by other less intrusive measures as effectively as
by access restrictions supported by age verification (36).
d. Where Union or national law, in compliance with Union law, prescribes a
minimum age to access certain products or services offered and/or displayed in
(36) These risks can be identified via the review of risks set out in Section 5.
any way on an online platform, including specifically defined categories of
online social media services (37).
38. Age estimation methods can complement age verification technologies and can be used in addition to the former, or as a temporary alternative, in particular in cases where verification measures that respect the criteria of effectiveness of age assurance solutions outlined in Section 6.1.4, with particular emphasis on protecting users’ right to privacy and data protection as well as accuracy, are not yet readily available. This transitory period should not extend beyond the first review of these guidelines (38). For example, platforms offering adult-restricted content may use ex ante age estimation methods if they can prove that such methods are comparable to those of age verification, in respect of the criteria set out in Section 6.1.4, in the absence of effective age verification measures (39). The Commission may in due course
supplement the present guidelines with a technical analysis on the main existing
methods of age estimation that are currently available in view of the criteria outlined
in Section 6.1.4.
6.1.3.2 Age verification technologies
39. Age verification should be treated as a separate, distinct process that is not connected
with other data collection activities exercised by online platforms. Age verification
should not entitle providers of online platforms to store personal data beyond
information about the user’s age group.
40. As further elaborated under Section 6.1.4, any age assurance method should be robust,
thus not easily circumventable, to be considered appropriate and proportionate. A
(37) In this context the Commission recalls the obligations on Member States stipulated by Directive (EU) 2015/1535 of the European Parliament and of the Council of 9 September 2015 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services (OJ L 241, 17.9.2015, p. 1) and the relevant procedures for draft technical regulations established therein.
(38) An overview of different methods of age estimation is available in European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024). Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338
(39) The Commission is currently testing an EU age verification solution to facilitate age verification to the standard required in these guidelines, before the EU Digital Identity Wallet becomes available. Other solutions, compatible with the standard set out in these guidelines, may be available commercially, or in individual Member States but not in others. Providers of online platforms that prove this circumstance should in any event start testing and using age verification methods that respect the criteria of Section 6.1.4 as soon as this becomes available. This transitory period may be adjusted in light of the roll-out of the EU age verification solution.
method that is easy for minors to circumvent will not be considered an effective age
assurance measure.
41. Methods that rely on verified and trusted government-issued IDs, without providing the platform with additional personal data, may constitute an effective age verification method, in so far as they are based on anonymised age tokens (40). Such tokens should be issued after reliable verification of the person’s age, and they should be issued by an independent third party rather than the provider of the online platform, especially when it offers access to adult content. The Commission considers that cryptographic protocols such as key rotation or zero-knowledge proofs (41) constitute a suitable basis for providing age assurance without transmitting personal data.
42. Member States are currently in the process of providing each of their citizens, residents and businesses with an EU Digital Identity Wallet (42). The upcoming EU Digital Identity Wallets will provide safe, reliable, and private means of electronic identification
within the Union. Once they are deployed, they may be used to share only specific
information with a service, such as that a person is over a specified age.
The EU Digital Identity Wallet
Once implemented, the EU Digital Identity Wallets will provide safe, reliable, and private means of
electronic identification for everyone in the Union. Every Member State is required to provide at
least one wallet to all its citizens, residents, and businesses, which should allow them to prove
who they are, and to safely store, share and sign important digital documents by the end of 2026.
All EU Digital Identity Wallets embed the opportunity to receive a token of age, and Member States
can implement services to issue such tokens.
(40) The service provider only needs to know whether the user is over or under an age threshold. This should be implemented by a tokenised approach based on the participation of a third-party provider, in which the service provider only sees the functional result of the age assurance process (e.g. ‘over’ or ‘under’ the age threshold). A third-party provider performs an age check and provides the user with an “age token” that the user can present to the service provider without needing to prove their age again. The age token may contain different user attributes and information about when, where or how the age check was performed. See also EDPB statement 1/2025 on Age Assurance. Available: https://www.edpb.europa.eu/system/files/2025-04/edpb_statement_20250211ageassurance_v1-2_en.pdf
(41) A zero-knowledge proof is a protocol in which one party (the prover) can demonstrate to another party (the verifier) that some given statement is true, without conveying to the verifier any information beyond the mere fact of that statement’s truth.
(42) As provided for under Section 1 of Chapter II of Regulation (EU) No 910/2014, as amended by Regulation (EU) 2024/1183.
43. To facilitate age verification before the EU Digital Identity Wallets become available,
the Commission is currently testing an EU age verification solution as a standalone
age verification measure that respects the criteria of effectiveness of age assurance
solutions outlined in Section 6.1.4. Once finalised, the EU age verification solution
will provide a compliance example and a reference standard for a device-based
method of age verification. Providers of online platforms that are expected to use age
verification solutions for their services are therefore encouraged to participate in
available testing of early versions of the EU age verification solution, which may
inform those providers as to the best means of ensuring compliance with Article 28
of Regulation (EU) 2022/2065.
44. Implementation of the reference standard (43) set by the EU age verification solution
can be offered through apps published by public or private entities or integrated in the
upcoming EU Digital Identity Wallets. Implementation of this standard will constitute
an age verification technology that is privacy-preserving, data-minimising, non-
traceable and interoperable, in compliance with the criteria of effectiveness of age
assurance solutions outlined in Section 6.1.4.
EU age verification solution
The EU age verification solution, including an app, will be an easy-to-use age verification method
that can be used to prove that a user is 18 or older (18+). The solution will bridge the gap until the
EU Digital Identity Wallet is available. This solid, privacy-preserving and data-minimising solution will aim to set a standard in terms of privacy and user-friendliness.
The EU age verification solution provides a compliance benchmark for the accuracy of an age
assurance solution while minimising the impact on the rights and freedoms of the recipients.
Users will be able to easily activate the app and receive the proof in several different ways. The
proof only confirms whether the user is 18 years or older. It does not give the precise age, nor does it
include any other information about the user. The user can present the 18+ proof to the online
platform in a privacy-preserving way without data flows to the proof provider. In addition,
mechanisms will be put in place to prevent tracking across online platforms. The use of the app is
simple. When requesting access to adult online content, the user presents the 18+ proof via the app
to the online platform. Following verification of its validity, the online platform grants the user access.
The user’s identity and actions are shielded from disclosure throughout the whole process. The
trusted proof provider is not informed about which online services the user seeks to access with the
18+ proof. Likewise, 18+ online service providers do not receive the identity of the user requesting
access, only a proof that the user is 18 or older.
The EU age verification solution will also be technically capable of providing other attributes, such
as liveness tests. In countries where valid methods for attestations of ages below 18 years are
supported, the EU age verification solution can also provide for age verification below the age of 18.
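Purely for illustration, the following minimal Python sketch shows how an online platform could verify the validity of such an 18+ proof: it checks the trusted proof provider's signature, an expiry time and a single-use nonce, and learns only the boolean "over 18" attribute, never the user's identity. It uses the widely available 'cryptography' package; the message format, field names and key handling are assumptions made for this example and do not reflect the actual EU age verification specification.

import json, time, secrets
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Trusted proof provider (issuer) side, included only to make the example self-contained.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()          # published out of band

def issue_over18_proof() -> dict:
    # The proof states only "over 18", with an expiry and a one-time nonce; no identity data.
    claim = {"over18": True, "exp": int(time.time()) + 600, "nonce": secrets.token_hex(16)}
    payload = json.dumps(claim, sort_keys=True).encode()
    return {"claim": claim, "signature": issuer_key.sign(payload).hex()}

# Online platform side: verify validity and learn nothing beyond the boolean.
seen_nonces = set()                                   # prevents replay of the same proof

def platform_accepts(proof: dict) -> bool:
    payload = json.dumps(proof["claim"], sort_keys=True).encode()
    try:
        issuer_public_key.verify(bytes.fromhex(proof["signature"]), payload)
    except InvalidSignature:
        return False                                  # not issued by the trusted provider
    claim = proof["claim"]
    if claim.get("exp", 0) < time.time():
        return False                                  # proof has expired
    if claim["nonce"] in seen_nonces:
        return False                                  # proof already used once
    seen_nonces.add(claim["nonce"])
    return claim.get("over18") is True                # the only attribute the platform learns

print(platform_accepts(issue_over18_proof()))         # expected output: True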
(43) The EU reference standard is available at https://ageverification.dev
45. Providers of online platforms accessible to minors may use other age verification
methods to ensure a high level of privacy, safety, and security of minors, provided
that they are compatible with the EU reference standard (as described in paragraphs
43 and 44 above) and meet the criteria outlined in Section 6.1.4. The EU age
verification solution is an example of a method meeting those criteria.
46. To ensure compliance with the principles of data minimisation, purpose limitation,
and user trust, providers of online platforms are encouraged to adopt double-blind age verification methods. A double-blind method ensures that (i) the online platform does not receive additional means to identify the user and, instead, only receives information allowing it to confirm whether the user meets the required age threshold, and that (ii) the age verification provider does not obtain knowledge of the services for which the proof of age is used. Such methods may rely on local device processing, anonymised cryptographic tokens or zero-knowledge proofs (44).
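By way of illustration only, the minimal Python sketch below shows the data-minimisation idea behind a device-based, double-blind approach: the age attribute stays on the user's device, the online platform receives only a yes/no answer for the threshold it needs, and no call is made back to the party that established the user's age, which therefore cannot learn which service asked. The class and field names are assumptions made for this example; a real deployment would additionally need a cryptographic attestation, such as the signed token or zero-knowledge proof mentioned above, so that the platform can trust the answer.

from dataclasses import dataclass
from datetime import date

@dataclass
class OnDeviceAgeCredential:
    # Held only on the user's device; never transmitted to the platform.
    date_of_birth: date

    def is_over(self, threshold_years: int, on: date | None = None) -> bool:
        # Compute the age check locally and disclose only the boolean result.
        today = on or date.today()
        had_birthday = (today.month, today.day) >= (self.date_of_birth.month, self.date_of_birth.day)
        age = today.year - self.date_of_birth.year - (0 if had_birthday else 1)
        return age >= threshold_years

# The platform asks the device for a single boolean; it never receives the birth date,
# and the credential issuer is not contacted, so it cannot learn which service asked.
credential = OnDeviceAgeCredential(date_of_birth=date(2012, 5, 1))
print(credential.is_over(18, on=date(2025, 7, 14)))   # False: access to 18+ content is refused
print(credential.is_over(13, on=date(2025, 7, 14)))   # True: a 13+ service could be used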
6.1.3.3 Age estimation
47. The Commission considers the use of age estimation methods, when provided by an
independent third party or through systems appropriately and independently audited, notably for security and data protection compliance, as well as when done ex ante if
necessary to ensure the effectiveness of the measure, to be an appropriate and
proportionate measure to ensure a high level of privacy, safety, and security of minors
in the following circumstances:
a.
Where, due to identified risks to minors' privacy, safety and security, the online platform service's terms and conditions or similar contractual obligations of the service require a user to be above a required minimum age that is lower than 18 to access the service, based on the provider's assessment of risks for minors on the platform (45) (46).
(44) Such methods are strongly aligned with the EDPB's call in paragraph 34 of its Statement 1/2025 on Age Assurance for solutions that prevent linking and profiling. These privacy-preserving approaches are also favoured by academic research as scalable, inclusive, and effective for minimising risks to minors while respecting fundamental rights.
b.
Where the provider of the online platform has identified medium risks to minors on their platform as established in its risk review (see Section 5 on Risk Review) (47) and those risks cannot be mitigated by less restrictive measures. The Commission considers this will be the case where the risk is not high enough to require an access restriction based on age verification, but not low enough for it to be appropriate to have no access restriction at all, or to have an access restriction that is not supported by any age assurance method or is supported only by self-declaration. Self-declaration is not considered to be an appropriate age assurance measure, as further explained below.
Good practice
MegaBetting (48) is an online platform that allows users to bet on the outcome of real-world
events. The provider restricts its service to users above 18 years, in line with national law. To
ensure that its online platform is not accessible to minors, it relies on the EU age verification
solution that only tells the provider whether the user is at least 18 years old. This information is
created by a trusted issuer based on the national eID of the user and is received from an
application on the user’s phone. The provider considers therefore that the system meets the
criteria of being highly effective whilst preserving the privacy of the user.
Poor practice
SadMedia is a social media online platform. The provider of SadMedia decided to restrict its
services to minors who are at least 13 years old. This was based on its assessment of medium
risks that the platform could pose to minors’ privacy, safety and security. SadMedia’s terms and
conditions set out this restriction. To enforce this restriction, the provider of SadMedia relies on
an age estimation model that it developed, and that it claims can predict the age of the user with
a margin of error of ±2 years. As a result of this margin of error, many minors below the indicated
age can access the service and many minors who meet the required age cannot access the
service. SadMedia’s age assurance measure is not highly effective and therefore does not
ensure a high level of privacy, safety and security for minors on its service.
(45) Where age verification is used in these instances, it would be without prejudice to any separate obligations on the provider, e.g. requiring it to assess whether the minor as a consumer was old enough to legally enter into a contract. This depends on the applicable law of the Member State where the minor is resident.
(46) In some cases, it may be possible for the provider to verify that the minor was signed up by their guardians.
(47) These risks can be identified via the review of risks set out in Section 5.
(48) All good and poor practice examples in these guidelines refer to fictitious online platforms.
48. Where the provider of an online platform accessible to minors has determined that
access restrictions supported by age assurance are necessary to achieve a high level
of privacy, safety and security for minors on their service, the Commission considers that the provider should offer on its platform more than one age assurance method, to provide
the user with a choice between methods, provided that any such method meets the
criteria outlined in Section 6.1.4. This will help to avoid the exclusion of users who,
despite being eligible to access an online platform, cannot avail themselves of a
specific age assurance method. In order to increase effectiveness and user-
friendliness, the appropriate age assurance method should be carried out, where
possible at account creation, and the age information then used to contribute to an
age-appropriate experience on the platform, in addition to other protective measures
mentioned in these guidelines. Furthermore, providers of online platforms should
provide a redress mechanism for users to complain about any incorrect age
assessments by the provider (49).
Poor practice
SadMedia uses an age estimation solution as one of a range of measures that aims to contribute
to a high level of privacy, safety and security. When the age estimation system provides a
negative result, indicating that the user is too young to use the service, a pop-up is presented to
the user which states “Disagree with the result? Please try again!” The user is then able to redo
the age estimation test using the same method. In this example, the age assurance measure
would not be considered appropriate or proportionate as no possibility is given to the recipient
to use another age assurance method nor is a way of redress provided to the recipient to
challenge an incorrect assessment.
6.1.4 Assessing the appropriateness and proportionality of any age assurance
method
49. Before considering whether to put in place a specific age verification or estimation
method supporting access restrictions, providers of online platforms accessible to
minors should consider the following features of that method:
a.
Accuracy.
How accurately any given method determines the age of the user.
(49) The provider may wish to integrate this mechanism into their internal complaint-handling system under Article 20. See also Section 7.1 of this document.
The accuracy of an age verification or estimation method should be assessed against appropriate, clear, and publicly available metrics. These metrics are necessary to evaluate the extent to which the method can correctly determine whether a user is above or below a certain age, or a person's age range (50). An illustrative calculation of such metrics is sketched after point (e) below. Providers of online platforms should periodically review whether the technical accuracy of the method used still matches the state of the art.
b.
Reliability.
How reliably a given method works in practice in real-world circumstances.
For a method to be reliable, it should be available continuously at any time, and
work in different real-world circumstances, beyond ideal lab conditions.
Providers of online platforms accessible to minors should assess, before
employing a specific age assurance solution, that any data relied upon as part of
the age assurance process comes from a reliable source. For example, a self-
signed proof of age would not be considered reliable.
c.
Robustness.
How easy it is to circumvent a given method.
A method that is easy for minors to circumvent will not be considered robust enough and will therefore not be considered effective. The ease of circumvention shall be assessed by providers of online platforms accessible to minors on a case-by-case basis, considering the age of the minors to which the specific measures are addressed. Providers of online platforms accessible to minors should also
are addressed. Providers of online platforms accessible to minors should also
assess whether the age assurance method provides safety and security, in line
with the state-of-the-art, to ensure the integrity of the age data being processed.
d.
Non-Intrusiveness.
How intrusive a given method is on users' rights.
Providers of online platforms accessible to minors should periodically assess the
impact the chosen method will have on recipients' rights and freedoms, including
their right to privacy, data protection, and freedom of expression (51). According
to the European Data Protection Board, and in line with Article 28(3) of
(50) Inaccurate age assurance may lead to the exclusion of recipients that would otherwise be eligible to use a service, or may allow ineligible recipients to access the service despite the age assurance measure in place.
(51) Inappropriate age assurance may create undue risks to recipients' rights to data protection and privacy, whereas blanket age assurance could limit access to services beyond what is actually necessary.
Regulation (EU) 2022/2065 (52), a provider should only process the age-related
attributes that are strictly necessary for the specific purpose and age assurance
should not be used to provide additional means for providers to identify, locate,
profile or track natural persons (53). If the method is more intrusive than another
method that provides the same level of assurance and effectiveness, the less
intrusive method should be chosen. This includes an assessment of whether the
method provides full transparency about the process in line with Article 12 of
Regulation (EU) 2016/679 and/or provides information about the user at risk. In
no circumstances can the data processed for the purposes of ascertaining
whether a user is above or below a certain age be stored or used for other
purposes.
e.
Non-discrimination.
Whether and how a given method may discriminate against some users.
Providers of online platforms accessible to minors should make sure that the
chosen method is appropriate and available for all minors, regardless of
disability, language, ethnic, gender, religious and minority backgrounds.
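As a purely illustrative sketch of how such accuracy metrics might be computed and reported, the short Python function below takes a labelled evaluation sample for an access threshold of 13 and derives two simple rates: the share of under-age users wrongly admitted and the share of eligible users wrongly refused. The sample data, metric names and threshold are assumptions made for this example, not prescribed metrics.

def accuracy_metrics(results, threshold_age):
    # results: list of (true_age, passed_check) pairs from a labelled evaluation run.
    under = [passed for age, passed in results if age < threshold_age]
    over = [passed for age, passed in results if age >= threshold_age]
    false_pass_rate = sum(under) / len(under) if under else 0.0              # under-age users wrongly admitted
    false_block_rate = sum(not p for p in over) / len(over) if over else 0.0  # eligible users wrongly refused
    return {"false_pass_rate": false_pass_rate, "false_block_rate": false_block_rate}

# Hypothetical evaluation sample for a 13+ access restriction: with a wide error margin,
# some 11- and 12-year-olds are admitted and some 13- and 14-year-olds are refused.
sample = [(11, True), (12, False), (12, True), (13, True), (13, False), (14, True), (16, True)]
print(accuracy_metrics(sample, threshold_age=13))
# {'false_pass_rate': 0.666..., 'false_block_rate': 0.25}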
50. Where age assurance measures do not meet the criteria set out above, they cannot
be deemed to be appropriate and proportionate.
51. Age assurance solutions which can be easily circumvented should not be considered
as ensuring a high level of privacy, safety and security for minors. Such assessment
should be conducted depending on the impact that the platform may have on the
privacy, safety and security of minors. The storage of a proof of age should also
depend on the risks associated with the relevant platforms. For example, adult-restricted online platforms should not allow the sharing of user account credentials and should thus conduct age assurance each time their service is accessed.
52. The Commission considers that self-declaration (54) does not meet all the requirements above, in particular the requirements for robustness and accuracy.
(52) See Recital 71 of Regulation (EU) 2022/2065, which highlights the need for providers to observe the data minimisation principle provided for in Article 5(1)(c) of Regulation (EU) 2016/679.
(53) See EDPB statement 1/2025 on Age Assurance, points 2.3 and 2.4.
(54) European Commission: Directorate-General for Communications Networks, Content and Technology, Center for Law and Digital Technologies (eLaw), LLM, Raiz Shaffique, M. and van der Hof, S. (2024) Mapping age assurance typologies and requirements – Research report. Available: https://data.europa.eu/doi/10.2759/455338
Therefore, it does not consider self-declaration to be an appropriate age assurance
method to ensure a high level of privacy, safety, and security of minors in accordance
with Article 28(1) of Regulation (EU) 2022/2065.
53. Furthermore, the Commission considers that the fact that a third party is used to carry out age assurance should be explained to minors, as in any case, in an accessible, visible way and in child-friendly language (see Section 8.4 on Transparency). In
addition, it remains the responsibility of the provider to ensure that the method used
by the third party is effective, in line with the considerations set out above. This
includes, for example, where the provider intends to rely on solutions provided by
operating systems or device operators.
6.2 Registration
54. Registration or authentication may influence whether and how minors are able to
access and use a given service in a safe, age-appropriate and rights-preserving way.
The Commission is of the view that, when it has been determined that age assurance
is necessary in order to provide a high level of privacy, safety and security, as well as
to provide an age-appropriate experience, registration or authentication can be a first
point of use to carry out such a process in a proportionate way.
55. Where registration is not required, and cognizant of the fact that any unregistered user
could be a minor below the minimum age required by the online platform to access
the service and/or age-inappropriate content on the service, the provider of the
relevant online platform accessible to minors should configure the settings of any
unregistered users in a way which guarantees the highest levels of privacy, safety and
security, considering in particular the recommendations set out in Sections 6.3.1 and
6.3.2 and treating the best interests of the child as a primary consideration, including
having regard to contact risks associated with an adult potentially posing as a child.
56. Where registration is required or offered as a possibility to access an online platform
accessible to minors, the Commission considers that the provider of that platform
should:
a.
Explain to users the benefits and risks of registration and, where relevant, why
registration is necessary (see Section 8.4 on Transparency).
b.
Ensure that the registration process is easy for all minors to access and navigate,
according to their evolving capacities, including those with disabilities or
additional accessibility needs, and in a language that they can understand.
c.
Ensure that the registration process includes measures to help users to understand
whether they are old enough to use the service. In line with point d. below, these
measures should only be presented to users after an age assurance method,
including self-declaration, has been applied.
d.
Avoid encouraging or enticing users who are below the minimum age required
by the online platform accessible to minors to create accounts or to access the
service and take measures to reduce the risk of this happening (55).
e.
Ensure that it is easy for minors to log out and to have their account deleted at
their request.
f.
Use the registration process as one opportunity to carry out age assurance if
necessary, in view of recommendations in Sections 5 and 6.1 (56), as well as
highlight the safety features of the platform or service, the rules of conduct along
with their respective consequences for violating terms, any identified risks to a
minor’s privacy, safety or security and resources available to support users.
g.
Ensure that the registration process does not encourage or entice children to
make available or share on their profile more information than necessary for the
functioning of the service, and that consent from the child’s parent or guardian
is sought where necessary under Union or Member State law.
(55) This is without prejudice to additional requirements stemming from other laws, such as Article 12 of Regulation (EU) 2016/679.
(56) As outlined in Section 6.1, the Commission does not consider self-declaration to be an appropriate age assurance method to ensure a high level of privacy, safety, and security of minors in accordance with Article 28(1) of Regulation (EU) 2022/2065.
6.3 Account settings
6.3.1 Default settings
57. Default settings are an important tool that providers of online platforms accessible to
minors may use to mitigate risks to minors’ privacy, safety and security, such as, for
example, the risk of unwanted contact by individuals seeking to harm minors.
Evidence suggests that users tend not to change their default settings, which means
that the default settings remain for most users and thus become crucial in driving
behaviour (57). The Commission therefore considers that providers of online
platforms accessible to minors should:
a.
Ensure that privacy, safety and security by design principles are consistently
applied to all account settings for minors.
b.
Set accounts for minors to the highest level of privacy, safety and security by
default. This includes designing default settings in such a way as to ensure safe
and age-appropriate settings for minors, taking into account their evolving
capacities. These settings should ensure that by default for all minors, as a
minimum (an illustrative configuration sketch of these defaults is provided after point (g) below):
i.
accounts only allow interactions such as likes, tags, comments, direct
messages, reposts and mentions by accounts they have previously accepted.
ii.
no account can download or take screenshots of contact, location or account
information, or content uploaded or shared by minors to the platform.
iii. only accounts that the minor has previously accepted can see their content,
posts and account information.
iv. no one can see the minor’s activities such as ‘liking’ content or ‘following’
another user.
(57) Willis, L. E. (2014). Why not privacy by default? Berkeley Technology Law Journal, 29(1), 61. Available: https://www.btlj.org/data/articles2015/vol29/29_1/29-berkeley-tech-l-j-0061-0134.pdf; Cho, H., Roh, S., & Park, B. (2019). Of promoting networking and protecting privacy: Effects of defaults and regulatory focus on social media users' preference settings. Computers in Human Behavior, 101, 1-13. Available: https://doi.org/10.1016/j.chb.2019.07.001. Examples of settings that may put minors' privacy, safety or security at risk include, but are not limited to, enabling location sharing, switching to a public profile, allowing other users to view their contact or follower lists, allowing sharing of media files, and hosting or participating in a live stream.
v.
geolocation, microphone, photo access and camera, contact synchronisation
as well as all tracking features that are not strictly necessary are turned off.
vi. the default autoplay of videos and hosting live streams are turned off.
vii. push notifications are turned off by default and are always off during core
sleep hours, adapting the core sleep hours to the age of the minor. When
push notifications are actively enabled by the user, they should only notify
the user about interactions arising from the user’s direct contacts and
content from accounts or channels that the user actively follows or engages
with (for example, push notifications should never be inauthentic and should always mention precisely the user or creator the notification comes from).
viii. features that may contribute to excessive use, such as the number of “likes”
or “reactions”, “streaks”, the “... is typing” function and “read receipts,” are
turned off.
ix. any functionalities that increase users' agency over their interactions are
enabled. This might include, for example, information or friction that slows
down content display, posting and user interaction, giving users an
opportunity to think before they decide if they want to see more content, or
to think before they post.
x.
recommendations of other accounts are turned off.
xi. filters that can be associated with negative effects on body image, self-esteem
and mental health are turned off.
c.
Consider whether, depending on minors’ ages and evolving capacities and the
outcome of a provider’s risk review, it is necessary to go beyond the minimum
standard for default settings set out in this Section 6.3.1, and design and
implement default settings that are more restrictive. For example, by designing
default settings for younger minors where no other user is allowed to engage in
certain types of interactions.
d.
Regularly test and update default settings, ensuring that they remain effective
after all updates, and against emerging online risks and trends, including any
risks to minors’ privacy, safety and security identified by the provider in the
course of their review of risks (see Section 5 on Risk review).
e.
Ensure that minors are not in any way encouraged or enticed to change their
settings to lower levels of privacy, safety and security, and that any options to
change default settings are presented in a neutral way.
f.
Ensure that minors are provided with incremental degrees of control over their
settings, according to their age, evolving capacities and needs, to support their
growing autonomy and provide them with more agency (58).
g.
Ensure that settings are explained to minors in a child-friendly and accessible
way (see Section 6.4 on Online interface and other tools).
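Purely as an illustration of how the minimum default settings listed in point (b) above could be captured in a single, testable configuration object, the sketch below uses a Python dataclass. The field names, values and structure are assumptions made for this example, not a prescribed implementation.

from dataclasses import dataclass

@dataclass(frozen=True)
class MinorDefaultSettings:
    # Highest level of privacy, safety and security by default (paragraph 57, point (b)).
    interactions_limited_to_accepted_accounts: bool = True           # points (i) and (iii)
    downloads_and_screenshots_of_minor_content_blocked: bool = True  # point (ii)
    activity_visibility_hidden: bool = True                          # point (iv)
    geolocation_enabled: bool = False                                # point (v)
    microphone_enabled: bool = False                                 # point (v)
    camera_enabled: bool = False                                     # point (v)
    contact_sync_enabled: bool = False                               # point (v)
    video_autoplay_enabled: bool = False                             # point (vi)
    live_stream_hosting_enabled: bool = False                        # point (vi)
    push_notifications_enabled: bool = False                         # point (vii)
    quiet_hours: tuple = ("22:00", "07:00")                          # point (vii); illustrative window, adapt to age
    engagement_counters_visible: bool = False                        # point (viii): likes, streaks, read receipts
    interaction_friction_enabled: bool = True                        # point (ix)
    account_recommendations_enabled: bool = False                    # point (x)
    appearance_altering_filters_enabled: bool = False                # point (xi)

# Any relaxation of these defaults should be an explicit, informed choice by the minor
# (see paragraph 58 below), never the starting point.
print(MinorDefaultSettings())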
58. Where minors change their default settings or opt into features that put their privacy,
safety or security at risk, the Commission considers that the provider of the online
platform should:
a.
Empower minors with the ability to choose between temporarily changing their
default settings, for example for a period of time or for the current session, and permanently changing their default settings.
b.
Enable easy return to default settings, such as a one-click reset or a history-based undo feature for settings that have been changed (see the illustrative sketch after point (f) below).
c.
Present warning signals at the point at which the minor changes their settings,
clearly explaining the potential consequences of their changes.
d.
Periodically provide reminders to minors about the potential consequences of
their change and periodically provide them with the opportunity to return to their
default settings.
(58) Minors experience different developmental stages and have different levels of maturity and understanding at different ages. This is recognised, among others, in the UN Committee on the Rights of the Child General Comment No. 25 on children's rights in relation to the digital environment 2021, para. 19-21. A practical table on ages and developmental stages is available, among others, as Annex to the Dutch Children's Code. Available at: https://codevoorkinderrechten.nl/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf.
e.
Automatically turn off geolocation, microphone and camera as well as not
strictly necessary tracking features after the session ends, if a minor turns them
on.
f.
When geolocation, microphone and camera are switched on, make this obvious
to minors throughout the period during which they are switched on.
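As an illustration of point (b) of the list above, the minimal Python sketch below keeps a history of settings changes so that the most recent change can be undone and the defaults restored in a single call. The class and method names are assumptions made for this example.

class MinorSettingsManager:
    # Tracks changes to a minor's settings and supports undo and a one-click reset.

    def __init__(self, defaults):
        self._defaults = dict(defaults)
        self._current = dict(defaults)
        self._history = []                      # stack of previous states

    def change(self, key, value):
        # Apply a change, keeping the previous state so that it can be undone.
        self._history.append(dict(self._current))
        self._current[key] = value

    def undo_last_change(self):
        # History-based undo of the most recent change.
        if self._history:
            self._current = self._history.pop()

    def reset_to_defaults(self):
        # One-click return to the default (highest protection) settings.
        self._history.append(dict(self._current))
        self._current = dict(self._defaults)

    @property
    def current(self):
        return dict(self._current)

manager = MinorSettingsManager({"geolocation_enabled": False, "public_profile": False})
manager.change("geolocation_enabled", True)     # the minor opts in temporarily
manager.undo_last_change()                      # or: manager.reset_to_defaults()
print(manager.current)                          # {'geolocation_enabled': False, 'public_profile': False}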
6.3.2 Availability of settings, features and functionalities
59. The Commission considers that providers of online platforms accessible to minors
should:
a.
Consider whether some settings, features or functionalities should be removed
from minors’ accounts altogether and/or whether any of the default settings set
out in the previous Section 6.3.1 should be made irreversible or unchangeable
for all minors or for minors of certain ages, taking into account their age and
evolving capacities, and remove and/or make irreversible such settings on the
basis of that assessment. When making this assessment, providers of online
platforms accessible to minors should assess the manner in which those settings
and functionalities may impact the high level of privacy, safety and security of
minors on their platform.
b.
Ensure that irrespective of the account settings chosen by minors:
i.
minors can never be easily found or contacted by accounts they have not
previously accepted as contacts.
ii.
minors’ personal contact details, including email or telephone number, are
never disclosed to other users unless explicitly permitted by the minor.
iii. minors’ accounts are never included in contact suggestions to adults. Adult
accounts or accounts likely to be fake minor accounts are not recommended
to minors.
iv. accounts that the minor has not previously accepted as contacts can never
see their profile information, biography, activities and history such as
‘likes’ and ‘views’, lists of friends and followers and accounts that the
minor follows, and that such information always becomes unavailable if the
account is blocked or otherwise no longer accepted.
c.
Ensure that minors are provided with the possibility to restrict the visibility of
their profile photo and of individual pieces of content that they publish, as well
as the possibility to restrict the visibility of their content generally.
d.
Ensure that minors are provided with the possibility to accept or reject any
tagging by other users, whether in content, comments or otherwise.
e.
Implement measures to prevent minors from inadvertently accepting unwanted
contacts by, for example, requiring users to include a message when they request
to connect with a minor.
6.4 Online interface design and other tools
60. The Commission considers that measures allowing minors to take control of their
online experiences are an effective means of ensuring a high level of privacy, safety
and security of minors for the purposes of Article 28(1) of Regulation (EU)
2022/2065.
61. Without prejudice to the obligations of providers of VLOPs and VLOSEs under
Section 5 of Chapter III of Regulation (EU) 2022/2065 and independently of the
providers of online platforms’ obligations as regards the design, organisation and
operation of their online interfaces deriving from Article 25 of that Regulation, the
Commission considers that providers of online platforms accessible to minors should
adopt and implement functionalities allowing minors to decide how to engage with
their services. These functionalities should provide the right balance between child
agency and an adequate level of privacy, safety and security. This should include, for
example:
a.
Ensuring that online interface design offers an age-appropriate experience for
minors.
b.
Ensuring that minors are not exposed to persuasive design features that are aimed
predominantly at engagement and that may lead to extensive use or overuse of
the platform or problematic or compulsive behavioural habits. This includes the
possibility to scroll indefinitely, the superfluous requirement to perform a
specific action to receive updated information on an application, automatic
triggering of video content, notifications artificially timed to regain minors’
attention, notifications that are artificial, including those that pretend to be
another user or social notifications about content that the user has never engaged
with, signs communicating scarcity and/or urgency (59), and the creation of
virtual rewards for performing (repeated) actions on the platform.
c.
Introducing customisable, visible, easy-to-access and easy-to-use, child-friendly and
effective time management tools to increase minors’ awareness of their time
spent on online platforms. To be effective, these tools should deter minors from
spending more time on the platform. These could also include nudges that
favour safer options. There should also be systematic implementation of active
notifications informing minors of the time spent online.
d.
Ensuring that any tools, features, functionalities, settings, prompts, options and
reporting, feedback and complaints mechanisms are child-friendly, age-
appropriate, easy to find, access, understand and use for all minors, including
those with disabilities and/or additional accessibility needs, are engaging, and do
not require changing devices to complete any action involved.
e.
Ensuring that, if AI features, such as AI chatbots and filters, are integrated into
an online platform accessible to minors, they are not activated automatically and
minors are not encouraged or enticed to use them, and that such systems are in
line with their evolving capacities and designed in a way that is safe for them. In
this regard, the Commission considers that AI features should only be made
available on online platforms accessible to minors after an assessment of the
risks those AI features may pose to minors’ privacy, safety and security, and that
they should be easy to turn off, and it should be clear to the minor when they are not turned off.
f.
Ensuring that technical measures are implemented to warn (60) minors that interactions with an AI feature are different from human interactions and that these
(59) The Commission recalls that Directive 2005/29/EC prohibits unfair commercial practices, including in its Annex I, point 7, falsely stating that a product will only be available for a very limited time, or that it will only be available on particular terms for a very limited time, in order to elicit an immediate decision and deprive consumers of sufficient opportunity or time to make an informed choice.
(60) The Commission recalls the obligation for providers of AI systems that are intended to interact directly with natural persons to ensure these are designed and developed in such a way that natural persons concerned are informed they are interacting with an AI system according to Article 50(1) of Regulation (EU) 2024/1689 ("the AI Act"). Any measure taken upon this recommendation should be understood in accordance with, and without prejudice to, the measures taken to comply with Article 50(1) of the AI Act, including its own supervisory and enforcement regime.
features can provide information that is factually inaccurate and can be
misleading. This warning should be easily visible, drafted in child-friendly
language, and directly accessible from the interface and throughout the entire
duration of the minor's interaction with the AI feature. For example, AI chatbots should not be displayed prominently, nor should they be part of suggested contacts or grouped with users the minor is connected to. Providers of online platforms should ensure that minors and their guardians have options to opt out of the use of AI chatbots and are not nudged towards using those features (61). Such AI features cannot be used to influence or nudge minors towards commercial content or purchases.
Poor practice
SadFriends is a social media platform where minors’ profiles are subject to the same settings
as adults. Upon sign-up, minors’ account information and content are visible to other users on
and off the platform. Minors can be contacted by other users who have not been accepted as
contacts by the minor. These other users can send them messages and comment on their
content. When minors turn on their geolocation to share their location with their friends, their
location becomes visible to all accounts they are friends with and remains activated after they
close the session, which means that other users can see where they are until the minor
remembers to turn off their geolocation.
As a result, malicious actors start targeting minors on SadFriends. Unknown adults reach out to
minors and engage with them, building an emotional connection and gaining their trust. Minors
are groomed and coerced into creating and sharing child sexual abuse images with their
abusers.
6.5 Recommender systems and search features
62. Recommender systems (62) determine the manner in which information is prioritised,
optimised and displayed to minors. As a result, such systems have an important
impact on whether and to what extent minors encounter certain types of content,
contacts or conduct online. Recommender systems may pose and exacerbate risks to
(61) The Commission recalls the Guidelines on prohibited artificial intelligence practices established by Regulation (EU) 2024/1689 (AI Act).
(62) For the purpose of this Section, the Commission recalls that, in accordance with Article 3(s) of Regulation (EU) 2022/2065, recommender systems include systems deployed for content recommendations, product recommendations, advertisement recommendations, contact recommendations, search autocomplete and results.
minors’ privacy, safety and security online by, for example, amplifying content that
can have a negative impact on minors' safety and security (63).
63. The Commission recalls the obligations for all providers of all categories of online
platforms concerning recommender system transparency under Article 27 of
Regulation (EU) 2022/2065 and the additional requirements for providers of VLOPs
and VLOSEs under Articles 34(1), 35(1), and 38 of Regulation (EU) 2022/2065 in this respect (64).
64. In order to ensure a high level of privacy, safety and security specifically for minors
as required under Article 28(1) of Regulation (EU) 2022/2065, the Commission
considers that providers of online platforms accessible to minors should put in place
the following measures:
6.5.1 Testing and adaptation of the design and functioning of recommender systems for minors
65. Providers of online platforms accessible to minors that use recommender systems in
the provision of their service should:
a.
Regularly test and adapt their recommender systems to enhance the privacy,
safety and security of minors and in accordance with the risk review provided
under Section 5, which includes a consideration of children’s broader rights.
Such testing and adaptation should be conducted by consulting minors,
guardians and independent experts.
(63) Munn, L. (2020). Angry by design: Toxic communication and technical architectures. Humanities and
Social Sciences Communications, 7(53). Available: https://doi.org/10.1057/s41599-020-00550-7; Milli,
S. et al. (2025). Engagement, user satisfaction, and the amplification of divisive content on social media.
PNAS Nexus, 4(3) pgaf062. Available: https://doi.org/10.1093/pnasnexus/pgaf062; Piccardi, T. et al.
(2024). Social Media Algorithms Can Shape Affective Polarization via Exposure to Antidemocratic
Attitudes and Partisan Animosity. Available: 10.48550/arXiv.2411.14652; Harriger, J. A., Evans, J. L.,
Thompson, J. K., & Tylka, T. L. (2022). The dangers of the rabbit hole: Reflections on social media as
a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms. Body
Image, 41, 292-297. Available: https://doi.org/10.1016/j.bodyim.2022.03.007; Amnesty International.
(2023). Driven into darkness: How TikTok’s ‘For You’ feed encourages self-harm and suicidal ideation.
Available: https://www.amnesty.org/en/documents/pol40/7350/2023/en/; Hilbert, M., Ahmed, S., Cho,
J., & Chen, Y. (2024). #BigTech @Minors: Social media algorithms quickly personalize minors’
content, lacking equally quick protection. Available: http://dx.doi.org/10.2139/ssrn.4674573. Sala, A.,
Porcaro, L., Gómez, E. (2024). Social Media Use and adolescents' mental health and well-being: An
umbrella review, Computers in Human Behavior Reports, Volume 14, 100404, ISSN 2451-9588.
Available: https://doi.org/10.1016/j.chbr.2024.100404
(64) The Commission also recalls that other Union or national law may impact the design and functioning of recommender systems, with a view to ensuring protection of legal interests within their remits, which contribute to a high level of privacy, safety and protection of fundamental rights online.
b.
Take into account specific needs, characteristics, disabilities and additional
accessibility needs of minors, also with due consideration to their age group,
when defining the objectives, parameters and evaluation strategies of
recommender systems. Parameters and metrics related to accuracy, diversity,
inclusivity and fairness should be prioritised.
c.
Ensure that recommender systems do not rely on the collection of any
behavioural data that captures the minor's activities off the platform.
d.
Ensure that, if a recommender system relies on the processing of behavioural
data about a minor, the suggestions of specific information to minor recipients
of the service or the prioritisation of that information does not rely on the
processing of behavioural personal data that is so extensive as to capture all or
most of the minor’s activities on the platform, which may give rise to the feeling
that the minor's private life is being continuously monitored.
e.
Ensure that recommender systems rely on ’implicit engagement-based signals’
only after having assessed whether it is in the best interests of the minor, taking
into account the principles of data minimisation and transparency, and provided
that such use is clearly defined and subject to appropriate safeguards as further
defined in the recommendations above.
f.
For the purposes of the present guidelines, ‘implicit engagement-based signals’
shall be understood as referring to signals and data that infer user preferences
from their activities (browsing behaviour on a platform), such as time spent
viewing content and click-through rates.
g.
Prioritise ‘explicit user-provided signals’ to determine the content displayed and
recommended to minors. The selection of such signals should be justified in the
best interests of the minor, taking into account the principles of data
minimisation and transparency, which will help to ensure that they contribute to
a high level of safety and security for minors. For the purposes of the present
guidelines, ‘explicit user-provided signals’ shall be understood as referring to
user feedback and interactions that indicate users’ explicit preferences, both
positive and negative, including the stated and deliberate selection of topics of interest, surveys, reporting (65), and other quality-based signals (an illustrative re-ranking sketch is provided after point (k) below).
h.
Implement measures to prevent minors’ exposure to content recommendations
that could pose a risk to their safety and security, particularly when encountered
repeatedly, such as content promoting unrealistic beauty standards or dieting,
content that glorifies or trivialises mental health issues, such as anxiety or
depression, discriminatory content, radicalisation content and distressing content
depicting violence or encouraging minors to engage in dangerous activities. This
includes content that has been reported or flagged by users, trusted flaggers or
other actors or content moderation tools, and whose lawfulness and adherence to
the platform’s terms and conditions have not yet been verified, in accordance
with the relevant obligations under Regulation (EU) 2022/2065 and with Section
6.7.
i.
Implement measures to ensure that recommender systems do not enable or
facilitate the dissemination of illegal content or the commission of criminal
offences against and by minors.
j.
Ensure that minors’ search results and suggestions for contacts prioritise
accounts whose identity has been verified and contacts who are connected to the
network of the minor, or contacts in the same age range as the minor.
k.
Ensure that search features, including but not limited to text autocomplete on the
search bar and suggested terms and key phrases, do not recommend content that
is illegal and/or qualifies as harmful to the privacy, safety or security of minors,
for instance by blocking search terms that are well-known to trigger content that
is deemed to be harmful to minors’ privacy, safety and/or security, such as
particular words, slang, hashtags or emojis (66). Upon queries related to such
(65) For example, minors' feedback about content, activities, individuals, accounts or groups that make them
feel uncomfortable or that they want to see more or less of should be taken into account in the ranking
of the recommender systems. This includes feedback such as “Show me less/more”, “I don’t want to
see/I am not interested in”, “I don’t want to see content from this account,” “This makes me feel
uncomfortable,” “Hide this,” “I don’t like this,” or “This is not for me.” See also Section 7.1 on user
reporting, feedback and complaints of the present guidelines.
(66) Examples of terms can be found in the Knowledge Package on Combating Drug Sales Online, which
was developed as part of the EU Internet Forum and compiles more than 3 500 terms, emojis and slangs
used by drug traffickers to sell drugs online - see reference in the EU Roadmap to fight against drug
trafficking and organised crime, COM/2023/641 final.
content, providers of online platforms should redirect minors to appropriate
support resources and helplines.
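By way of illustration of points (e) to (h) above, the minimal Python sketch below ranks candidate items for a minor using only explicit user-provided signals (deliberately selected topics of interest and 'show me less' feedback) and excludes reported or flagged items, while making no use of implicit engagement signals such as watch time or click-through rates. All field names and weights are assumptions made for this example, not a prescribed ranking method.

def rank_for_minor(candidates, selected_topics, show_me_less_topics, reported_item_ids):
    # Rank items using explicit user-provided signals only.
    def score(item):
        s = 0
        if item["topic"] in selected_topics:        # stated, deliberate interest
            s += 2
        if item["topic"] in show_me_less_topics:    # explicit negative feedback
            s -= 3
        return s

    eligible = [
        item for item in candidates
        if item["id"] not in reported_item_ids       # reported content is not recommended
        and not item.get("flagged_harmful", False)   # flagged content is not recommended
    ]
    # Note: no watch time, click-through rate or other implicit engagement signal is used.
    return sorted(eligible, key=score, reverse=True)

candidates = [
    {"id": 1, "topic": "football"},
    {"id": 2, "topic": "dieting", "flagged_harmful": True},
    {"id": 3, "topic": "drawing"},
    {"id": 4, "topic": "gaming"},
]
ranking = rank_for_minor(candidates, selected_topics={"drawing", "football"},
                         show_me_less_topics={"gaming"}, reported_item_ids={2})
print([item["id"] for item in ranking])   # [1, 3, 4] (ties keep their original order)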
6.5.2 User control and empowerment
66. Providers of online platforms accessible to minors that use recommender systems in
the provision of their service should adopt the following measures to ensure a high
level of privacy, safety and security of minors:
a.
Provide minors with the opportunity to reset their recommended feeds
completely and permanently.
b.
Within the prioritisation of parameters and metrics related to accuracy, diversity,
inclusivity and fairness, provide information and nudge minors toward searching
for new content after a certain amount of interaction with the recommender
system.
c.
Assess whether, in light of the specific features of the platform and in order to
ensure a high level of privacy, safety, and security of minors on such platform,
they consider it appropriate to ensure that minors can choose an option of their
recommender system that is not based on profiling. This recommendation is
without prejudice to the obligations of providers of VLOPs and VLOSEs under
Article 38 of Regulation (EU) 2022/2065.
d.
Where, as a result of the assessment referred to in the point above or as a
result of the obligations stemming from Article 38 of Regulation (EU)
2022/2065 in relation to the providers of VLOPs and VLOSEs, they have put in
place an option of their recommender system that is not based on profiling, assess
whether this should be provided as a default setting and, if they consider it
appropriate, put in place the necessary safeguards and transparency measures to
inform minors of such option and the potential consequences of turning off such
default setting.
e.
Ensure that relevant reporting and feedback mechanisms set out in Section 7.1
have a swift, direct and lasting impact on the parameters, editing and output of
the recommender systems. This includes permanently removing reported content
and contacts from recommendations (including content reported for hiding and
blocked/reported contacts) and reducing the visibility of similar content and
accounts.
67. In addition to the obligations set out in Article 27(1) of Regulation (EU) 2022/2065,
and for providers of VLOPs and VLOSEs the enhanced due diligence obligations laid
down in Articles 34, 35 and 38 of that Regulation, the Commission considers that
providers of online platforms accessible to minors should:
a.
Ensure that any settings and information provided to minors about their
recommender systems, including but not limited to their Terms and Conditions,
are presented in child-friendly and accessible ways, adapted to the age and
evolving maturity of the child, and in a language they can understand (see
Sections 6.4 on Online interface design and other tools and Section 8.4 on
Transparency for more details).
b.
Meaningfully explain why each specific piece of content was recommended to
them, including information about the parameters used and the user signals
collected for that specific recommendation.
c.
Offer minors, in an accessible way and tailored to child-friendly language and
design, the options to modify or influence the parameters of their recommender
systems by, for example, allowing them to select content categories and activities
they are most or least interested in, including explanations in child-friendly
language. This should be offered during the account creation process and
regularly throughout the minor’s time on the platform. These preferences should
directly influence the recommendations provided by the system, ensuring that
they align more closely with the minor's age and best interests (67).
6.6 Commercial practices
68. Minors are particularly exposed to the persuasive effects of commercial practices and
have a right to be protected against economically exploitative practices (68) by online
(67) See Articles 27(1) and (3) of Regulation (EU) 2022/2065.
(68) UN Committee on the Rights of the Child General Comment No. 25, para 112; UNICEF. (2019). Discussion paper: Digital marketing and children's rights. Available: https://www.unicef.org/childrightsandbusiness/media/256/file/Discussion-Paper-Digital-Marketing.pdf.
platforms. They are confronted with commercial practices by online platforms, facing
diverse, dynamic and personalised persuasive tactics through, for example,
advertisements, product placements, the use of in-app currencies, influencer marketing, sponsorship or AI-enhanced nudging (69). This can have a negative effect
on minors’ privacy, safety and security when using the services of an online platform.
69. In line with, and without prejudice to, the existing horizontal legal framework, in
particular the Unfair Commercial Practices Directive 2005/29/EC that is fully
applicable to all commercial practices also towards minors (70) and the more specific
rules in Regulation (EU) 2022/2065 on advertising (Articles 26, 28(2) and 39) and
dark patterns (Article 25), the Commission considers that providers of online
platforms accessible to minors should adopt the following measures to ensure a high
level of privacy, safety, and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065:
a.
Ensure that minors' lack of commercial literacy is not exploited, by considering minors' age, vulnerabilities and limited capacity to engage critically with commercial practices on the platform and by providing relevant support (71).
(69) This makes it difficult for them, for instance, to distinguish between commercial and non-commercial content, to resist peer pressure to buy in-game or in-app content that is attractive for minors or even necessary to progress in the game, or to understand the real currency value of in-app currencies or that the occurrence of the most desirable content such as upgrades, maps and avatars may be less frequent in randomised in-app or in-game purchases than less desirable content. M. Ganapini, E. Panai (2023) An Audit Framework for Adopting AI-Nudging on Children. Available: https://arxiv.org/pdf/2304.14338.
(70) The Commission recalls that, per its Article 2(4), Regulation (EU) 2022/2065 is without prejudice to Directive 2010/13/EU, Union law on copyright and related rights, Regulation (EU) 2021/784, Regulation (EU) 2019/1148, Regulation (EU) 2019/1150, Union law on consumer protection and product safety (including Directive (EU) 2005/29), Union law on the protection of personal data, Union law in the field of judicial cooperation in civil matters, Union law in the field of judicial cooperation in criminal matters and a Directive laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings. Further, it shall not affect the application of Directive 2000/31/EC. Under Article 91 of Regulation (EU) 2022/2065, the Commission is mandated to evaluate and report, by 17 November 2025, on the way that this Regulation interacts with other legal acts, in particular the acts referred to above.
(71) UNICEF provides resources and guidance for platforms related to the digital marketing ecosystem, including UNICEF (2025) Discussion Paper on digital marketing and children's rights. Available: https://www.unicef.org/childrightsandbusiness/workstreams/responsible-technology/digital-marketing.
b.
Ensure that minors are not exposed to harmful, unethical and unlawful advertising (72). This may entail, for example, considering the appropriateness of advertising campaigns for different age groups, addressing their adverse impact, and taking adequate security measures to protect minors as well as to ensure that they have access to information that is in their best interests (73).
c.
Regularly review the relevant protective measures in consultation with minors,
guardians and other relevant stakeholders.
d.
Ensure that minors are not exposed to excessive total volumes, frequency and
recommendation of commercial content, which can lead to excessive or unwanted
spending or addictive behaviours and have detrimental effects on their privacy,
safety and security.
e.
Ensure that minors are not exposed to AI systems integrated in the platform that
influence or nudge children for commercial purposes, particularly through
conversational or advisory formats such as chatbots (74).
f.
Ensure that declarations of commercial communication are clearly visible, child-
friendly, age-appropriate and accessible (see Section 8.4 on Transparency) and
consistently used throughout the service, for instance with the use of an icon or
a similar sign to clearly indicate that content is advertising (75). These should be
(72) The Commission recalls that, for instance, traders are subject under Article 5(1) of Directive 2005/29/EC to the prohibition of unfair commercial practices, and that point 28 of Annex I of the Directive prohibits direct exhortation to children to buy advertised products or persuade their parents or other adults to do so. This commercial behaviour is in all circumstances considered unfair.
(73) Committee on the Rights of the Child's General comment No. 25 (2021) on children's rights in relation
to the digital environment provides that the best interests of the child should be “a primary consideration
when regulating advertising and marketing addressed to and accessible to children. Sponsorship, product
placement and all other forms of commercially driven content should be clearly distinguished from all
other content and should not perpetuate gender or racial stereotypes.”
(74) The Commission recalls that such AI systems could constitute prohibited practices under Article 5(1)(b)
of Regulation (EU) 2024/1689, if they exploit vulnerabilities of children in a manner that causes or is
reasonably likely to cause significant harm. Any measures taken according to this recommendation
should go beyond measures taken to prevent the application of that prohibition. The supervision and
enforcement of measures taken to comply with Article 50(1) of Regulation (EU) 2024/1689 remains the
responsibility of the competent authorities under that Regulation.
(75) The Commission recalls that, according to Articles 6 and 7 of Directive 2005/29/EC, the disclosure of the commercial element must be clear and appropriate, taking into account the medium in which the marketing takes place, including the context, placement, timing, duration, language, or target audience. See also the Guidance on the interpretation and application of Directive 2005/29/EC.
regularly tested and reviewed in consultation with minors, their guardians and
other relevant stakeholders.
g.
Ensure that minors are not exposed to marketing and communication of products or services that can have an adverse impact on their privacy, safety and security, including those identified in the provider's risk review and those associated with negative impacts on their physical and mental health (see Section 5 on Risk review).
h.
Ensure that minors are not exposed to hidden or disguised advertising, whether
placed by the provider of the online platform or the users of the service (
76
). In
this context, the Commission recalls that providers of online platforms are also
obliged, under Article 26(2) of Regulation (EU) 2022/2065, to provide recipients
of the service with a functionality to declare whether the content they provide is
or contains commercial communications (
77
). Examples of disguised commercial
communications may include, but are not limited to, product placements by
influencers, product showcases and other forms of subtle promotion that may
deceive or manipulate minors into purchasing products or services.
i.
Ensure that children are not exposed to techniques which can have the effect of reducing the transparency of economic transactions and may be misleading for minors, such as certain virtual currencies (78) and other tokens or coins that can be exchanged for real money (or, where applicable, for the purchase of another virtual currency) and used to purchase virtual items, which can also cause unwanted spending.
(76) The Commission recalls that Article 7(2) of, and point 22 of Annex I to, Directive 2005/29/EC prohibit falsely claiming or creating the impression that the trader is not acting for purposes relating to his trade, business, craft or profession, or falsely representing oneself as a consumer. It also recalls that Directive 2010/13/EU prohibits directly exhorting minors to buy or hire a product or service, encouraging them to persuade their parents or others to purchase the goods or services being advertised, and exploiting the special trust minors place in parents, teachers or other persons. According to recital 10 of Regulation (EU) 2022/2065, the Regulation should be without prejudice to Union law on consumer protection, including Directive 2005/29/EC concerning unfair business-to-consumer commercial practices in the internal market.
(
77
) The Commission also recalls that Directive 2010/13/EU provides that video sharing platforms need to
have a functionality to declare that content uploaded contains audiovisual commercial communications.
(78) The Commission recalls that the concept of virtual currency is defined in Directive (EU) 2018/843 on anti-money laundering.
j.
Ensure that minors, when accessing online platforms or parts and features thereof that are presented or appear as being free (79), are not exposed to in-app or in-game purchases that are or appear to be necessary to access or use the service. If minors are exposed to any other in-app or in-game purchases, they should always be priced in the national currency.
k.
Ensure that minors are not exposed to practices that can lead to excessive or unwanted spending, overuse of the platform, or compulsive or addictive behaviours, by ensuring that minors are not exposed to virtual items such as paid loot boxes or other products that offer random or unpredictable outcomes or gambling-like features, and by introducing separation or friction between content and the purchasing of related products.
l.
Ensure that minors are not exposed to manipulative design techniques (
80
), such
as scarcity (
81
), intermittent or random rewards, or persuasive design
techniques (
82
), that can lead to excessive, impulsive or unwanted spending or
addictive behaviours.
m. Ensure that minors are not exposed to unwanted purchases, e.g. by considering
deploying effective tools for guardians or submitting any financial commitment
made by minors under a certain age to the review or consent of guardians (see
Section 7.3 on Tools for guardians).
n.
Review the platform’s policy to offer economic transactions, based on the
evolving capacities of children, considering that certain age groups should not
(
79
) The Commission recalls that Directive 2005/29/EC in its Annex I, point 20, prohibits describing a
product as ‘gratis’, ‘free’, ‘without charge’ or similar if the consumer has to pay anything other than the
unavoidable cost of responding to the commercial practice and collecting or paying for delivery of the
item.
(
80
) As set out in Article 25 of Regulation (EU) 2022/2065. The Commission recalls that according to Article
25(2) the prohibition in Article 25(1) shall not apply to practices covered by Directive 2005/29/EC or
Regulation (EU) 2016/679.
(81) The Commission recalls that Directive 2005/29/EC in its Annex I, point 7, prohibits falsely stating that a product will only be available for a very limited time, or that it will only be available on particular terms for a very limited time, in order to elicit an immediate decision and deprive consumers of sufficient opportunity or time to make an informed choice. Traders are thereby prohibited from using such false scarcity techniques.
(82) The Commission recalls that, in the case of games, under Articles 8 and 9 of Directive 2005/29/EC traders should not exploit behavioural biases or introduce manipulative elements relating to, for example, the timing of offers within the gameplay (offering micro-transactions during critical moments in the game) or the use of visual and acoustic effects to put undue pressure on the player.
be exposed or allowed to enter into economic transactions as they do not yet
possess the ability to comprehend spending and money.
6.7
Moderation
70. Moderation can reduce minors’ exposure to content and behaviour that is harmful to
their privacy, safety and security, including illegal content or content that may impair
their physical or mental development, and it can contribute to crime prevention.
71. The Commission recalls the obligations related to: terms and conditions set out in Article 14 of Regulation (EU) 2022/2065; transparency reporting provided in Article 15 of that Regulation for providers of intermediary services, which include providers of online platforms; notice and action mechanisms and statements of reasons provided respectively in Articles 16 and 17 of that Regulation for providers of hosting services, including online platforms; and the obligations related to trusted flaggers (83) for providers of online platforms set out in Article 22 of that Regulation. It also recalls the 2025 Code of Conduct on Countering Illegal Hate Speech Online+ and the Code of Conduct on Disinformation, which constitute Codes of Conduct within the meaning of Article 45 of Regulation (EU) 2022/2065.
72. In addition to those obligations, the Commission considers that providers of online
platforms accessible to minors should put in place the following measures to ensure
a high level of privacy, safety, and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065, while taking the best interests of the
child as a primary consideration:
a.
Define clearly and transparently what the platform considers as content and
behaviour that is harmful for minors’ privacy, safety and security, in cooperation
with minors, civil society and independent experts, including academia. This
should include any content and behaviour that is illegal under EU or national
law. Providers of online platforms accessible to minors should communicate
(
83
) Trusted flaggers are entities with particular expertise and competence in detecting certain types of illegal
content, and the notices they submit within their designated area of expertise must be given priority and
processed by providers of online platforms without undue delay. The trusted flagger status is awarded
by the Digital Services Coordinator of the Member State where the entity is established, provided that
the entity has demonstrated their expertise, competence, independence from online platforms, as well as
diligence, accuracy and objectivity in submitting notices.
information concerning their standards and expectations regarding content and
behaviour clearly to minors using their service and this information should be
available during the set-up of an account and easy to locate on the platform.
b.
Establish moderation policies and procedures that set out how content and
behaviour that is harmful for the privacy, safety and security of minors is
detected and how it will be moderated aiming at limiting minors’ exposure to
harmful content. Providers of online platforms should also ensure that these
policies and/or procedures are enforced in practice.
c.
Assess and review policies and procedures to ensure that they remain effective
as technologies and online behaviours change. In particular, the Commission
considers that providers of online platforms accessible to minors should take into
account the following factors when prioritising moderation: the likelihood and
seriousness of the content causing harm to a minor’s privacy, safety and/or
security, the impact of the harm on that minor, specific vulnerabilities and the
number of minors who may be harmed. Additionally, reports made by minors
should be prioritised.
d.
Ensure human review is available in addition to automated content review and
any other relevant tools for reported accounts or content that the provider
suspects may pose a risk of harm to minors’ privacy, safety or security.
e.
Ensure that content moderation teams are well-trained and resourced and that
moderation mechanisms are active and functioning at all times (24 hours a day,
7 days a week) to deliver effective moderation, including at least one employee
who is on call to respond to urgent requests and emergencies at all times.
f.
Ensure that content moderation systems and practices are available and operational in the official language(s) of the Member State in which the service is provided.
g.
Put in place effective technologies, internal mechanisms and preventative measures to reduce the risk that content and behaviour that are harmful to minors' privacy, safety or security are recommended to minors, including by implementing effective technical solutions to tackle known harmful and illegal content, such as hash matching and URL detection. Providers should also
explore the potential added benefits of emerging technical solutions, such as AI classifiers, to detect new or altered content and conduct (a minimal technical illustration of hash- and URL-based matching is sketched after this list).
h.
Implementing technical solutions to prevent the AI systems on their platform
from allowing users to access, generate and disseminate content that is harmful
for the privacy, safety and/or security of minors.
i.
Integrating into any AI systems safeguards that detect and prevent prompts
that the provider has identified in their moderation policies as being harmful
to minors’ privacy, safety and/or security. This may include, for example,
the use of prompt classifiers, content moderation and other filters (
84
).
ii.
Cooperating with other providers of online platforms and relevant
stakeholders for the purpose of detecting policy-violating and illegal
content and preventing cross-platform dissemination and conduct.
iii. Where a provider of an online platform accessible to minors hosts financial
transactions, it should provide a specific channel for reporting fraud and
suspicious financial transactions.
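The following minimal Python sketch illustrates, purely by way of example, how the hash matching and URL detection referred to in point (g) above might work against curated blocklists. The hash values, URL entries and function names are illustrative assumptions and not part of these guidelines; real deployments would rely on industry hash-sharing databases and, where appropriate, perceptual hashing or classifiers to also catch altered copies.

```python
import hashlib

# Illustrative blocklists only. In practice these would be populated from
# industry hash-sharing databases of known illegal or harmful material and
# from curated URL lists; the entries below are placeholders.
KNOWN_BAD_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}
KNOWN_BAD_URLS: set[str] = {
    "bad.example.org/known-harmful-page",
}


def sha256_of(content: bytes) -> str:
    """Exact (cryptographic) hash of an uploaded item's bytes."""
    return hashlib.sha256(content).hexdigest()


def matches_known_content(content: bytes) -> bool:
    """Hash matching: flag items identical to known harmful or illegal content."""
    return sha256_of(content) in KNOWN_BAD_HASHES


def matches_known_url(url: str) -> bool:
    """URL detection: flag links that appear on a curated blocklist."""
    normalised = url.lower().removeprefix("https://").removeprefix("http://").rstrip("/")
    return normalised in KNOWN_BAD_URLS


def exclude_from_recommendation(content: bytes, linked_urls: list[str]) -> bool:
    """Keep matched items out of anything recommended or shown to minors."""
    return matches_known_content(content) or any(matches_known_url(u) for u in linked_urls)
```

Exact hashes only catch bit-identical copies; as point (g) notes, providers would typically complement them with emerging solutions such as AI classifiers to detect new or altered content.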
73. Providers of online platforms accessible to minors should share metrics on content
moderation, for example how often they receive user reports, how often they
proactively detect content and conduct violations, the types of content and conduct
being reported and detected and how the platform responded to these issues.
74. None of the above measures should result in a general obligation to monitor content
which providers of online platforms accessible to minors either transmit or store (
85
).
Poor practice
SadShare is a social media platform that allows users to upload and share visual content with
others. The platform’s policies do not include robust content moderation mechanisms to detect
and prevent the upload of harmful and explicit content, including child sexual abuse material.
This lack of moderation therefore exposes minors to illegal content, and it makes it possible for
(
84
) The Commission recalls that such AI systems could constitute prohibited practices under Article 5(1)(b)
of Regulation (EU) 2024/1689, if they exploit vulnerabilities of children in a manner that causes or is
reasonably likely to cause significant harm. Any measures taken according to this recommendation
should go beyond measures taken to prevent the application of that prohibition. The supervision and
enforcement of measures taken to comply with Article 50(1) of Regulation (EU) 2024/1689 remains the
responsibility of the competent authorities under that Regulation.
(
85
) See Article 8(1) of Regulation 2022/2065.
malicious users to (re-)use existing images. This in turn fuels the demand for child sexual abuse material, which induces other users to abuse and harm minors in order to create new material.
7 REPORTING, USER SUPPORT AND TOOLS FOR GUARDIANS
7.1 User reporting, feedback and complaints
75. Effective, visible and child-friendly user reporting, feedback and complaint tools
enable minors to express and address features of online platforms that may negatively
affect the level of their privacy, safety and security.
76. The Commission recalls the obligations laid down in Regulation (EU) 2022/2065,
including the obligations to put in place notice and action mechanisms in Article 16,
to provide a statement of reasons in Article 17, to notify suspicions of criminal offences in Article 18, to put in place an internal complaint-handling system in Article 20 and out-of-court dispute settlement in Article 21, as well as the rules on trusted flaggers in
Article 22.
77. In addition to those obligations, the Commission considers that providers of online
platforms accessible to minors should put in place the following measures to ensure
a high level of privacy, safety, and security of minors on their service for the purposes of Article 28(1) of Regulation (EU) 2022/2065:
a.
Implement reporting, feedback and complaints mechanisms that:
i.
Are effective, visible, child-friendly and easily accessible (see Section 6.4
on Online interface design and other tools and Section 4 on General
principles).
ii.
Allow minors to report content, activities, individuals, accounts, or groups
they believe may violate the platform’s terms and conditions. This includes
any content, user or activity that is considered by the platform to be harmful
to minors’ privacy, safety, and/or security (see Section 5 on Risk review
and Section 6.7 on Moderation).
iii. Allow all users to report content, activities, individuals, accounts, or groups
that they deem inappropriate or undesirable for minors, or where they are
uncomfortable with the idea of such content, activities, individuals,
accounts or groups being accessible to minors.
iv. Allow all users to report a suspected underage account, where a minimum
age is stated in the platform’s terms and conditions.
v.
Allow minors to provide feedback about all content, activities, individuals,
accounts or groups that they are shown on their accounts and that make
them feel uncomfortable or that they want to see more or less of. These
options could include phrases such as "Show me less/more", "I don’t want
to see/I am not interested in", "I don’t want to see content from this
account," "This makes me feel uncomfortable," "Hide this," "I don’t like
this", or "This is not for me”. Providers of online platforms should ensure
that these options are designed in such a way that they are only visible to
the user, so that they cannot be misused by others to bully or harass minors
on the platform. Providers of online platforms should adapt their
recommender systems in response to this feedback (see Section 6.5.2 on User control and empowerment) (86). A minimal sketch of how such feedback could adjust what is recommended is provided after this list.
vi. Where the provider uses age assurance methods, allow any user to access
an effective internal complaint-handling system that enables them to lodge
complaints, electronically and free of charge, against an assessment by the
provider of the user’s age. This complaint handling system should fulfil the
conditions set out in Article 20 of Regulation (EU) 2022/2065.
b.
Ensure that the reporting, feedback and complaints mechanisms established
under Article 20 of Regulation (EU) 2022/2065 (
87
):
(
86
) See section 6.5 of the present guidelines for information about how this information should affect the
provider’s recommender systems.
(
87
) Any reference in the remainder of this Section to ‘complaint’ or ‘complaints’ includes any complaints
that are brought against the provider’s assessment of the user’s age and any complaints that are brought
against the decisions referred to in Article 20 of Regulation (EU) 2022/2065. Article 20 of Regulation
(EU) 2022/2065 requires providers of online platforms to provide recipients of the service with access
to an effective internal complaint-handling system against four types of decisions taken by the provider
of the online platform. These are (a) decisions whether or not to remove or disable access to or restrict
visibility of the information; (b) decisions whether or not to suspend or terminate the provision of the
service, in whole or in part, to the recipients; (c) decisions whether or not to suspend or terminate the
recipients’ account; and (d) decisions whether or not to suspend, terminate or otherwise restrict the
ability to monetise information provided by the recipients.
i. Contribute to a high level of privacy, safety and security for minors.
ii. Are aligned with fundamental rights, in particular children's rights.
iii. Are available for intuitive and immediate access for all minors, including
for those with disabilities and/or additional accessibility needs.
iv. Are easy for minors to use and understand, are age-appropriate and
engaging (see Section 6.4 on Online interface design and other tools and
Section 4 on General principles).
v.
Are available for non-registered users if they may access the online
platform’s content.
c.
Ensure the availability of an option that allows minors to provide their own
reasons for a report or complaint. Providers should avoid reporting categories,
but if they are used, ensure that they are adapted to the youngest users allowed
on the platform.
d.
Ensure that reporting, feedback and complaints are confidential and anonymous
by default, while providing the option for minors to remove anonymity. If
anonymity is removed, the provider should explain to minors when, how and
what information related to reports and/or complaints they share with other users
or third parties.
e.
Prioritise reports that concern the privacy, safety and security of minors.
Providers of online platforms should provide an option to indicate if the minor
thinks a report or complaint is urgent, especially when there is an indication of
an ongoing privacy, safety or security issue. Response times should be
appropriate to the issue being reported or complained about. This should not
negatively affect the priority given to the notices submitted by trusted flaggers,
in accordance with Article 22(1) of Regulation (EU) 2022/2065.
f.
Provide each minor that submits a report or complaint with a confirmation of
receipt of the report or complaint without undue delay. Minors should also be
able to access an age-appropriate explanation of the process that will be followed
when reviewing the report or complaint and an explanation of any actions or
non-actions taken. The information should include an indicative timeframe for
deciding on the report or complaint and possible outcomes. The Commission is also of the view that providers of online platforms should provide a mechanism for tracking progress and communicating with the platform.
g.
Regularly review the reports, feedback and complaints that they receive. They
should use this information to identify and address any aspects of their platform
that may compromise the privacy, safety and/or security of minors, refine their
recommender systems and moderation practices, improve overall safety
standards, and foster a more trustworthy and responsible online environment.
These actions should be documented to be reviewable.
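As a purely illustrative sketch of the feedback handling referred to in point (a)(v) above, the following Python fragment shows one way negative feedback ("Show me less", "I don't want to see content from this account") could be stored per user and applied when re-ranking recommendation candidates. The data structures, field names and the 0.5 down-weighting factor are assumptions made for the example, not a prescribed implementation.

```python
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class MinorFeedback:
    """Feedback a minor has given; visible only to that user."""
    hidden_accounts: set[str] = field(default_factory=set)
    topic_weights: dict[str, float] = field(default_factory=lambda: defaultdict(lambda: 1.0))

    def show_less(self, topic: str) -> None:
        # "Show me less": halve the weight applied to this topic.
        self.topic_weights[topic] *= 0.5

    def not_interested_in(self, account_id: str) -> None:
        # "I don't want to see content from this account."
        self.hidden_accounts.add(account_id)


def rerank(candidates: list[dict], feedback: MinorFeedback) -> list[dict]:
    """Drop hidden accounts and down-weight topics the minor has flagged.

    Each candidate is assumed to look like:
    {"account_id": "a1", "topic": "fitness", "base_score": 0.8}
    """
    kept = [c for c in candidates if c["account_id"] not in feedback.hidden_accounts]
    for c in kept:
        c["score"] = c["base_score"] * feedback.topic_weights[c["topic"]]
    return sorted(kept, key=lambda c: c["score"], reverse=True)
```

In such a design the feedback store is private to the minor's account, consistent with the recommendation that these options be visible only to the user.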
Poor practice
SadLearn is a popular online platform designed for users between 6 and 18 years old. It offers
a range of educational and entertaining content. To flag content that is against the terms and
conditions of SadLearn, the user must click through four different links. Once the user arrives in
the complaints section, they must choose among 15 different complaint categories, making it
difficult for minors to identify and select the right category. There is no free-text category. If users
manage to submit complaints, they do not receive any confirmation or explanation of what will
happen next. Moreover, the reporting tool is only available in English and the language is
adapted to an adult audience.
7.2
User support measures
78. Putting in place features on online platforms accessible to minors to assist minors in navigating their services and seeking support where needed is an effective means to ensure a high level of privacy, safety and security for minors. The Commission
therefore considers that providers of online platforms accessible to minors should:
a.
Have clear, easily identifiable and accessible support tools that allow minors to
seek help when encountering suspicious, illegal or inappropriate content,
accounts or behaviour that make them feel uncomfortable. This includes
providing block and mute buttons. The support tools should be child-friendly,
clearly visible, immediately accessible (see Section 6.4 on Online interface and
other tools) and should connect minors directly with the most appropriate
support services for their location and age, such as those that form part of the
national Safer Internet Centres, INHOPE networks and national child helplines.
b.
Limit the use of support tools based on AI, as they should not be used as the main
mechanism to interact with children.
c.
Introduce directly visible warning messages, links to relevant national support
lines (
88
) and other authoritative sources when minors search for, upload,
generate, share and receive content that is potentially illegal or harmful for the
privacy, safety and security of minors (as explained in Section 6.7 on
Moderation). Providers of online platforms should also refer minors to relevant
national support lines when a minor submits a report related to such content. The
referral should be made immediately after the provider of the online platform
becomes aware of the activity or the minor submits a report.
d.
If the online platform includes features or functionalities related to user
connection, posting content or user communication, it should provide minors
with the option to anonymously block or mute any other user or account,
including those that are not connected to them. The blocking systems should be
easy to find and accessible. No information about the user or their account should be available to any accounts that the user has blocked (a minimal sketch of such a visibility check follows this list).
e.
If the online platform enables comments on content, it should provide minors
with the option to restrict the types of users who can comment on their content
and content about them and/or prevent other users from commenting on their
content and content about them, both at the time of posting and thereafter, even
if the possibility to comment is restricted to accounts previously accepted as
contacts by the minor (as recommended in Section 6.3 on Account settings).
f.
If the online platform offers group functions, it should ensure that minors join a
group only after being notified of the invitation and upon accepting that they
wish to be part of that group.
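The visibility rule mentioned in point (d) above can be reduced to a very small check: a blocked account receives no information about the blocker and no notification that a block exists. The sketch below is a simplified Python illustration under that assumption; the data model (a mapping from each user to the set of accounts they have blocked) and all names are hypothetical.

```python
def is_blocked(viewer_id: str, owner_id: str, blocks: dict[str, set[str]]) -> bool:
    """True if the account owner has blocked the viewer.

    `blocks` maps each user id to the set of account ids that user has blocked.
    """
    return viewer_id in blocks.get(owner_id, set())


def profile_view(viewer_id: str, owner_id: str, profile: dict, blocks: dict[str, set[str]]) -> dict:
    """Blocked viewers get nothing back: no profile data and no notification."""
    if is_blocked(viewer_id, owner_id, blocks):
        return {}
    return profile


# Example: after the minor "m1" blocks "x9", that account sees an empty view.
blocks = {"m1": {"x9"}}
assert profile_view("x9", "m1", {"name": "m1"}, blocks) == {}
```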
Good practice
NiceSpace is a social media platform for users above 13. When users sign up, they are
presented with an interactive tutorial “SafeSpace 101” which explains the platform’s
privacy, safety and security features, including blocking and muting options, comment
(
88
) Such as those that form part of the national Safer Internet Centres and INHOPE networks or other
national child helplines such as https://childhelplineinternational.org/.
control and group invitations. NiceSpace also features a prominent “Help” button,
connecting the users directly with their local Safer Internet Centre helpline. When
searching for potentially harmful content, NiceSpace warns users with contextual prompts
and redirects them to safer resources. All information is adapted to the youngest user
allowed on the platform.
7.3
Tools for guardians
79. Tools for guardians are software, features, functionalities, or applications designed to
help guardians accompany their minor’s online activity, privacy, safety and well-
being, while respecting children’s agency and privacy.
80. The Commission considers that tools for guardians should be treated as
complementary to safety by design and default measures and to any other measures
put in place to comply with Article 28(1) of Regulation (EU) 2022/2065, including
those described in these guidelines. Compliance with the obligation of providers of
online platforms accessible to minors to ensure a high level of privacy, safety and
security on their services must never rely exclusively on tools for guardians. Tools
for guardians should not be used as the sole measure to ensure a high level of privacy,
safety and security of minors on online platforms, nor be used to replace any other measures put in place for that purpose. Such tools may fail to reflect the realities of children's lives, particularly in cases of split custody, foster care, or where guardians are absent or disengaged. Moreover, the effectiveness of parental consent is limited when the identity or legal authority of the consenting adult is not reliably verified. Providers of online platforms accessible to minors must therefore implement appropriate measures to protect minors and should not rely solely on parental oversight. Nevertheless, the Commission notes that, when used in
combination with other measures, tools for guardians may contribute to such a high
level.
81. Therefore, the Commission considers that providers of online platforms accessible to
minors should put in place guardian control tools for the purposes of Article 28(1) of
Regulation (EU) 2022/2065 which should:
a.
Be age-appropriate and in line with the evolving capacities of minors. Tools for guardians should be grounded in communication, learning and empowerment rather than control, and should enable the autonomy and agency of minors. They should be
effective and should not disproportionately restrict minors' rights to privacy or to access services, taking the best interests of the minor as a primary consideration.
b.
Be easy to use, access and activate, for example by allowing the guardian to use
the tool without creating an account on the service.
c. Apply regardless of the device or operating system used to access the service.
d. Provide a clear notification to minors of their activation by guardians and put other safeguards in place considering their potential misuse by guardians such as, for example, providing a clear sign to the minor in real time when any monitoring functionality is activated.
e.
Ensure that changes can only be made with the same degree of authorisation
required in the initial activation of the tools.
f.
Be compatible with the availability of interoperable one-stop-shop tools for
guardians gathering all settings and tools.
82. Tools for guardians may include features for managing default settings, setting screen
time limits (see Section 6.4 on Online interface design and other tools), seeing the
accounts that the minor communicates with, managing account settings, setting
spending limits for the minor by default where applicable, or other features to
supervise uses of the online platforms that may be detrimental to the minor’s privacy,
safety and security.
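By way of illustration only, the sketch below shows in Python how two of the features referred to in paragraphs 81 and 82 (a default-deny spending limit and the rule that changes require the same degree of authorisation as the initial activation) could be expressed. The settings fields, the credential check and the default values are assumptions made for the example, not requirements of these guidelines.

```python
from dataclasses import dataclass


@dataclass
class GuardianSettings:
    """Hypothetical per-minor settings managed through a guardian tool."""
    daily_screen_minutes: int = 120
    monthly_spend_limit_eur: float = 0.0        # default-deny: no spending unless raised
    activation_secret: str = "guardian-secret"  # placeholder for a real credential check


def purchase_allowed(amount_eur: float, spent_this_month_eur: float, s: GuardianSettings) -> bool:
    """Check a purchase against the guardian-set monthly spending limit."""
    return spent_this_month_eur + amount_eur <= s.monthly_spend_limit_eur


def update_settings(s: GuardianSettings, presented_secret: str, **changes) -> GuardianSettings:
    """Changes require the same degree of authorisation as the initial activation
    (paragraph 81(e)); the minor would separately be given a clear, real-time sign
    that any monitoring functionality is active (paragraph 81(d))."""
    if presented_secret != s.activation_secret:
        raise PermissionError("guardian authorisation required")
    for key, value in changes.items():
        if hasattr(s, key):
            setattr(s, key, value)
    return s
```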
8 GOVERNANCE
83. Good platform governance is an effective means to ensure that the protection of
minors is duly prioritised and managed across the platform, thus contributing to
ensuring the required high level of privacy, safety and security of minors.
8.1
Governance (general)
84. The Commission considers that providers of online platforms accessible to minors
should put in place effective governance practices as a means of ensuring a high level
of privacy, safety and security for minors on their services for the purposes of Article 28(1) of Regulation (EU) 2022/2065. This includes, but is not limited to:
a.
Implementing internal policies that outline how the provider of the online
platform seeks to ensure a high level of privacy, safety and security for minors
on its service.
b.
Assigning to a dedicated person or team the responsibility for ensuring a high
level of minors’ privacy, safety and security. This person or team should have
sufficient resources as well as sufficient authority to have direct access to the
senior management body of the provider of the online platform and should also
be a central point of contact for regulators, users and trusted flaggers in matters
related to minors’ privacy, safety and security.
c.
Fostering a culture of privacy, safety and security for minors on the service. This
includes:
d.
Fostering and prioritising a culture of child participation in the design and
functioning of the platform. This should be done in safe, ethical, inclusive and
meaningful ways, in children’s best interests, and should provide for feedback
mechanisms to explain to minors how their views have been taken into
account (
89
).
e.
Raising awareness of how the provider upholds children’s rights on its platform
and the risks that minors on the platform may face to their privacy, safety and/or
security (
90
).
f.
Providing persons responsible for minors’ privacy, safety and security,
developers, persons in charge of moderation and/or those receiving reports or
complaints from minors, with relevant training and information (
91
).
(89) UNICEF's spotlight guidance on stakeholder engagement with children offers concrete steps on responsible child participation activities. UNICEF. (2025). Spotlight guidance on best practices for stakeholder engagement with children in D-CRIAs. Available: https://www.unicef.org/childrightsandbusiness/media/1541/file/D-CRIA-Spotlight-Guidance-Stakeholder-Engagement.pdf.
(90) This approach is in line with the Better Internet for Kids strategy (BIK+), which emphasises the importance of awareness and education in promoting online safety and supports the implementation of Regulation (EU) 2022/2065 in this respect. Furthermore, the Safer Internet Centres, established in each Member State, demonstrate the value of awareness-raising efforts in preventing and responding to online harms and risks.
(
91
) This training might cover, for example, children’s rights, risks and harms to minors’ privacy, safety and
security online, as well as effective prevention, response and mitigation practices.
g.
Having procedures to ensure regular monitoring of compliance with Article
28(1) of Regulation (EU) 2022/2065.
h.
Ensuring that any technological and organisational solutions employed to
implement these guidelines are ‘state-of-the-art’ and are aligned with national
guidance on the protection of minors (
92
), children’s rights and the highest
available standards (
93
).
i.
Putting in place a process for the regular collection and recording of data on
harms and risks related to privacy, safety, and security of minors on the platform,
which should be periodically reported to the provider’s management as well as
to the person or team designated for the protection of minors. This is without
prejudice to the obligations of providers of VLOPs and VLOSEs stemming from
Articles 34 and 35 of Regulation (EU) 2022/2065.
j.
Exchanging good practices and technological solutions aimed at ensuring a high level of privacy, safety and security for minors with other platforms and providers, as well as with Digital Services Coordinators, trusted flaggers, civil society organisations, academia and other relevant stakeholders. Cross-platform collaboration should include risk detection, design standards, and research collaboration with trusted actors.
(92) An Coimisiún um Chosaint Sonraí. (2021). Fundamentals for a child-oriented approach to data processing. Available: https://www.dataprotection.ie/sites/default/files/uploads/2021-12/Fundamentals%20for%20a%20Child-Oriented%20Approach%20to%20Data%20Processing_FINAL_EN.pdf; Coimisiún na Meán. (2024). Online safety code. Available: https://www.cnam.ie/app/uploads/2024/11/Coimisiun-na-Mean-Online-Safety-Code.pdf; IMY (Swedish Authority for Privacy Protection). (2021). The rights of children and young people on digital platforms. Available: https://www.imy.se/en/publications/the-rights-of-children-and-young-people-on-digital-platforms/; Dutch Ministry of the Interior and Kingdom Relations. (2022). Code for children's rights. Available: https://codevoorkinderrechten.nl/wp-content/uploads/2022/02/Code-voor-Kinderrechten-EN.pdf; CNIL. (2021). CNIL publishes 8 recommendations to enhance protection of children online. Available: https://www.cnil.fr/en/cnil-publishes-8-recommendations-enhance-protection-children-online; Unabhängiger Beauftragter für Fragen des sexuellen Kindesmissbrauchs. (n.d.). Rechtsfragen Digitales. Available: https://beauftragte-missbrauch.de/themen/recht/rechtsfragen-digitales.
(
93
) CEN-CENELEC (2023)
Workshop Agreement 18016 Age Appropriate Digital Services Framework;
OECD. (2021).
Children in the digital environment - Revised typology of risks.
https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html.
8.2
Terms and conditions
85. Terms and conditions provide a framework for governing the relationship between
the provider of the online platform and its users. They set out the rules and
expectations for online behaviour and play an important role in establishing a safe,
secure and privacy respecting environment (
94
).
86. The Commission recalls the obligations for all providers of intermediary services as
regards terms and conditions set out in Article 14 of Regulation (EU) 2022/2065,
which includes the obligation for providers of intermediary services to explain the
conditions for, and any restrictions on, the use of the service in a clear, plain,
intelligible, user-friendly and unambiguous language. In addition, Article 14(3) of
that Regulation specifies that intermediary services primarily directed at minors or predominantly used by them should provide this information in a way that minors
can understand (
95
) (
96
).
87. Moreover, the Commission considers that providers of online platforms accessible to
minors should ensure that the terms and conditions of the service they provide:
a.
Include information about:
i. The steps that users need to take from account creation to its deletion.
ii. Community guidelines that promote a positive, safe and inclusive atmosphere and that explain what conduct is expected and prohibited on their service, and what the consequences of non-compliance are.
iii. The types of content and behaviour that are considered to be harmful for
minors’ privacy, safety and/or security. This includes but is not limited to
(
94
) The P2089.2™ Standard for Terms and Conditions for Children's Online Engagement provides
processes and practices to develop terms and conditions that help protect the rights of children in digital
spheres.
(
95
) The Commission also recalls the requirements for video-sharing platform providers to protect minors
from programmes, user-generated videos and audiovisual commercial communications which may
impair their physical, mental or moral development in Article 28b of Directive 2010/13/EU. These
requirements are to be evaluated and, potentially, reviewed by 19 December 2026.
(
96
) As indicated in the Introduction of these guidelines, certain provisions of Regulation (EU) 2022/2065
including points (5) and (6) of article 14, impose additional obligations on providers of very large online
platforms (“VLOPs”). To the extent that the obligations expressed therein also relate to the privacy,
safety and security of minors within the meaning of Article 28(1), the present guidelines build on these
provisions.
illegal content that is harmful for minors’ privacy, safety and/or security
and the dissemination of this content.
iv. How minors are protected from this content and behaviour.
v.
The tools that are used to prevent, mitigate and moderate content, conduct
and features that are illegal or harmful for the privacy, safety and security
of minors, and the complaints process.
b. Are easy to find and searchable throughout the user's experience on the platform.
c. Do not unduly restrict any rights of minors, including their right to freedom of expression and information.
d.
Are upheld and implemented in practice.
88. In addition, the Commission considers that the providers of online platforms
accessible to minors should ensure changes to the terms and conditions are logged
and published (
97
).
Good practice
HappyExplore is an online platform where minors can play games, create and explore creatures
and worlds that they can share with each other. HappyExplore has a character called “Pixel
Pioneer” which teaches users how to be responsible explorers. All users are encouraged to take
the “Kindness pledge”, where they learn and promise to behave kindly and safely online. Pixel
Pioneer also explains the importance of moderation and safety decisions to the users as they
explore the platform, such as why they should think carefully before sharing their creatures or
worlds.
8.3
Monitoring and evaluation
89. The Commission considers that providers of online platforms accessible to minors
should adopt effective monitoring and evaluation practices to ensure a high level of
privacy, safety and security for minors on their service for the purposes of Article 28(1)
of Regulation (EU) 2022/2065. This includes, but is not limited to:
a.
Regularly monitoring and evaluating the effectiveness of any elements of the
platform that concern the privacy, safety and security of minors on the platform.
(
97
) For example, by publishing them in the Digital services terms and conditions database: https://platform-
contracts.digital-strategy.ec.europa.eu/
This includes, for example, the platform’s online interface, systems, settings,
tools, functionalities and features and reporting, feedback and complaints
mechanisms, and measures taken to comply with Article 28(1) of Regulation
(EU) 2022/2065 (
98
). Providers should consider making these evaluations
available for review and input by independent third parties such as experts or
other relevant stakeholders.
b.
Regularly consulting with minors, guardians, academia, civil society
organisations, child rights experts and other relevant stakeholders on the design
and evaluation of any elements of the platform that concern the privacy, safety
and security of minors on the platform. This should include testing these
elements with minors and taking their feedback into account. To contribute to
non-discrimination and accessibility, providers should, where possible, involve
minors from a diverse range of cultural and linguistic backgrounds, of different
ages, with disabilities and/or additional accessibility needs in these
consultations.
c.
Adjusting the design and functioning of the aforementioned elements based on
the results of these consultations and on technical developments, research,
changes in user behaviour or policy, product and usage evolutions, and changes
to the harms and risks to the privacy, safety and security of minors on their
platform.
8.4
Transparency
90. The Commission recalls the transparency obligations under Articles 14, 15 and 24 of
Regulation (EU) 2022/2065. In view of minors’ developmental stages and evolving
capacities, additional considerations concerning the transparency of an online
platform’s functioning are required to ensure compliance with Article 28(1) of that
Regulation.
(
98
) As indicated in the Introduction of these guidelines (Section 1), certain provisions of Regulation (EU)
2022/2065 including Section 5 of Chapter III impose additional obligations on providers of very large
online platforms (“VLOPs”) and very large search engines (“VLOSEs”). To the extent that the
obligations expressed therein also relate to the privacy, safety and security of minors within the meaning
of Article 28(1), the present guidelines build on these provisions, and VLOPs should not expect that
adopting the measures described in the present guidelines, either partially or in full, suffices to ensure
compliance with their obligations under Section 5 of Chapter III of Regulation (EU) 2022/2065.
91. The Commission considers that providers of online platforms accessible to minors
should make all necessary and relevant information on the functioning of their
services easily accessible for minors to ensure a high level of privacy, safety and
security on their services. It considers that providers of online platforms should make
available to minors and, where relevant, their guardians, on an accessible interface on
their online platforms the following information:
a.
Information about any measures put in place to ensure a high level of privacy,
safety and/or security of minors on the platform. This includes information
about:
i.
any age assurance methods used, how these methods work, and any third
party used to provide any age verification or estimation methods.
ii.
the functioning of the recommender systems used across the platform and
the different options available to users (see Section 6.5.2 on User control
and empowerment).
iii. the processes for responding to any reports, feedback and complaints made
or brought by minors, including indicative timeframes, and the possible
outcomes and impact of these processes.
iv. the AI tools, products and features that are incorporated into the platform,
their limitations and the potential consequences of their use.
v.
the registration process where one is offered.
vi. any tools for guardians that are offered, explaining how to use them and
how they protect minors online, and what types of information about the
minor’s online activity guardians can obtain via the use of such tools.
vii. how content that breaches the platform’s terms and conditions is moderated
and the consequences of this moderation.
viii. how to use the different reporting, complaints, redress and support tools
referred to in the present guidelines.
ix. the online platform’s terms and conditions.
x.
any other measures recommended in the present guidelines and put in place
by the provider of the online platform.
xi. any other measures adopted, or changes made to their services to ensure a
high level of privacy, safety or security of minors on the platform.
b.
Ensure that this information, all warnings and any other communication
recommended in the present guidelines are:
i.
child-friendly, age-appropriate, easy-to-understand and easily accessible to
all minors, including those with disabilities and/or additional accessibility
needs.
ii.
presented clearly in a way that is easy to understand and is as simple and
succinct as possible. For example, where the terms and conditions refer to
a specific feature, the key information about this feature is presented when
the minor engages with it.
iii. presented to the minor in ways that are easy to review and that provide for
immediate and intuitive access, at the points at which they become relevant.
iv. presented in the official language(s) of the Member State the service is
provided in.
v.
engaging for minors. This may require the use of graphics, videos, and/or
characters or other techniques.
vi. given to minors gradually and over time to maximise retention by the user.
c.
Any measures and changes implemented to comply with Article 28(1) of
Regulation (EU) 2022/2065 could be communicated internally and made public
to the extent possible.
Good practice
HappyTerms is an online platform addressed at 13- to 18-year-olds. It offers minors the
opportunity to participate in communities and to exchange ideas and information about shared
interests. HappyTerms displays information about its terms and conditions with clear headings
accompanied by explanatory icons and colourful pictures. The rules are broken down into short,
easy-to-read sections and use simple language to explain the rules. There are also infographics
that help minors to understand what they are agreeing to, and that pop up when they become
relevant to a given feature or settings change. Users can also find rules by clicking on “What
I need to know”, an icon that links the user to the relevant rules, related tools and useful links
from any part of the platform. HappyTerms also offers an interactive quiz where minors can
check if they have understood the terms and conditions.
9 REVIEW
92. The Commission will review these guidelines as soon as this is necessary and at the
latest after a period of 12 months, in view of practical experience gained in the
application of Article 28(1) of Regulation (EU) 2022/2065 and the pace of
technological, societal, and regulatory developments in this area.
93. The Commission will encourage providers of online platforms accessible to minors,
Digital Services Coordinators, national competent authorities, the research
community and civil society organisations to contribute to this process. Following
such a review, the Commission may, in consultation with the European Board for
Digital Services, decide to amend these guidelines.
Annex
5C Typology of online risks to children
94. The OECD (
99
) and researchers (
100
) have classified the risks (
101
) that minors can
encounter online, in order for providers of online platforms accessible to minors,
academia and policy makers to better understand and analyse them. This classification
of risks is known as the 5Cs typology of online risks to children. It helps in identifying
risks and includes 5 categories of risks: content, conduct, contact, consumer risks,
cross-cutting risks. These risks may manifest when appropriate and proportionate
measures are not in place to ensure a high level of privacy, safety and security for
minors on the service, causing potential infringement of a number of children’s rights.
95.
5C typology of online risks to children (102)

Risks for children in the digital environment

Risk categories: Content; Conduct; Contact; Consumer; Cross-cutting risks.

Risk manifestation:
– Content risks: hateful content; harmful content; illegal content; disinformation.
– Conduct risks: hateful behaviour; harmful behaviour; illegal behaviour; user-generated problematic behaviour.
– Contact risks: hateful encounters; harmful encounters; illegal encounters; other problematic encounters.
– Consumer risks: marketing risks; commercial profiling risks; financial risks; security risks.
– Cross-cutting risks: additional privacy, safety and security risks; advanced technology risks; risks on health and wellbeing; misuse risks.
(99) OECD. (2021). Children in the digital environment - Revised typology of risks. https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html
(
100
) Livingstone, S., & Stoilova, M. (2021).
The 4Cs: Classifying Online Risk to Children.
(CO:RE Short
Report Series on Key Topics). Hamburg: Leibniz-Institut für Medienforschung | Hans-Bredow-Institut
(HBI); CO:RE - Children Online: Research and Evidence. https://doi.org/10.21241/ssoar.71817
(
101
) See also a risk analysis provided by the Bundeszentrale für Kinder- und Jugendmedienschutz
(BZKJ). (2022).
Gefährdungsatlas. Digitales Aufwachsen. Vom Kind aus denken. Zukunftssicher
handeln. Aktualisierte und erweiterte 2. Auflage. - Bundeszentrale für Kinder- und Jugendmedienschutz.
Available:
https://www.bzkj.de/resource/blob/197826/5e88ec66e545bcb196b7bf81fc6dd9e3/2-
auflage-gefaehrdungsatlas-data.pdf
(
102
) OECD. (2021).
Children in the digital environment - Revised typology of risks.
p.7.
https://www.oecd.org/en/publications/children-in-the-digital-environment_9b8f222e-en.html
96.
Content risks:
Minors can be unexpectedly and unintentionally exposed to content
that potentially harms them: a. hateful content; b. harmful content; c. illegal content;
d. disinformation. These types of content are widely considered to have serious
negative consequences to minors’ mental health and physical wellbeing, for example
content promoting self-harm, suicide, eating disorders or extreme violence.
97.
Conduct risks:
Refer to behaviours minors may actively adopt online, and which can
pose risks to both themselves and others such as a. hateful behaviour (e.g., minors
posting/sending hateful content/messages); b. harmful behaviour (e.g., minors
posting/sending violent or pornographic content); c. illegal behaviour (e.g., minors
posting/sending child sexual abuse material or terrorist content); and d. user-
generated problematic behaviour (e.g., participation in dangerous challenges;
sexting).
98.
Contact risks:
Refer to situations in which minors are victims of the interactions, as
opposed to the actor: a. hateful encounters; b. harmful encounters (e.g. the encounter
takes place with the intention to harm the minor); c. illegal encounters (e.g. can be
prosecuted under criminal law); and d. other problematic encounters. Examples of
contact risks include, but are not limited to, online grooming, online sexual coercion
and extortion, sexual abuse via webcam, cyberbullying and trafficking in human
beings for the purposes of sexual exploitation. These risks also extend to online fraud
practices such as phishing, marketplace fraud, and identity theft.
99.
Consumer risks:
Minors can also face risks as consumers in the digital economy: a.
marketing risks (e.g. loot boxes, advergames); b. commercial profiling risks (e.g. product placement or receiving advertisements intended for adults such as dating services); c. financial risks (e.g. fraud or spending large amounts of money without the knowledge or consent of their guardians); d. security risks; and e. risks related to
the purchase and consumption of drugs, medicines, alcohol, and other illegal or
dangerous products. Consumer risks also include risks related to contracts, for
example the sale of users’ data or unfair terms and conditions.
100.
Cross-cutting risks:
These are risks that cut across all risk categories and are
considered highly problematic as they may significantly affect minors’ lives in
multiple ways. They are:
a.
Advanced technology risks
involve minors encountering new dangers as
technology develops, such as AI chatbots that might provide harmful
information or be used for grooming by exploiting vulnerabilities, or the use of
biometric technologies that can lead to abuse, identity fraud and exclusion.
b.
Health and wellbeing risks
include potential harm to minors' mental,
emotional, or physical well-being. For example, increased obesity/anorexia and
mental health issues linked to the use or excessive use of online platforms, which
may in some cases result in negative impacts for minors’ physical and mental
health and wellbeing, such as addiction, depression, anxiety disorders,
deregulated sleep patterns and social isolation.
c.
Additional privacy and data protection risks
stem from access to information
about minors and the danger of geolocation features that predators could exploit
to locate and approach minors.
101. Other cross-cutting risks (103) can also include:
a.
Additional safety and security risks
relate to minors’ safety, particularly
physical safety, as well as all cybersecurity issues.
b.
Misuse risks
relate to risks or harms to minors stemming from the misuse of the
online platform, or its features.
(
103
) Livingstone, S., & Stoilova, M. (2021).
The 4Cs: Classifying Online Risk to Children.
(CO:RE Short
Report Series on Key Topics). Hamburg: Leibniz-Institut für Medienforschung | Hans-Bredow-Institut
(HBI); CO:RE - Children Online: Research and Evidence. https://doi.org/10.21241/ssoar.71817