Committee on Digitalisation and IT 2024-25
DIU Alm.del Bilag 164
Public
Social media, search engines, and online marketplaces offer many benefits to users. In the EU, these
online platforms are regulated through the Digital Services Act to limit real-life risks, such as online
fraud, the sale of unsafe products, and risks to the safety, security and privacy of minors. A key focus
of the Digital Services Act is on transparency and accountability of the policies and decisions of online
platforms, and on the empowerment of users to be informed about and potentially appeal content
moderation decisions. Protection of free speech online is guaranteed through concrete safeguards
on the speech of users online against arbitrary or untransparent content moderation decisions, in
accordance also with the principles of the EU Charter of Fundamental Rights.
The core elements of the EU Digital Services Act (DSA)
The DSA empowers users online. If online platforms decide to moderate a piece of content or
an account, by removing content, down-ranking it, or through shadow-banning, users have a
right to know why and how to appeal. If disputes with online platforms on moderation
decisions persist, users also have the right to external appeals through out-of-court dispute
settlement bodies, a faster and more cost-effective way to settle disputes than court
proceedings.
Do users in the EU appeal the content moderation decisions of online platforms?
Internal platform data delivered under the DSA shows that users actively use this right and that
online platforms regularly overturn their own decisions. For example, thanks to the new appeal
mechanism that online platforms have to provide under the DSA, EU users challenged more
than 16 million content removal decisions by Meta and TikTok in the second half of 2024.
Thanks to the DSA, almost 35% of these challenged removals were overturned and the content
was restored.
The DSA mandates new flagging mechanisms for illegal content or goods online. With the
DSA, users can report illegal content, goods or services via an easy-to-use mechanism directly
on the online platforms. Online platforms retain the ultimate decision on whether or not to
take action on the content, but they must now inform users of their content moderation
decision and include the available ways to appeal it.
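Article 16 of the DSA spells out what such a notice must contain: a sufficiently substantiated explanation of why the content is considered illegal, the exact electronic location of the content, the name and email address of the notifier (with limited exceptions), and a statement that the notice is made in good faith. The following is a minimal sketch of these elements; the field names and values are our own invention, since the DSA prescribes the substance of a notice, not a technical format:

```python
from dataclasses import dataclass

# Illustrative sketch only: the DSA (Article 16) prescribes what a notice
# must contain, not how it is encoded. All field names are hypothetical.
@dataclass
class IllegalContentNotice:
    explanation: str          # why the notifier considers the content illegal
    content_url: str          # exact electronic location of the content
    notifier_name: str        # notifier identity (with limited exceptions)
    notifier_email: str
    made_in_good_faith: bool  # statement of good-faith belief

notice = IllegalContentNotice(
    explanation="Listing offers a toy recalled under EU product safety rules.",
    content_url="https://marketplace.example/listing/12345",
    notifier_name="Jane Doe",
    notifier_email="jane.doe@example.eu",
    made_in_good_faith=True,
)
```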
The DSA does not define what content is illegal.
What is illegal in the EU is defined in national
law or, exceptionally, EU law, not in the DSA. The DSA does not regulate content. It mandates
online platforms to have processes and procedures in place to deal with notices of illegal
content. Very large online platforms, that is, those with more than 45 million average monthly
users in the EU, must assess the societal risks to which their services give rise, such as risks to
the well-being of minors or risks to public security, with in-built safeguards for free
expression.
How much online content is actually moderated, and by whom?
Thanks to the transparency reporting obligations under the DSA, it is possible to see that:
- The vast majority of online activity is not moderated at all.
- Of the content that is moderated (between 17 million and 40+ million content moderation
decisions per day), more than 99% is moderated proactively by online platforms based on
their own Terms & Conditions.
- Less than 1% of moderation actions result from users reporting content as illegal under
European laws. These actions include online platforms choosing to keep the content up and
rejecting the user notice.
- Less than 0.00015% of moderation actions by very large online platforms and very large
search engines are based on orders (fewer than 3 000 orders) from regulatory authorities,
not governments, to act against illegal content; a back-of-the-envelope check of this figure
follows below. Ten very large online platforms did not receive any orders at all. Where
providers moderate content in response to orders, they must inform the user of the
moderation decision and allow the user to challenge it.
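As a rough sanity check of that last figure: taking the lower bound of 17 million moderation decisions per day cited above and assuming (our assumption) a six-month reporting period of about 180 days, fewer than 3 000 authority orders is indeed a vanishingly small share of all moderation actions. Note that the daily volume covers all reporting platforms, not only the very large ones, so this is an order-of-magnitude sketch, not an exact recomputation:

```python
# Back-of-the-envelope check of the "less than 0.00015%" figure.
# Assumptions (ours): ~180-day reporting period, lower bound of the
# cited daily moderation volume.
decisions_per_day = 17_000_000
reporting_days = 180
total_decisions = decisions_per_day * reporting_days  # ~3.06 billion actions

authority_orders = 3_000  # "fewer than 3 000 orders"
share = authority_orders / total_decisions

print(f"{share:.7%}")  # ~0.0000980%, comfortably below 0.00015%
```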
The DSA contains world-leading protections for freedom of expression.
The DSA bans general monitoring obligations, preventing all online platforms, including
smaller ones, from being forced to systematically monitor the online content of our citizens
or from being held liable for illegal content they are not aware of. At the same time, the DSA
imposes clear due diligence
obligations on online platforms, and requires the largest platforms and search engines to
regularly assess how their operations could affect free speech online, and to protect it while
addressing these risks. Over-removal of lawful content could constitute a violation of the DSA.
Online platforms retain the right to make final content moderation decisions, provided they
comply with the law and their terms and conditions.
The DSA protects all users in the EU, with special attention paid to empowering and
safeguarding minors. The DSA obliges all online platforms accessible to minors, regardless of
their size, to ensure a high level of privacy, safety and security for minors. Very large online
platforms must also assess and effectively mitigate systemic risks related specifically to the
protection of minors, such as the impact of their systems on minors' mental health. Detailed
guidelines to ensure the protection of minors online have recently been published.
The DSA allows for unprecedented transparency and scrutiny of the behaviour of platforms.
Under the DSA, online platforms must outline the criteria for their recommender
systems in their Terms and Conditions, with very large online platforms providing users with
recommender system options that are not based on profiling. The DSA bans ads based on
sensitive data like sexuality, religion, or race, and prohibits targeted advertising to children.
The DSA also outlaws deceptive online designs, or dark patterns. Online platforms must
publish detailed content moderation data, such as the number of actions taken and their legal
basis, as well as appeal outcomes. They must report on risk mitigation efforts, including
impacts on free speech, and maintain public advertising databases, enabling comprehensive
scrutiny and data access for EU researchers.
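To illustrate the kind of scrutiny this enables: because each moderation decision comes with a statement of reasons, anyone can tally actions by legal basis or by whether automated means were used. A minimal sketch with invented records and field names follows (the real data is published through the Commission's DSA Transparency Database, whose schema differs):

```python
from collections import Counter

# Hypothetical statement-of-reasons records with invented field names,
# standing in for the moderation data platforms must publish under the DSA.
records = [
    {"ground": "terms_and_conditions", "automated": True},
    {"ground": "terms_and_conditions", "automated": True},
    {"ground": "illegal_content_notice", "automated": False},
]

actions_by_ground = Counter(r["ground"] for r in records)
automated_share = sum(r["automated"] for r in records) / len(records)

print(actions_by_ground)  # number of actions per legal basis
print(f"{automated_share:.0%} taken by automated means")
```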
The DSA boosts innovation and saves money for companies. The DSA creates a single legal
framework that applies across the EU, so that companies do not have to comply with 27
different legal regimes. This ‘passporting’ right makes it easier for them to scale up across the
whole bloc. The due diligence obligations in the DSA are proportionate to the size of the online
platform concerned, with exemptions for micro and small enterprises, and specific obligations
for very large online platforms and search engines in view of their reach and the potential
systemic risks they pose to society. The rules reflect industry ‘best practices’ and can be
shaped through Codes of Conduct, which are non-binding, voluntary self-regulatory
instruments. The DSA therefore establishes a level playing field for all online platforms, which
prevents a race to the bottom on user experience.
The DSA applies only in the EU.
EU law only applies in the European Union, and the DSA is no
exception. Online platforms have both the right and ability to implement content moderation
policies tailored to regional laws and cultural contexts. Moreover, in the case of content
removal based on orders by authorities, the DSA makes it clear that these orders should
generally apply only within the issuing Member State’s jurisdiction.
The right to privacy and to human dignity, the protection of minors, security, democracy and freedom
of speech are at the core of our democracies, offline as much as online. The DSA ensures these
fundamental rights are respected online, by defining clear rules and responsibilities for the services
operating in our digital space.
What has the DSA achieved so far?
The DSA has applied to the largest online platforms in the EU since August 2023 and to all
other online platforms since February 2024. Online platforms have since implemented key
DSA provisions in the EU, such as offering clear, plain-language terms and conditions;
providing more content moderation information; allowing users to disable the personalised
feed on their platforms; and disabling targeted ads addressed to minors.
To verify the compliance of the largest online platforms, defined as very large online platforms
and very large search engines under the DSA, the Commission has already opened several
investigations. In some cases, these have led to remarkable changes. For example, in 2024,
TikTok introduced the Rewards Programme, but later withdrew it after the European
Commission launched an investigation into concerns about its potentially addictive design,
especially for children. Following the launch of the investigation against AliExpress in February
2024, the platform made several commitments to improve its safety and comply with the
DSA. These include enhancing the transparency of its advertising and recommender systems
and improving its mechanism for users to flag illegal content.