Social media, search engines, and online marketplaces offer many benefits to users. In the EU, these
online platforms are regulated through the Digital Services Act to limit real-life risks, such as online
fraud, the sale of unsafe products, and risks to the safety, security and privacy of minors. A key focus
of the Digital Services Act is the transparency and accountability of online platforms' policies and decisions, and the empowerment of users to be informed about, and to appeal, content moderation decisions. Free speech online is protected through concrete safeguards that shield users' speech from arbitrary or opaque content moderation decisions, in accordance with the principles of the EU Charter of Fundamental Rights.
The core elements of the EU Digital Services Act (DSA)
• The DSA empowers users online. If online platforms decide to moderate a piece of content or an account, whether by removing content, down-ranking it, or shadow-banning, users have the right to know why and how to appeal. If disputes over moderation decisions persist, users also have the right to an external appeal through out-of-court dispute settlement bodies, a faster and more cost-effective way to settle disputes than court proceedings.
Do users in the EU appeal the content moderation decisions of online platforms?
Internal platform data delivered under the DSA shows that users actively exercise this right and that online platforms regularly overturn their own decisions. For example, through the appeal mechanism that online platforms must provide under the DSA, EU users challenged more than 16 million content removal decisions by Meta and TikTok in the second half of 2024. Almost 35% of these removals were overturned and the content was restored.
• The DSA mandates new flagging mechanisms for illegal content or goods online. With the DSA, users can report illegal content, goods or services via an easy-to-use mechanism directly