
The Impact of the EU Digital Services Act on Combatting Hate Speech: Workshop
Explore the implications of the EU Digital Services Act for combating hate speech through a workshop held at the MKD Agency for Audio-Visual Media Services. Learn about the DSA's scope of application, its regulatory system, and the obligations it imposes for addressing hate speech online.
Presentation Transcript
New Rules for the Internet: The Impact of the EU Digital Services Act on Combatting Hate Speech
Workshop at the MKD Agency for Audio-Visual Media Services
Digital Services Act (DSA) - Scope of Application
- The DSA forms a comprehensive set of compliance rules that apply across the whole EU.
- Obligations apply to all services offered to recipients located within the EU, irrespective of where the providers have their place of establishment.
- Digital services include a large category of online services, from simple websites to internet infrastructure services and online platforms.
- The DSA mainly addresses online intermediaries and platforms, e.g. online marketplaces, social networks and content-sharing platforms.
- The DSA does not define hate speech. Obligations refer to illegal content, which is any information that does not comply with Union law or the law of a Member State.
Digital Services Act - Regulatory Systems
- The DSA contains graduated obligations, dependent on the type, size and impact of the service.
- Core provisions address online platforms, including marketplaces.
- Very large online platforms (VLOPs) and search engines (VLOSEs) are the most strongly regulated.
- The tiers: all intermediary services, hosting services, online platforms, very large online platforms and search engines.
Digital Services Act - Regulatory Systems
Very large online platforms (VLOPs) and search engines (VLOSEs)
- Average monthly active recipients of the service in the Union equal to or higher than 45 million.
- The COM adopted decisions to designate VLOPs and VLOSEs.
- To date: 20 VLOPs (Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Pornhub, Snapchat, Stripchat, TikTok, X, Xvideos, Wikipedia, YouTube, Zalando) and 2 VLOSEs (Bing, Google Search).
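To make the designation threshold concrete, here is a minimal sketch that checks whether a service would cross the 45 million mark. It assumes a hypothetical list of monthly active-recipient figures and a six-month averaging window; the data source, the window and the function name are illustrative assumptions, not taken from the slide.

```python
# Minimal sketch, assuming a hypothetical per-month recipient count supplied by
# the provider; the six-month averaging window is an assumption for illustration.

VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the Union

def meets_vlop_threshold(monthly_active_recipients_eu: list[int]) -> bool:
    """Return True if the average over the supplied months reaches the threshold."""
    if not monthly_active_recipients_eu:
        return False
    average = sum(monthly_active_recipients_eu) / len(monthly_active_recipients_eu)
    return average >= VLOP_THRESHOLD

# Example: six monthly figures hovering around 46 million cross the threshold.
print(meets_vlop_threshold(
    [44_000_000, 46_500_000, 47_200_000, 45_900_000, 46_100_000, 45_300_000]
))  # True
```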
Digital Services Act - Overview
- Chapter I: General Provisions
- Chapter II: Liability of Providers of Intermediary Services
- Chapter III: Due Diligence Obligations for a Transparent and Safe Online Environment
- Chapter IV: Implementation, Cooperation, Penalties and Enforcement
- Chapter V: Final Provisions
Digital Services Act - Measures against hate speech
- Chapter I: General Provisions
- Chapter II: Liability of Providers of Intermediary Services
- Chapter III: Due Diligence Obligations for a Transparent and Safe Online Environment
- Chapter IV: Implementation, Cooperation, Penalties and Enforcement
- Chapter V: Final Provisions
Digital Services Act - Measures against hate speech
- Art. 6 - Basic principle: a hosting provider (e.g. an online platform) is not liable for the content stored at the request of its users, unless the hosting provider has actual knowledge of illegal activities or content.
- Art. 9 - The DSA does not provide a legal basis for judicial or administrative orders to act against one or more specific items of illegal content.
- Art. 10 - The DSA does not provide a legal basis for judicial or administrative orders to provide specific information about one or more specific individual recipients.
- The DSA only provides minimum standards for these orders (e.g. language, information of the parties concerned).
Digital Services Act - Measures against hate speech
- Chapter I: General Provisions
- Chapter II: Liability of Providers of Intermediary Services
- Chapter III: Due Diligence Obligations for a Transparent and Safe Online Environment
- Chapter IV: Implementation, Cooperation, Penalties and Enforcement
- Chapter V: Final Provisions
Digital Services Act - Measures against hate speech
Art. 16 - Notice and action mechanisms
- Addressees: providers of hosting services
- Paragraph 1 (notice): providers shall put mechanisms in place to allow any individual or entity to notify them of the presence of information that the individual or entity considers to be illegal content. The mechanisms must be easy to access and user-friendly, and must allow the submission of notices exclusively by electronic means.
- Paragraph 6 (action): providers shall process any notices and take their decisions in a timely, diligent, non-arbitrary and objective manner.
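As a rough illustration of the notice-and-action flow described on this slide, the sketch below models an electronic notice and its intake queue. The Notice fields, the queue and the function name are assumptions made for this example; they are not the DSA's wording or any platform's actual API.

```python
# Illustrative sketch of an Art. 16-style electronic notice intake; field names
# and the in-memory queue are assumptions, not a real platform interface.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Notice:
    content_url: str               # where the allegedly illegal content can be found
    explanation: str               # why the notifier considers the content illegal
    reporter_contact: str | None   # contact details, if the notifier chooses to give them
    submitted_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

REVIEW_QUEUE: list[Notice] = []

def submit_notice(content_url: str, explanation: str,
                  reporter_contact: str | None = None) -> Notice:
    """Electronic submission point: easy to access and user-friendly (paragraph 1)."""
    notice = Notice(content_url, explanation, reporter_contact)
    REVIEW_QUEUE.append(notice)  # queued for timely, diligent, non-arbitrary review (paragraph 6)
    return notice
```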
Digital Services Act - Measures against hate speech
Art. 17 - Statement of reasons
- Addressees: providers of hosting services
- The provider shall provide a clear and specific statement of reasons to any affected recipient for any of the following restrictions:
  - restrictions of the visibility of specific items of information (removal of content, disabling access to content, or demoting content)
  - suspension or termination of the provision of the service
  - suspension or termination of the recipient's account
- This includes measures against illegal content as well as violations of the provider's community standards.
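The restriction types listed on this slide lend themselves to a simple data model. The sketch below is illustrative only: the class and field names are assumptions, while the restriction types and the two grounds (illegal content vs. community standards) are taken from the slide.

```python
# Minimal sketch of an Art. 17-style statement of reasons; names are assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class RestrictionType(Enum):
    VISIBILITY_RESTRICTION = auto()  # removal, disabling access, or demoting content
    SERVICE_SUSPENSION = auto()      # suspension or termination of the provision of the service
    ACCOUNT_SUSPENSION = auto()      # suspension or termination of the recipient's account

class Ground(Enum):
    ILLEGAL_CONTENT = auto()         # measure taken against allegedly illegal content
    COMMUNITY_STANDARDS = auto()     # measure based on the provider's own terms

@dataclass
class StatementOfReasons:
    recipient_id: str
    restriction: RestrictionType
    ground: Ground
    reasons: str                     # the clear and specific explanation sent to the recipient
```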
Digital Services Act - Measures against hate speech
Art. 18 - Notification of suspicions of criminal offences
- Addressees: providers of hosting services
- The provider shall promptly inform the law enforcement or judicial authorities of the Member State concerned where it becomes aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place or is likely to take place.
- Member State concerned: the Member State in which the offence is suspected to have taken place, or where the offender or the victim resides or is located.
Digital Services Act - Measures against hate speech
Art. 23 - Measures and protection against misuse
- Addressees: providers of online platforms
- Providers shall suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients that frequently provide manifestly illegal content.
- Manifestly illegal: it is evident to a layperson, without any substantive analysis, that the content is illegal.
- Minimum standard: providers are free to establish stricter rules in their community standards.
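A provider's misuse policy under Art. 23 could be sketched as a simple warn-then-suspend rule. The strike threshold and suspension length below are invented for illustration; the slide only requires a prior warning and a reasonable suspension period, leaving the concrete values to the provider.

```python
# Illustrative warn-then-suspend rule; the threshold and period are assumptions.
from datetime import timedelta

WARNING_THRESHOLD = 3                    # assumed number of manifestly illegal items before a warning
SUSPENSION_PERIOD = timedelta(days=30)   # assumed "reasonable period of time"

def next_action(manifestly_illegal_count: int, already_warned: bool) -> str:
    """Decide the next step for a recipient who keeps posting manifestly illegal content."""
    if manifestly_illegal_count < WARNING_THRESHOLD:
        return "no action"
    if not already_warned:
        return "issue prior warning"     # a prior warning must precede any suspension
    return f"suspend service for {SUSPENSION_PERIOD.days} days"

print(next_action(4, already_warned=False))  # issue prior warning
print(next_action(5, already_warned=True))   # suspend service for 30 days
```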
Digital Services Act - Measures against hate speech
Art. 28 - Online protection of minors
- Addressees: providers of online platforms accessible to minors
- Providers shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors.
- They should consider best practices and available guidance (Commission communication: a new European strategy for a better internet for kids).
- They should not present advertisements based on profiling using personal data.
Digital Services Act - Measures against hate speech
Art. 34 - Risk assessment
- Addressees: providers of very large online platforms and search engines
- Providers shall identify, analyse and assess any systemic risks stemming from the design or functioning of the service and its related systems.
- At least once a year.
- Providers shall preserve the supporting documents for at least three years and give them, upon request, to the COM and the DSC.
Digital Services Act - Measures against hate speech
Four categories of systemic risks:
1. Dissemination of illegal content
2. Impact on the exercise of fundamental rights (esp. human dignity, freedom of expression and information, the right to private life, data protection, the right to non-discrimination)
3. Negative effects on democratic processes, civic discourse, electoral processes and public security
4. Negative effects on the protection of public health and minors, serious negative consequences for a person's physical and mental well-being, or gender-based violence
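For a compliance team, the four risk categories and the yearly assessment duty from Art. 34 could be captured in a minimal record like the one below. The structure and names are assumptions for illustration; the DSA does not prescribe a reporting format.

```python
# Minimal sketch of a yearly systemic risk assessment record; names are assumptions.
from dataclasses import dataclass
from enum import Enum, auto

class SystemicRisk(Enum):
    ILLEGAL_CONTENT = auto()           # dissemination of illegal content
    FUNDAMENTAL_RIGHTS = auto()        # impact on the exercise of fundamental rights
    CIVIC_DISCOURSE = auto()           # democratic processes, civic discourse, elections, public security
    HEALTH_MINORS_WELLBEING = auto()   # public health, minors, well-being, gender-based violence

@dataclass
class RiskAssessment:
    year: int                          # carried out at least once a year
    risk: SystemicRisk
    findings: str
    retention_years: int = 3           # documents preserved at least three years for the COM and the DSC
```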
Digital Services Act - Measures against hate speech
Art. 35 - Mitigation of risks
- Addressees: providers of very large online platforms and search engines
- Providers shall put in place reasonable and effective mitigation measures, tailored to the specific systemic risks identified, such as:
  - adapting the design, features or functioning of their service
  - adapting their terms and conditions
  - adapting content moderation processes
  - taking awareness-raising measures
  - ensuring that an item of information that falsely appears to a person to be authentic or truthful is distinguishable through prominent markings
Digital Services Act - Enforcement
- Chapter I: General Provisions
- Chapter II: Liability of Providers of Intermediary Services
- Chapter III: Due Diligence Obligations for a Transparent and Safe Online Environment
- Chapter IV: Implementation, Cooperation, Penalties and Enforcement
- Chapter V: Final Provisions
Digital Services Act - Enforcement - Competences
National Digital Services Coordinators (DSCs):
- The Member State in which the main establishment of the provider is located has the exclusive powers to supervise and enforce the DSA; for VLOPs and VLOSEs, only where the COM has not initiated proceedings.
- Member States shall designate one or more competent authorities to be responsible by 17 February 2024.
- The DSC shall act with complete independence.
European Commission:
- Exclusive powers to supervise and enforce the special provisions against VLOPs and VLOSEs.
- Powers to supervise and enforce any other provisions against VLOPs and VLOSEs.
Digital Services Act - Enforcement - Proceedings initiated by the COM
- 18.12.2023: The COM opens formal proceedings against X under the DSA, in areas linked to risk management, content moderation, dark patterns, advertising transparency and data access for researchers. If proven, these failures would constitute infringements of Articles 34(1), 34(2) and 35(1), 16(5) and 16(6), 25(1), 39 and 40(12) of the DSA.
- 19.02.2024: The COM opens formal proceedings against TikTok under the DSA, in areas linked to the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content. If proven, these failures would constitute infringements of Articles 34(1), 34(2), 35(1), 28(1), 39(1) and 40(12) of the DSA.
Digital Services Act - Enforcement - The German DSA
- The obligations of the DSA are directly applicable in every Member State.
- But Member States are obliged to designate one or more competent authorities, lay down the rules on penalties, and lay down rules for procedures.
- An independent coordination office for digital services is to be created within the Bundesnetzagentur (Germany's main authority for infrastructure, which promotes competition in the markets for energy, telecommunications, post and railways).
- The Act is currently being discussed in Parliament and is expected to enter into force in April/May 2024.
Thank you for your attention!
Contact: Federal Ministry of Justice, Division III B 7, Mohrenstr. 37, 10117 Berlin
Contact person: Claire Moselage, LL.M.
moselage-cl@bmj.bund.de | www.bmj.de | Tel. +49 (0) 30 18 580 9372