CHAPTER III – Due diligence obligations for a transparent and safe online environment (Art. 11 – 48)
- Section 1 – Provisions applicable to all providers of intermediary services
- Article 11 – Points of contact for Member States’ authorities, the Commission and the Board
- Article 12 – Points of contact for recipients of the service
- Article 13 – Legal representatives
- Article 14 – Terms and conditions
- Article 15 – Transparency reporting obligations for providers of intermediary services
- Section 2 – Additional provisions applicable to providers of hosting services, including online platforms
- Article 16 – Notice and action mechanisms
- Article 17 – Statement of reasons
- Article 18 – Notification of suspicions of criminal offences
- Section 3 – Additional provisions applicable to providers of online platforms
- Article 19 – Exclusion for micro and small enterprises
- Article 20 – Internal complaint-handling system
- Article 21 – Out-of-court dispute settlement
- Article 22 – Trusted flaggers
- Article 23 – Measures and protection against misuse
- Article 24 – Transparency reporting obligations for providers of online platforms
- Article 25 – Online interface design and organisation
- Article 26 – Advertising on online platforms
- Article 27 – Recommender system transparency
- Article 28 – Online protection of minors
- Section 4 – Additional provisions applicable to providers of online platforms allowing consumers to conclude distance contracts with traders
- Article 29 – Exclusion for micro and small enterprises
- Article 30 – Traceability of traders
- Article 31 – Compliance by design
- Article 32 – Right to information
- Section 5 – Additional obligations for providers of very large online platforms and of very large online search engines to manage systemic risks
- Article 33 – Very large online platforms and very large online search engines
- Article 34 – Risk assessment
- Article 35 – Mitigation of risks
- Article 36 – Crisis response mechanism
- Article 37 – Independent audit
- Article 38 – Recommender systems
- Article 39 – Additional online advertising transparency
- Article 40 – Data access and scrutiny
- Article 41 – Compliance function
- Article 42 – Transparency reporting obligations
- Article 43 – Supervisory fee
- Section 6 – Other provisions concerning due diligence obligations
- Article 44 – Standards
- Article 45 – Codes of conduct
- Article 46 – Codes of conduct for online advertising
- Article 47 – Codes of conduct for accessibility
- Article 48 – Crisis protocols
CHAPTER IV – Implementation, cooperation, penalties and enforcement (Art. 49 – 88)
- Section 1 – Competent authorities and national Digital Services Coordinators
- Article 49 – Competent authorities and Digital Services Coordinators
- Article 50 – Requirements for Digital Services Coordinators
- Article 51 – Powers of Digital Services Coordinators
- Article 52 – Penalties
- Article 53 – Right to lodge a complaint
- Article 54 – Compensation
- Article 55 – Activity reports
- Section 2 – Competences, coordinated investigation and consistency mechanisms
- Article 56 – Competences
- Article 57 – Mutual assistance
- Article 58 – Cross-border cooperation among Digital Services Coordinators
- Article 59 – Referral to the Commission
- Article 60 – Joint investigations
- Section 3 – European Board for Digital Services
- Article 61 – European Board for Digital Services
- Article 62 – Structure of the Board
- Article 63 – Tasks of the Board
- Section 4 – Supervision, investigation, enforcement and monitoring in respect of providers of very large online platforms and of very large online search engines
- Article 64 – Development of expertise and capabilities
- Article 65 – Enforcement of obligations of providers of very large online platforms and of very large online search engines
- Article 66 – Initiation of proceedings by the Commission and cooperation in investigation
- Article 67 – Requests for information
- Article 68 – Power to take interviews and statements
- Article 69 – Power to conduct inspections
- Article 70 – Interim measures
- Article 71 – Commitments
- Article 72 – Monitoring actions
- Article 73 – Non-compliance
- Article 74 – Fines
- Article 75 – Enhanced supervision of remedies to address infringements of obligations laid down in Section 5 of Chapter III
- Article 76 – Periodic penalty payments
- Article 77 – Limitation period for the imposition of penalties
- Article 78 – Limitation period for the enforcement of penalties
- Article 79 – Right to be heard and access to the file
- Article 80 – Publication of decisions
- Article 81 – Review by the Court of Justice of the European Union
- Article 82 – Requests for access restrictions and cooperation with national courts
- Article 83 – Implementing acts relating to Commission intervention
- Section 5 – Common provisions on enforcement
- Article 84 – Professional secrecy
- Article 85 – Information sharing system
- Article 86 – Representation
- Section 6 – Delegated and implementing acts
- Article 87 – Exercise of the delegation
- Article 88 – Committee procedure
Art. 34 DSA
Risk assessment
- Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services.
They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:
(a) the dissemination of illegal content through their services;
(b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;
(c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;
(d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
- When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:
(a) the design of their recommender systems and any other relevant algorithmic system;
(b) their content moderation systems;
(c) the applicable terms and conditions and their enforcement;
(d) systems for selecting and presenting advertisements;
(e) data related practices of the provider.
The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.
- Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of establishment.
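Purely as an illustration, and not as anything prescribed by the Regulation, the sketch below shows one way a provider's compliance team might structure an Article 34 assessment record: the risk categories of paragraph 1, the influencing factors of paragraph 2, and the three-year retention of supporting documents. All class and field names are hypothetical.

```python
# Hypothetical sketch of an Article 34 risk-assessment record.
# All names are illustrative; nothing here is mandated by the DSA.
from dataclasses import dataclass, field
from datetime import date, timedelta
from enum import Enum


class SystemicRisk(Enum):
    """Risk categories enumerated in Article 34(1)(a)-(d)."""
    ILLEGAL_CONTENT = "dissemination of illegal content"
    FUNDAMENTAL_RIGHTS = "negative effects on the exercise of fundamental rights"
    CIVIC_DISCOURSE_ELECTIONS_SECURITY = "civic discourse, electoral processes, public security"
    HEALTH_MINORS_WELLBEING = "gender-based violence, public health, minors, physical and mental well-being"


class InfluencingFactor(Enum):
    """Factors listed in Article 34(2)(a)-(e)."""
    RECOMMENDER_SYSTEMS = "design of recommender and other algorithmic systems"
    CONTENT_MODERATION = "content moderation systems"
    TERMS_AND_ENFORCEMENT = "applicable terms and conditions and their enforcement"
    ADVERTISING_SYSTEMS = "systems for selecting and presenting advertisements"
    DATA_PRACTICES = "data related practices of the provider"


@dataclass
class RiskAssessment:
    """One assessment cycle: at least yearly, and before deploying functionalities
    likely to have a critical impact on the identified risks."""
    performed_on: date
    risks: dict[SystemicRisk, str]          # severity/probability analysis per risk
    factors: dict[InfluencingFactor, str]   # how each factor influences the risks
    regional_aspects: str = ""              # Member-State-specific or linguistic aspects
    supporting_documents: list[str] = field(default_factory=list)

    def retention_deadline(self) -> date:
        # Supporting documents must be preserved for at least three years
        # (approximated here as 3 * 365 days).
        return self.performed_on + timedelta(days=3 * 365)

    def must_still_be_retained(self, today: date) -> bool:
        return today <= self.retention_deadline()
```

A compliance workflow could create one such record per yearly cycle (and per critical feature launch) and use the retention check before discarding supporting documentation requested by the Commission or the Digital Services Coordinator of establishment.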