Source: European Parliament
The crisis response mechanism outlined in Article 36 of the Digital Services Act (DSA)[1] only applies in extraordinary circumstances that pose a serious threat to public security or public health in the EU.
Activation of this mechanism requires a recommendation from the European Board for Digital Services[2], which comprises independent national regulators. No such recommendation has been issued to date, and the provision has therefore never been applied.
If the Board recommends activating the crisis response mechanism, the Commission may require providers of very large online platforms or search engines to assess whether their services contribute significantly to the serious threat. If so, they are to identify and apply appropriate mitigation measures.
The DSA does not prescribe the measures to be taken by the provider, but sets out effective safeguards for fundamental rights. As stated in Article 36, proposed measures must be strictly necessary, justified and proportionate, and must respect the fundamental rights enshrined in the Charter[3]. Such measures must also be limited to a reasonable period not exceeding three months.
The Commission monitors the application of mitigation measures, regularly updates the Board, and reports to the European Parliament and the Council on the application of the measures taken by providers.
Regarding content removal, the DSA does not define what type of content users may or may not post online. Instead, it lays down the world’s strongest safeguards of users’ rights online, for example by requiring platforms to publish online[4] statements of reasons for their content moderation decisions, and to provide complaint mechanisms through which users can contest those decisions, ensuring due process.