Source: European Parliament
The Digital Services Act (DSA)[1] sets out rules for providers of intermediary services to tackle illegal content, while safeguarding freedom of expression.
The DSA imposes enhanced ‘due diligence’ obligations on providers of very large online platforms, including conducting risk assessments and putting in place mitigation measures tailored to the risks identified, including risks related to recommender systems.
On 7 January 2025, Meta publicly announced the introduction in the United States of a system based on ‘Community Notes’ as a replacement for the third-party fact-checking Meta had previously used. Based on the information available to the Commission, this policy does not currently apply in the EU.
In addition, Meta has informed the Commission of changes to its content policy and political content control on Facebook and Instagram.
These changes apply globally, including in the EU. The Commission has received Meta’s ad hoc risk assessment reports in relation to these changes and is reviewing them.
In 2024, the Commission initiated formal proceedings against Meta, including in relation to the suspicion that Meta demotes political content in the recommender systems of Facebook and Instagram.[2] The Commission is monitoring the functioning of Meta’s services to ensure compliance with the DSA.
Finally, the Code of Conduct on Disinformation sets out a robust set of commitments to fight disinformation while respecting freedom of expression.[3]
Following the request of the signatories of the Code, the Commission and the European Board for Digital Services endorsed the Code in February 2025 as a code of conduct within the meaning of Article 45 DSA.
Adherence to the Code may therefore constitute a mitigation measure within the meaning of Article 35 DSA.