Ireland’s Mandate on Social Media to Curb Terrorist Content: The Terrorist Content Online Regulation and its Bearing on Free Speech
Srishti Sherpa – LLM in Intellectual Property Law, University of Edinburgh (2024–2025)
Background
Freedom of expression is actively upheld by regulatory authorities, and rightly so. The concern, however, is that online interactions – sharing content, engaging with extremist narratives or communicating with members of banned groups – could be misinterpreted as support rather than mere expression. This blog explores the fragile space between terrorist content online and the freedom of expression guaranteed under Article 10 of the European Convention on Human Rights (ECHR): can they coexist?
In November 2024, Coimisiún na Meán (the Irish national media authority, hereinafter “the Commission”) announced that it had found TikTok, Elon Musk’s X (formerly Twitter) and Meta’s Instagram to be “exposed to terrorist content” under the Terrorist Content Online Regulation (“TCOR”). Meta’s Facebook was added to the list the following month. As a result, these platforms were ordered to take corrective measures.
Following this ruling, social media platforms must take specific actions to prevent their services from being used to spread terrorist content, and must report to the Commission on the measures taken within three months of receiving the ruling. These measures must respect users’ fundamental rights and be effective, targeted and proportionate. A hosting service provider that is exposed to terrorist content must, among other things, include clauses in its terms and conditions that address the misuse of its services for the public dissemination of terrorist content. This move aligns with broader EU efforts, particularly the Digital Services Act (DSA), which imposes stricter content moderation responsibilities on online platforms.
The Spread of Propaganda
Social media platforms, despite their vibrant communicative potential, have been increasingly used to further terrorists’ goals and spread propaganda. This was confirmed by Von Behr et al. (2013), whose analysis suggested that the internet acts as a “key source of information, communication and of propaganda for their extremist beliefs”.
Following the Hamas attacks on Israel in October 2023 and the arrests in Germany in connection with an alleged coup plot, concern about terrorist content on social media has grown. The challenge lies in pinpointing when online engagement turns into dangerous propaganda. Given their reach and influence, platforms can easily amplify content that radicalises impressionable users.
Understanding the TCOR
In 2018, the European Council called for legislation to combat illegal content online. In response, the European Commission proposed the TCOR, which was adopted as Regulation (EU) 2021/784 and has applied since June 2022; in Ireland it forms part of the Commission’s wider Online Safety Framework. TCOR enables national authorities to issue removal orders and obliges hosting services to act promptly. A provider is “exposed” if it receives two or more final removal orders within 12 months. In such cases, the provider must amend its policies and take proactive measures to prevent further violations.
Article 2(7) of the TCOR defines “terrorist content” as material that incites, solicits or contributes to terrorist offences, promotes participation in terrorist groups, or provides instruction in committing acts of terrorism. Yet critics argue that terms like “glorification” are vague. UN special rapporteurs and legal experts have raised concerns about the lack of clarity and the burden placed on private platforms to police content. The risk? Over-removal and uneven enforcement across EU states.
The Human Rights Perspective: Article 10 ECHR
While TCOR aims to standardise the rules governing the removal of terrorist content, it must align with Article 10 of the ECHR, which safeguards the right to hold opinions and to receive and impart information. The propagation of terrorist content could easily shelter under the freedom to “share and receive ideas”. However, this right must be exercised responsibly: under Article 10(2), it may be subject to limitations where it raises concerns about national security, public order, health or morals.
Concerns remain about vague terminology and excessive reliance on automated moderation, which can lead to censorship of lawful speech. To counter this, a narrower, more precise definition of terrorist content – aligned with Article 19(3) of the International Covenant on Civil and Political Rights (ICCPR) – would offer stronger legal certainty. Adopting the Rabat Plan of Action’s six-part threshold test could help distinguish between expression and incitement.
TCOR Meets the DSA: Strengthening Accountability and Safeguards
The Digital Services Act (DSA), fully applicable since February 2024, introduces transparency obligations, risk assessments and independent audits for Very Large Online Platforms (VLOPs). These mechanisms can complement TCOR by enhancing procedural fairness and user protections. Platforms under the DSA must provide users with clear explanations for content removals and offer accessible appeals processes. This limits the risk of arbitrary censorship and allows for meaningful redress.
Together, TCOR and the DSA create a layered enforcement regime: TCOR focuses on urgent response and content takedown, while the DSA ensures accountability, transparency, and fairness. Integrated effectively, the two frameworks can work in tandem to address security threats while respecting fundamental rights.
Conclusion: Towards a Safer Future
TCOR represents a vital step in tackling the spread of terrorist propaganda online, but its success hinges on legal clarity, proportionate enforcement and strong safeguards. The DSA’s transparency and redress mechanisms offer a path forward. Together, these laws can form a balanced regulatory system in which freedom of expression and digital safety are mutually reinforcing rather than mutually exclusive. Ultimately, with clear legal definitions, robust procedural safeguards and proportionate enforcement, the TCOR and the protection of freedom of expression under Article 10 ECHR can and must coexist.