Who is responsible for tackling online incitement to racist violence?

By Joël Le Déroff, Senior Advocacy Officer at ENAR

31 March 2016 - When we talk about online hate speech, a number of complex questions emerge: how can or should victims and the organisations that support them react? What is the role of IT and social media companies? And how can laws best be enforced?

“Hate speech” usually refers to forms of expression that are motivated by, demonstrate or encourage hostility towards a group, or towards a person because of their perceived membership of that group. Hate speech may encourage or accompany hate crime; the two phenomena are interlinked. Hate speech that directly constitutes incitement to racist violence or hatred is criminalised under European law.

In the case of online incitement, several questions make the response of victims and of law enforcement and prosecution authorities particularly complex.

Firstly, should we rely on self-regulation, based on IT and social media companies’ terms of service? These are a useful regulatory tool, but they are no substitute for law enforcement. If we rely only on self-regulation, legal provisions will in practice cease to have an impact in the realm of online public communication. Even if hateful content were regularly taken down, perpetrators would enjoy impunity. In addition, the criteria for removing problematic content would end up being defined independently of the law and of the usual proportionality and necessity checks that should apply to any restriction of freedoms.

Secondly, do IT and social media companies have criminal liability if they don’t react appropriately? They are not the direct authors or instigators of incitement. However, EU law provides that "Member States shall take the measures necessary to ensure that aiding and abetting in the commission of the conduct [incitement] is punishable." [1] How should this be interpreted? Can it make online service providers responsible?


Lastly, applying hate speech law provisions is difficult in the absence of investigation and prosecution guidelines that would allow for a correct assessment of cases. How should police forces be equipped to deal with the reality of online hate speech, and how should IT and social media companies cooperate?

There is no easy answer. One thing is clear, though. We urgently need efficient reactions against the propagation of hate speech, by implementing relevant legislation and ensuring investigation and prosecution. Not doing this can lead to impunity and escalation, as hate incidents have the potential to reverberate among followers of the perpetrator, spread fear and intimidation, and increase the risk of additional violent incidents.

The experience of ENAR’s members and partners provides evidence that civil society initiatives can provide ideas and tools. They can also lead the way in terms of creating counter-narratives to hate speech. At the same time, NGOs are far from having the resources to systematically deal with the situation. Attempts by public authorities and IT companies to put the burden of systematic reporting and assessment of cases on NGOs would amount to shirking their own responsibilities.

Among the interesting civil society experiences, the “Get the Trolls Out” project (coordinated by Media Diversity Institute and the International Center For Journalists) makes it possible to flag cases to website hosts and report to appropriate authorities. CEJI also publishes op-eds, produces counter-narratives and uses case reports for pedagogical purposes.

Run by a consortium of NGOs and universities, C.O.N.T.A.C.T. is another project that allows victims or witnesses to report hate incidents in as many as 10 European countries (Cyprus, Denmark, Greece, Italy, Lithuania, Malta, Poland, Romania, Spain and the UK). However, despite the fact that it is funded by the European Commission, the reports are not directly communicated to law enforcement authorities.


The Light On project has developed tools to identify and assess the gravity of racist symbols, images and speech in the propagation of stigmatising ideas and violence. The project has also devised training and assessment tools for the police and the judiciary.

But these initiatives lack the resources to reach all the competent public services in Europe. Similarly, exchanges between the anti-racism movement and IT companies are far from systematic. Here too, some practices are emerging, but there have been problematic incidents in which social media companies such as Twitter and Facebook refused to take down content breaching criminal law. These cases do not represent the norm and are not a sign of general ill will. Rather, they highlight the need for clarification, based on the enforcement of human rights-based legislative standards on hate speech.

Cooperation is essential. Implementing criminal liability for IT companies that refuse to take down content inciting violence and hatred is one tool. However, this is complex – some companies are not based in the EU – and it cannot be the only solution.

A range of additional measures is needed, including allocating targeted resources within law enforcement bodies and support services, such as systematically and adequately trained cyber police forces and psychologists. Public authorities should also build on civil society experience and create universally accessible reporting mechanisms, including apps and third-party reporting systems. NGO initiatives have also produced case-processing methodologies, which can be adapted to the role of different stakeholders, from community and victim support organisations to the different components of the criminal justice system.

Targeted awareness raising is also extremely important, to help these stakeholders distinguish what is legal from what is not. In all these actions, involving anti-racism and community organisations is a precondition for effectiveness.

[1] Article 2 (2) of the Framework Decision 2008/913/JHA on combating racism and xenophobia.

