TY - JOUR
T1 - Countering online hate speech
T2 - How does human rights due diligence impact terms of service?
AU - Nave, Eva
AU - Lane, Lottie
PY - 2023/11
Y1 - 2023/11
N2 - The Internet is a global forum largely governed by private actors driven by profit concerns, often disregarding the human rights of historically marginalised communities. Increased attention is being paid to the corporate human rights due diligence (HRDD) responsibilities applicable to online platforms countering illegal online content, such as hate speech. At the European Union (EU) level, cross-sector initiatives regulate the rights of marginalised groups and establish HRDD responsibilities for online platforms to expeditiously identify, prevent, mitigate, remedy and remove online hate speech. These initiatives include the Digital Services Act, the Audiovisual Media Services Directive, the proposed Directive on Corporate Sustainability Due Diligence, the proposed Artificial Intelligence Act and the Code of conduct on countering illegal hate speech online. Nevertheless, the HRDD framework applicable to online hate speech has focused mostly on the platforms' responsibilities throughout the course of their operations: guidance regarding HRDD requirements concerning the regulation of hate speech in the platforms' Terms of Service (ToS) is missing. This paper employs a conceptualisation of criminal hate speech as explained in the Council of Europe Committee of Ministers' Recommendation CM/Rec(2022)16, Paragraph 11, to develop specific HRDD responsibilities. We argue that online platforms should, as part of emerging preventive HRDD responsibilities within Europe, respect the rights of historically oppressed communities by aligning their ToS with the conceptualisation of criminal hate speech in European human rights standards.
AB - The Internet is a global forum largely governed by private actors driven by profit concerns, often disregarding the human rights of historically marginalised communities. Increased attention is being paid to the corporate human rights due diligence (HRDD) responsibilities applicable to online platforms countering illegal online content, such as hate speech. At the European Union (EU) level, cross-sector initiatives regulate the rights of marginalised groups and establish HRDD responsibilities for online platforms to expeditiously identify, prevent, mitigate, remedy and remove online hate speech. These initiatives include the Digital Services Act, the Audiovisual Media Services Directive, the proposed Directive on Corporate Sustainability Due Diligence, the proposed Artificial Intelligence Act and the Code of conduct on countering illegal hate speech online. Nevertheless, the HRDD framework applicable to online hate speech has focused mostly on the platforms' responsibilities throughout the course of their operations: guidance regarding HRDD requirements concerning the regulation of hate speech in the platforms' Terms of Service (ToS) is missing. This paper employs a conceptualisation of criminal hate speech as explained in the Council of Europe Committee of Ministers' Recommendation CM/Rec(2022)16, Paragraph 11, to develop specific HRDD responsibilities. We argue that online platforms should, as part of emerging preventive HRDD responsibilities within Europe, respect the rights of historically oppressed communities by aligning their ToS with the conceptualisation of criminal hate speech in European human rights standards.
U2 - 10.1016/j.clsr.2023.105884
DO - 10.1016/j.clsr.2023.105884
M3 - Article
SN - 0267-3649
VL - 51
JO - Computer Law & Security Review
JF - Computer Law & Security Review
M1 - 105884
ER -