Online harms: The new legal framework for addressing 'hate speech' in France and in Germany
Published on 2nd Jun 2020
While the European Commission continues to progress a wide-ranging new legal framework for online content, a number of Member States are taking action at national level to combat specific harms, such as hate speech, in order to increase online safety.
We recently reported that the UK government was pressing ahead with its plans for an Online Harms Bill, including codes of conduct on online terrorism and child sexual exploitation, albeit on a slower timetable than originally envisaged. This Insight summarises how France and Germany intend to combat hate speech.
What laws are being proposed?
|🇫🇷||On 13 May 2020, despite strong disagreements between the two chambers of the French Parliament, the National Assembly adopted a law to tackle the spread of hate speech on the Internet by involving platform operators, search engine operators, Internet users and Internet service providers. The goal is to increase platform and search engine operators’ accountability, under the scrutiny of an independent administrative authority with strong enforcement powers. The law is currently under review by the Constitutional Council.|
|🇩🇪||In an effort to tackle hate speech on social networks, Germany introduced its Network Enforcement Act in 2017, obliging large social networks to put in place robust reporting systems for certain illegal content, block or delete such content when reported, publish regular reports on their efforts to keep illegal content off their platforms, and designate a local representative authorized to receive administrative and court documents. Two separate amendments to this law are currently in the legislative process and likely to be adopted – one to oblige social networks to report illegal content to authorities, and one to enhance platforms’ report handling and accountability obligations. A third legislative proposal, with unclear chances of success, would expand the scope of application to online games and require all covered platforms to identify their registered users.|
Who will it apply to?
|🇫🇷||The new legal framework applies not only to social networks but to any platform that offers the ability to comment on, interact with or share content, and whose activity in France exceeds thresholds to be set by decree of the French Council of State. Laetitia Avia, the member of Parliament who submitted the draft law, has expressed the wish to target platforms with two million unique visitors per month. In addition, despite a divergence between the two French parliamentary chambers on whether search engine operators should be covered, they too are required to comply with this new legal framework.|
The Network Enforcement Act currently applies to general-purpose for-profit platforms that enable users to share content. Special-purpose platforms (such as professional networks, online shops and online games) and journalistic offerings are not covered, even if they too enable users to post their own content. Services for one-to-one communication, such as messenger apps, are also exempt. Social networks with fewer than two million registered users are exempt from most but not all obligations under the Network Enforcement Act.
The proposed amendments would expressly add video sharing platforms to the scope of the legislation, although because of the country of origin principle in the EU’s Audiovisual Media Services Directive, many of its provisions will not apply to video sharing platforms domiciled in a different EU jurisdiction. Finally, online games would be covered in the future if the third amendment were adopted.
What harmful content items are targeted?
|🇫🇷||The law provides an exhaustive list of so-called hate speech content. This list refers to offences under listed sections of the French Criminal Code and of the French Law on the Freedom of the Press of 29 July 1881, in other words, offences:
|🇩🇪||The Network Enforcement Act covers only content that violates specifically listed sections of the German Criminal Code, but the listed sections include a wide array of sometimes very specific offences:
The proposed amendments do not extend this scope. The legislative amendment adding video sharing platforms in fact restricts the scope of offences further for these specific platforms (essentially to hate speech, glorifying violence and child sexual abuse content).
What will the legal frameworks consist of?
|🇫🇷||Until now, platform providers that could claim hosting provider status were only obliged to promptly remove any obviously illegal content notified to them. They were, however, expected to play a more active role regarding specific offences: condoning crimes against humanity, incitement to racial hatred and child pornography.
As a result of the law – subject to its validation by the Constitutional Council – platform operators will now be under more onerous obligations when it comes to any hate speech content as listed above, with the aim of:
As with the French law tackling fake news, further guidance is expected from the French regulatory authority regarding the diligence expected of platforms.
In addition, the law aims to facilitate access to judicial and criminal authorities, in particular through the creation of a specialised digital prosecutor's office.
|🇩🇪||Platform operators are already obliged to implement an easy-to-access reporting tool and a robust procedure for handling reports, as well as measures to protect the staff who handle reports and are exposed to potentially disturbing content. For instance, social networks must
Platform operators – even those with fewer than two million registered users – are also obliged to designate a representative in Germany that is authorised to receive official documents from authorities and courts.
The current draft legislation would add several crucial requirements, in particular:
If the third, most controversial amendment were adopted, it would also require platforms to collect the full name, address and date of birth of each registered user, and to verify this information by checking government-issued photo ID, a qualified electronic signature, or similarly robust means.
Who will be in charge of monitoring the enforcement of the legal frameworks?
|🇫🇷||The platforms’ accountability mechanism will be monitored by the French broadcasting regulatory authority (CSA), which is granted important powers (see below).|
|🇩🇪||The Network Enforcement Act is enforced by the Federal Office of Justice.|
What will be the potential sanctions?
|🇫🇷||Non-compliance with the law carries the following fines:
|🇩🇪||The Federal Office of Justice can order social networks to modify their complaints handling procedure if it deems the procedure insufficient. Non-compliance with the Network Enforcement Act also carries administrative fines of up to 50 million euros. It is important to note that these fines are not issued for individual failures to remove a piece of content that should have been removed, but only for structural failures in setting up the complaints handling system as such. However, (multiple) failures to remove content may be an indication that the system is not properly set up (for example, the content moderators may not be properly trained and instructed).|
What safeguards are included to prevent infringing freedom of speech?
|🇫🇷||The legislator ensured that the authority to determine the balance between freedom of speech and the fight against hate speech remains with the judiciary, although many criticise the judiciary's weak position in this new legal framework. In addition, any person who abusively reports content to platform operators as unlawful within the meaning of the law, in order to obtain its removal or disabling, risks a penalty of up to one year's imprisonment and a fine of 15,000 euros.|
|🇩🇪||One of the proposed amendments includes a counter-notice procedure (see above), which together with the new obligation to provide reasons for blocking decisions is designed to safeguard users’ freedom of speech against “over-blocking”.|
When will the laws come into force?
|🇫🇷||Some parts of the new legal framework should enter into force as early as 1 July 2020, while others apply from 1 January 2021. However, this schedule might be disrupted: on 18 May 2020, French Senators petitioned the Constitutional Council to rule on the conformity of the law before its enactment.|
|🇩🇪||The first two amendments are currently being debated in the federal parliament. While no date for a final vote is known yet, the amendments have the support of the government and are therefore likely to pass. The third proposed amendment comes from two of the federal states, but so far has failed to garner sufficient support from the other federal states to be submitted to debate in parliament. It is therefore less likely to be adopted in its current form.|
What about at European level?
After launching a Code of Conduct in May 2016 together with four major operators (Facebook, Microsoft, Twitter and YouTube) in an effort to respond to the proliferation of hate speech online, the European Commission is currently thrashing out the details of the Digital Services Act. The Digital Services Act, which is likely to be put forward in Q4 2020, is a new framework that will update the eCommerce Directive and establish new rules governing the internet, including rules aimed at tackling illegal content online and establishing an EU definition of hate speech.
One of the challenges will therefore be how these national initiatives fit into this new European regulatory framework, the aim of which is precisely to harmonise, as far as possible, the different regulatory systems for digital platforms within the EU.