
Online harms: The new legal framework for addressing 'hate speech' in France and in Germany

Published on 2nd Jun 2020

While the European Commission continues to progress a wide-ranging new legal framework for online content, a number of Member States are taking action at a national level to combat specific harms, such as hate speech, in order to increase online safety and step up the fight against such harmful content.


We recently reported that the UK government was pressing ahead with its plans for an Online Harms Bill, including codes of conduct on online terrorism and child sexual exploitation, albeit on a slower timetable than originally envisaged. This Insight summarises how France and Germany intend to combat hate speech.

What laws are being proposed?

🇫🇷 On 13 May 2020, despite strong disagreements between the two French parliamentary chambers, the National Assembly adopted a law to tackle the spread of hate speech on the Internet by involving platform operators, search engine operators, Internet users and Internet service providers. The goal is to increase platform and search engine operators' accountability, under the scrutiny of an independent administrative authority with strong enforcement powers. The law is currently under review by the Constitutional Council.
🇩🇪 In an effort to tackle hate speech on social networks, Germany introduced its Network Enforcement Act in 2017, obliging large social networks to put in place robust reporting systems for certain illegal content, block or delete such content when reported, publish regular reports on their efforts to keep illegal content off their platforms, and designate a local representative authorised to receive administrative and court documents. Two separate amendments to this law are currently in the legislative process and likely to be adopted: one to oblige social networks to report illegal content to the authorities, and one to enhance platforms' report-handling and accountability obligations. A third legislative proposal, with unclear chances of success, would expand the scope of application to online games and require all covered platforms to identify their registered users.


Who will it apply to?

🇫🇷 The new legal framework applies not only to social networks but to any platform that offers the ability to comment, interact or share content and whose activity in France exceeds thresholds to be set by decree of the French Council of State. Laetitia Avia, the member of Parliament who submitted the draft law, expressed the wish to target platforms with more than 2 million unique visitors per month. In addition, despite a divergence between the two parliamentary chambers on whether to cover search engine operators, the latter are required to comply with this new legal framework.
🇩🇪

The Network Enforcement Act currently applies to general-purpose for-profit platforms that enable users to share content. Special-purpose platforms (such as professional networks, online shops and online games) and journalistic offerings are not covered, even if they too enable users to post their own content. Services for one-to-one communication, such as messenger apps, are also exempt. Social networks with fewer than two million registered users are exempt from most but not all obligations under the Network Enforcement Act.

The proposed amendments would expressly add video sharing platforms to the scope of the legislation, although, because of the country of origin principle in the EU's Audiovisual Media Services Directive, many of its provisions will not apply to video sharing platforms domiciled in a different EU jurisdiction. Finally, online games would be covered in the future if the third amendment were adopted.


What harmful content items are targeted?

🇫🇷 The law provides for an exhaustive list of so-called hate speech content. This list refers to offences that violate listed sections of the French Criminal Code and of the French Law on the Freedom of the Press of 29 July 1881, in other words, offences:


  • condoning wilful attacks on life or on the integrity of the person, sexual assaults, war crimes, crimes against humanity or enslavement;
  • provoking hatred or violence or insulting a person or group of persons based on their origin or membership or non-membership of a given ethnic group, nation, race or religion;
  • provoking hatred or violence or insulting a person or group of persons based on their sex, sexual orientation, gender identity or disability;
  • denying the existence of crimes against humanity;
  • constituting sexual harassment;
  • conveying a minor's image or representation of a pornographic nature;
  • directly provoking or publicly condoning acts of terrorism;
  • conveying a message of a violent nature, inciting terrorism, of a pornographic nature, or of such a nature as to undermine human dignity or to incite minors to engage in dangerous games.
🇩🇪 The Network Enforcement Act covers only content that violates specifically listed sections of the German Criminal Code, but the listed sections include a wide array of sometimes very specific offences:


  • using propaganda or symbols of anti-constitutional organizations (such as swastikas);
  • publishing instructions to commit major terrorist acts;
  • publishing treasonous falsifications (this could cover “fake news” relating to matters of state);
  • threatening the commission of (certain serious) criminal offences, inciting others to commit criminal offences, or offering a reward or stating public approval for (certain serious) criminal offences;
  • forming criminal or terrorist organizations in Germany or abroad;
  • using hate speech against a national, racial, religious or ethnic group or other group within the population;
  • glorifying violence;
  • owning and/or making available child sexual abuse content (“child pornography”);
  • libel and slander;
  • insulting religious groups or convictions;
  • publishing images taken by invading a third person's privacy;
  • forging or falsifying digital data that can be used as evidence.

The proposed amendments do not extend this scope. The legislative amendment adding video sharing platforms in fact restricts the scope of offences further for these specific platforms (essentially to hate speech, glorifying violence and child sexual abuse content).


What will the legal frameworks consist of?

🇫🇷

Until now, platform providers that could claim hosting provider status were obliged only to promptly remove any obviously illegal content notified to them. They were, however, expected to play a more active role regarding certain specific offences: condoning crimes against humanity, incitement to racial hatred and child pornography.

As a result of the law – subject to its validation by the Constitutional Council – platform operators will now be under more onerous obligations when it comes to any hate speech content as listed above, with the aim of:

  • ensuring users have the necessary information, including how to deal with hate speech content, the outcome of notifications of hate speech content, and the penalties incurred in the event of improper notifications, with specific information duties towards minors;
  • simplifying the mechanisms for reporting hate speech content, including an obligation to set up a single, accessible and easy-to-use tool for all users on French territory;
  • reducing the processing time for reported hate speech content. Platform operators will have either 24 hours or 1 hour (for terrorist and child pornography content) from notification to remove or disable obviously hateful content. This will require operators to implement procedures, as well as human and - where necessary - technological means, to ensure that notifications are dealt with promptly and properly;
  • ensuring remedy mechanisms are in place for the users who shared the reported content and the users who reported the content;
  • increasing operators' cooperation with the French authorities. Operators will have an obligation to temporarily store reported hate speech content and to make it available to the French judicial authority for the purposes of researching, identifying and prosecuting criminal offences. Operators will also have reporting duties towards the French authorities and an obligation to appoint a France-based individual as a point of contact responsible, amongst other things, for receiving requests from the French authorities.

As with the French law to tackle fake news, further guidance is expected from the French regulatory authority regarding the diligence expected of platforms.

In addition, the law aims to facilitate access to judicial and criminal authorities, in particular through the creation of a specialised digital prosecutor's office.

🇩🇪 Platform operators are already obliged to implement an easy-to-access reporting tool and a robust procedure for handling these reports, as well as measures to protect the staff who handle reports and are exposed to potentially disturbing reported content. For instance, social networks must:


  • deal with reports at short notice, deleting obviously illegal content within 24 hours and resolving other reports generally within 7 days;
  • store copies of deleted content for evidence purposes for 10 weeks;
  • inform both the reporting user and the user whose content is affected by a report of the outcome;
  • monitor the report-handling procedure monthly to correct any deficiencies, and offer training and debriefing for content moderators at least twice a year;
  • publish a report – in German – twice a year documenting the number of reports, resolution times, and other statistics and information on the report-handling procedure.

Platform operators – even those with fewer than two million registered users – are also obliged to designate a representative in Germany that is authorised to receive official documents from authorities and courts.

The current draft legislation would add several crucial requirements, in particular:

  • Easier access: The reporting system must be easily accessible directly from each individual piece of content.
  • Obligation to justify: Any decision taken on a reported piece of content would need to be communicated to the user who reported it and the user who originally posted it, along with a justification for the decision.
  • Obligation to inform authorities: Where the platform provider has any reason to believe a reported piece of content violates one of the covered provisions of the criminal code, it would have to report this to a federal authority.
  • Counter-notices: Users whose content is reported would need to have access to a “transparent and effective” counter-notice procedure; upon such counter-notice, a person not involved in the initial decision to remove content would have to review this decision again.
  • Enhanced reporting obligations: The reports would need to include more granular statistics and further details on the procedure, in particular regarding any machine learning algorithms used to assist content moderation.

If the third, most controversial amendment were adopted, it would also require platforms to collect the full name, address and date of birth of each registered user, and to verify this information by checking government-issued photo ID, a qualified electronic signature, or similarly robust means.


Who will be in charge of monitoring the enforcement of the legal frameworks?

🇫🇷 The platforms’ accountability mechanism will be monitored by the French broadcasting regulatory authority (CSA), which is granted important powers (see below).
🇩🇪 The Network Enforcement Act is enforced by the Federal Office of Justice.


What will be the potential sanctions?

🇫🇷 Non-compliance with the law carries the following fines:

  • Criminal fines: a fine of up to 250,000 Euros (for individuals) or 1,250,000 Euros (for legal entities) where the operator has failed to remove or disable hate speech content within 24 hours (or, where applicable, 1 hour) of notification. This highly controversial offence has been maintained despite firm opposition from the French Senate, which considers it detrimental to freedom of speech (risking "over-censorship" or "over-blocking").
  • Administrative fines: if the operator does not comply with a formal notice from the CSA requiring performance of the new due diligence obligations, the regulator has the power to impose pecuniary fines of up to 20 million Euros or 4% of the operator's global turnover in the previous financial year (whichever is greater). The CSA can also disclose the formal notices and fines it has issued ("name and shame").
🇩🇪 The Federal Office of Justice can order social networks to modify their complaints handling procedure if it deems the procedure insufficient. Non-compliance with the Network Enforcement Act also carries administrative fines of up to 50 million Euros. It is important to note that these fines are not issued for individual failures to remove a piece of content that should have been removed, but only for structural failures in setting up the complaints handling system as such. However, (multiple) failures to remove content may be an indication that the system is not properly set up (for example, the content moderators may not be properly trained and instructed).


What safeguards are included to prevent infringements of freedom of speech?

🇫🇷 The legislator ensured that the authority to determine the balance between freedom of speech and the fight against hate speech remains with the judiciary, although many criticise the judiciary's weak position in this new legal framework. That said, any person who abusively reports content to platform operators as unlawful within the meaning of the law, in order to obtain its removal or disablement, risks up to one year's imprisonment and a fine of 15,000 Euros.
🇩🇪 One of the proposed amendments includes a counter-notice procedure (see above), which together with the new obligation to provide reasons for blocking decisions is designed to safeguard users’ freedom of speech against “over-blocking”.


When?

🇫🇷 Some parts of the new legal framework should enter into force as early as 1 July 2020, while others apply from 1 January 2021. However, this schedule might be disrupted: on 18 May 2020, French Senators petitioned the Constitutional Council to rule on the conformity of the law before its enactment.
🇩🇪 The first two amendments are currently being debated in the federal parliament. While no date for a final vote is known yet, the amendments have the support of the government and are therefore likely to pass. The third proposed amendment comes from two of the federal states, but so far has failed to garner sufficient support from the other federal states to be submitted to debate in parliament. It is therefore less likely to be adopted in its current form.

What about at European level?

After launching a Code of Conduct in May 2016 together with four major operators (Facebook, Microsoft, Twitter and YouTube) in an effort to respond to the proliferation of hate speech online, the European Commission is currently thrashing out the details of the Digital Services Act. The Digital Services Act, which is likely to be put forward in Q4 2020, is a new framework that will update the eCommerce Directive and establish new rules governing the internet, including rules aimed at tackling illegal content online and establishing an EU definition of hate speech.

One of the challenges will therefore be how these national initiatives can fit into this new European regulatory framework, the aim of which is precisely to harmonise, as far as possible, the different regulatory systems for digital platforms within the EU.


* This article is current as of the date of its publication and does not necessarily reflect the present state of the law or relevant regulation.
