The question of how to regulate the proliferation of harmful online content and create a safer online environment is one that tends to polarise opinion. Some want online content-sharing platforms to be regulated as if they were publishers of such content, while others believe that regulating internet platforms would stifle free speech and innovation.
Although some form of consensus is starting to emerge in the middle, at a global level the law remains fragmented, behind the curve and, in some cases, unfit for purpose. In the face of huge media, political and consumer pressure, collaboration between the tech industry and regulators has already begun, based on codes of conduct and what some are calling a “duty of care” imposed on the gatekeepers of the internet to keep users safe. But this is just the start, as lawmakers are making bigger strides to try to catch up.
The proposed EU Digital Services Act will be key. It is likely to form the central framework from which a global model can emerge for how online platforms are expected to deal with illegal user content. While national regimes such as the UK’s proposed Online Safety Bill and Germany’s NetzDG law may go beyond the EU framework, global internet platforms will need to establish a consistent approach that can withstand regional variations.