What regulation is on the horizon for the metaverse?
Published on 26th Apr 2022
A live concert staged in the metaverse raises broad legal and regulatory issues
How is the regulation of the metaverse shaping up? It's a question that was addressed at Osborne Clarke's recent Metaverse Week event, where a case study of a live concert in the metaverse helped spotlight the new and upcoming legislation and regulations that are set to have implications for both creators and users within the metaverse. These ramifications are broad and encompass broadcast and media regulation, intellectual property (IP) issues, artificial intelligence (AI), data and privacy regulation, and content and interactions.
Media and IP
If a live concert were planned in the metaverse, what would the issues be from a media regulation and IP perspective? The first issue to consider is rights clearances. If you would like to use someone else's music in the metaverse, you may need to prepare bespoke licensing agreements that align with your terms of service (and note that licences are typically granted on a territorial basis, so consider expressly adding the metaverse to the licence for clarity).
Censorship and content standards are other key areas of focus. Legal compliance teams will need to know the type of content that certain governments and countries censor or restrict and will need to consider what artists can or cannot include in their performances.
Attention also needs to be paid to video and media regulation. There is currently no dedicated metaverse law but, as with any new development, a network of existing laws can already apply. In particular, since the metaverse is an audiovisual format, it is worth thinking about how content might interact with existing broadcast and audiovisual regulation. The cornerstone of audiovisual regulation in the EU is the Audiovisual Media Services Directive, which was updated in 2018 and whose national implementations are still ongoing. In addition, metaverse platform providers need to consider whether their platform qualifies as a video-sharing platform service.
Data and AI
What data and AI regulatory frameworks in the UK and EU could potentially apply to a concert in the metaverse? Data in the metaverse will generally be covered by the UK and EU General Data Protection Regulations (GDPR); however, there is very little in the GDPR that is specific to AI. The GDPR will still have big implications for all metaverse participants, since far more data on participants can be collected inside a metaverse than at a physical concert.
Data privacy issues in the metaverse will be an extension of the regulations already applied to the internet: providers need to make sure that protocols are in place, that data is collected fairly and transparently, that privacy notices are provided, and that consent is obtained for tracking technologies.
Platform providers, people selling virtual art, non-fungible tokens, virtual merchandise and goods, and payment providers sharing personal data need to make sure they have contracts in place that deal properly with data privacy issues. They need to make it a compliance exercise and be aware of possible risks involved.
The EU's proposed Artificial Intelligence Act sets different levels of regulation depending on the perceived risks posed by types of AI. Some uses of AI relevant to a metaverse may well fall into the unacceptable or high-risk, high-regulation brackets if they amount to subliminal, manipulative or exploitative techniques that cause harm, or involve automated facial recognition or other biometrics.
Content and interaction
The growth of content and interaction within the metaverse will bring more privacy challenges. Traditionally, online harms were the focus of media regulators, but privacy regulators have now also turned their attention to this field.
The protection of minors as a privacy issue is a global challenge and a number of authorities have already issued guidelines. The UK regulator, the Information Commissioner's Office, has published the Age Appropriate Design Code, which sets 15 standards that online services need to follow.
Digital content and private devices have changed the way we consume content and brought new methods of protecting minors on digital devices. These include putting clear and visible labels on content, applying technical filters, and having video streaming services show ratings before users press play. The games industry has had to put in place codes of conduct, moderation policies and procedures for sanctioning disruptive behaviour.
In the metaverse, media consumption occurs in a shared space, which brings new challenges, in particular around the much broader range of content with which people may interact.
Toxic behaviour: evolving responses
The response to toxic users and disruptive behaviour in the metaverse is still evolving. The development of flagging mechanisms, codes of conduct and ways to generally deal with problematic behaviour will pose new and interesting challenges. As always with new developments, it's important to monitor how new legislation and regulations are addressing these challenges.
In general, there are three different types of online content. There is content that is generally banned because it violates criminal laws. There is content that has age restrictions and requirements. And there is content that is suitable for all audiences.
At the moment, regulatory focus lies primarily with service providers; however, regulations are shifting and addressing other actors. For example, the French Senate has adopted a draft bill to strengthen parental control over the internet, which would require operating systems to install parental controls on devices by default. Similar legislation is being discussed in Germany.
Online harms are an area under growing scrutiny from media regulators, data privacy authorities, law enforcement agencies, and consumer protection bodies. Data protection authorities are increasingly focused on the protection of minors, and this focus will intensify as users produce more data. It is leading to national legislation in the EU and worldwide that is intended to ensure safety and reduce online harms.
Osborne Clarke comment
What does this all mean for companies operating in the metaverse? These issues have global relevance, and similar regulations are being debated outside the UK and Europe. For example, California legislators are working on a draft Age-Appropriate Design Code Act, and other US initiatives are taking shape around federal regulation of AI and reform of safe harbour protections for user-generated content.
The "meta takeaway" from this is that it makes sense to think about compliance by design and a framework that could address these various issues while remaining flexible enough to accommodate the differences in regional and national legislation.
International Expansion Associate Christina Nordin helped write this insight.