Public Affairs Policy Matters: Content Regulation

By Malcolm Hutty, LINX Head of Public Affairs

Introduction

In a series of three blogs, I want to provide an overview of some of the most significant policy matters that the LINX Public Affairs team is engaging with at the moment. These broadly fall into the categories of content regulation, security and international affairs. In this article we will focus on content regulation.

Content Regulation

As online platforms have become increasingly central to the online experience, governments have shown a growing appetite to regulate the content that they host, arguing that this is necessary to protect users from harmful content. The results can be seen in two significant legislative proposals, one at UK level and one at EU level, each of which could have far-reaching consequences for online services: the UK Online Harms White Paper and the EU Digital Services Act.

Online Harms White Paper

In April 2019, the UK Government published the Online Harms White Paper, which proposed a new regulatory framework that would impose a “duty of care” on online services to “take reasonable steps to keep their users safe and tackle illegal and harmful content on their services.”

The scope of regulation appears to be wide. The proposed duty of care would apply to any service that either (1) facilitates the hosting, sharing, or discovery of user-generated content; or (2) facilitates online interactions between users. This definition appears to include not only providers of online platforms but also any business operating its own website that includes functionality allowing people to post content or interact with each other.

Unsurprisingly, one of the major concerns that has been raised in response to these proposals has been that imposing such a duty of care risks infringing the right to freedom of expression. The Government has responded to these concerns by indicating that it will distinguish between content that is illegal and content that may be considered harmful but is nonetheless legal. Under the duty of care, operators will be required to expeditiously remove illegal content from their services. They will also be expected to explicitly state what content and behaviours are acceptable on their site (e.g. through their terms and conditions) and then to enforce this consistently.

The Government has proposed that an independent regulatory body should oversee the regulatory framework and has indicated that this role will be assigned to Ofcom, which is both the telecommunications sector regulator and the content regulator for most content broadcast on TV and radio in the UK. Ofcom will be responsible for overseeing operators’ overall compliance with the duty of care rather than adjudicating individual complaints. This means that it will not determine whether a specific piece of content should or should not be removed, but will instead set standards for operators’ content regulation processes and rule on whether those standards have been met. The Government has not yet set out the sanctioning powers that will be available to Ofcom, but it has suggested these will include the power to issue substantial fines and may extend to imposing liability on senior managers and, in certain circumstances, requiring companies to improve their systems or even requiring ISPs to block non-compliant services.

The Government has also indicated that operators in scope of the legislation will need to implement appropriate age verification technologies to prevent children from being exposed to inappropriate content. However, it is not clear whether this would apply to all services or just those likely to be accessed by children.

Finally, operators may also be required to adopt certain transparency measures. Ofcom will have the power to require companies to submit annual transparency reports explaining the prevalence of harmful content on their platforms and the measures they are taking to address it.

The Government has stated that legislation will be introduced “as soon as possible”; a further announcement is expected early in 2021.

Digital Services Act

Significant legislation is also being proposed at the European level, with a fundamental review of the core legal framework established by the e-Commerce Directive in 2000. The package of legislation resulting from that review is currently known as the “Digital Services Act” and is likely to comprise at least two separate pieces of legislation: one covering controls on content, the other dealing with economic effects. Under the latter, the Commission is likely to propose new ex ante rules covering large online platforms, which it believes are acting as gatekeepers and creating “market imbalances”. The Commission is also exploring options to set up a centralised EU tech regulator with the power to enforce the legislation.

The drafting of the legislation is still at an early stage, and the Commission is currently running a public consultation, open until 8 September 2020, to help it identify the specific issues that may require intervention. The Commission is then expected to release its proposals for the Digital Services Act package at the end of the year.