The UK government has today published its much-anticipated policy paper, “Online Harms”. This proposes a regulator for Internet content (whether
a new regulator, or a repurposed existing one) to introduce sweeping new regulation of Internet companies, aimed at a broad range of Internet content. The centre-piece obligation for companies is a new “duty of care” to protect their users from broadly-described “harm”, the meaning of which will be set out in a series of Codes of Practice developed by the new regulator.
The government describes this as “novel and ambitious” and “world-leading”.
Introducing the White Paper, Secretary of State for Digital Jeremy Wright said:
The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
Home Secretary Sajid Javid added:
The tech giants and social media companies have a moral duty to protect the young people they profit from.
Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.
That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise.
A twelve-week consultation on the White Paper begins today. A synopsis of some of the stand-out elements follows.
Companies in scope will be those “that allow users to share or discover user-generated content or interact with each other online”, regardless of size.
“A wide variety of organisations provide these services to users. […] The scope will include companies from a range of sectors, including social media companies, public discussion forums, retailers that allow users to review products online, along with non-profit organisations, file sharing sites and cloud hosting providers.”
The regulator will adopt a “risk based approach” and “the regulator’s initial focus will be on those companies that pose the biggest and clearest risk of harm to users, either because of the scale of the platforms or because of known issues with serious harms”.
“The regulator will be funded by industry in the medium term, and the government is exploring options such as fees, charges or a levy to put
it on a sustainable footing.”
Harms in scope

- All illegal content; plus
- Content without clear legal definition, including
- cyberbullying and trolling
- extremist content and activity
- coercive behaviour and intimidation
- promotion of self-harm and FGM
- Children’s access
- to pornography
- to dating sites
- to other inappropriate content
- to social media (under 13)
- A broad range of legal content including
- abuse of public figures
- Emerging challenges such as
- Excessive screen time
- “Designed in addiction”
The list is stated to be neither exhaustive nor static.
Out of scope: GDPR, hacking, “dark web”.
The extent to which private communications will fall within the regulator’s ambit isn’t entirely clear. The White Paper suggests one-to-one communications will be out of scope, but “a WhatsApp group with hundreds of members” would be in scope. “Small private channels” used to groom children for sex, or used to encourage terrorism, would be in scope.
There is a consultation question on the definition of private communications.
“Companies should invest in the development of safety technologies to reduce the burden on users to stay safe online.”
- “The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and
tackle harm caused by content or activity on their services.”
- “Companies must fulfil their new legal duties. The regulator will set out how to do this in codes of practice.”
- “Companies will still need to be compliant with the overarching duty of care even where a specific code does not exist, for example assessing and responding to the risk associated with emerging harms or technology.”
- There will be
- a complaints mechanism for users to complain about content
- (as a consultation option) a right for organisations that aim to protect users to make “supercomplaints”
- strict time limits for companies to act
- redress mechanisms for users
- “The scope to use the regulator’s findings in any claim against a company in the courts on grounds of negligence or breach of contract. And, if the regulator has found a breach of the statutory duty of care, that decision and the evidence that has led to it will be available to the individual to use in any private legal action.”
- “To inform its reports and to guide its regulatory action, the regulator will have the power to require annual reports from companies” covering a broad and detailed range of aspects of what they do in this area, as well as a power for the regulator to make specific enquiries and compel the disclosure of research about harms.
The regulator will be required to follow these regulatory principles:
- A duty for the regulator to pay “due regard” to innovation and to protect users’ rights online, including the fundamental right to freedom of expression.
- A duty for the regulator to respect proportionality
- “A risk based approach”
Enforcement and sanctions
- “Substantial fines” for non-compliance
- ISP blocking could be used to prevent user access to foreign online services that decline to comply with the new UK regulation
- Another option for enforcement against non-compliant foreign entities would be disruption of their business activities (e.g. by getting payment service providers to withhold service, removing the service from search results, or preventing links to it on social media)
- The White Paper includes a consultation question on personal liability for senior managers of businesses that fail to discharge their duty of care, including possibly civil fines or even personal criminal liability
- The government is also consulting on requiring companies that offer online services to nominate a representative person based in the UK or the EEA, and so reachable for enforcement action and personal liability
Fulfilling a duty of care
Essentially, the actions a company would have to undertake in order to be considered to have discharged its duty of care are left to be determined through Codes of Practice established by the regulator. The White Paper sets out a range of topics a Code of Practice should cover, broken out by type of harm (terrorism, CSAM, disinformation, harassment, etc.).