
Online Harms White Paper

By Malcolm Hutty | Content Issues

The UK government has today published its much-anticipated policy paper, the “Online Harms White Paper”. This proposes a regulator for Internet content (whether a new regulator, or a repurposed existing one), introducing sweeping new regulation of Internet companies and action against a broad range of Internet content. The centrepiece obligation for companies is a new “duty of care” to protect their users from broadly-described “harm”, the meaning of which will be set out in a series of Codes of Practice developed by the new regulator.

The government describes this as “novel and ambitious” and “world-leading”.

Introducing the White Paper, Secretary of State for Digital Jeremy Wright said:

The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.

Home Secretary Sajid Javid added:

The tech giants and social media companies have a moral duty to protect the young people they profit from.

Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.

That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise.

A twelve-week consultation on the White Paper begins today. A synopsis of some of the stand-out elements follows.

Companies covered

Companies in scope will be those “that allow users to share or discover user-generated content or interact with each other online”, regardless of size.

“A wide variety of organisations provide these services to users. […] The scope will include companies from a range of sectors, including social media companies, public discussion forums, retailers that allow users to review products online, along with non-profit organisations, file sharing sites and cloud hosting providers.”

The regulator will adopt a “risk-based approach”, and “the regulator’s initial focus will be on those companies that pose the biggest and clearest risk of harm to users, either because of the scale of the platforms or because of known issues with serious harms”.

Funding

“The regulator will be funded by industry in the medium term, and the government is exploring options such as fees, charges or a levy to put
it on a sustainable footing.”

Harms covered

  • All illegal content; plus
  • Content without clear legal definition, including
    • cyberbullying and trolling
    • extremist content and activity
    • coercive behaviour and intimidation
    • promotion of self-harm and FGM
  • Children’s access
    • to pornography
    • to dating sites
    • to other inappropriate content
    • to social media (under 13)
  • A broad range of legal content including
    • disinformation
    • manipulation
    • abuse of public figures
  • Emerging challenges such as
    • Excessive screen time
    • “Designed in addiction”

The list is stated to be neither exhaustive nor static.

Out of scope: GDPR, hacking, “dark web”.

 

Private communications

The extent to which private communications will fall within the regulator’s ambit isn’t entirely clear. The White Paper suggests one-to-one communications will be out of scope, but “a WhatsApp group with hundreds of members” would be in scope. “Small private channels” used to groom children for sex, or to encourage terrorism, would be in scope.

There is a consultation question on the definition of private communications.

Technical measures

“Companies should invest in the development of safety technologies to reduce the burden on users to stay safe online.”

 

Regulation

  • “The government will establish a new statutory duty of care to make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services.”
  • “Companies must fulfil their new legal duties. The regulator will set out how to do this in codes of practice.”
  • “Companies will still need to be compliant with the overarching duty of care even where a specific code does not exist, for example assessing and responding to the risk associated with emerging harms or technology.”
  • There will be
    • a complaints mechanism for users to complain about content
    • (as a consultation option) a right for organisations that aim to protect users to make “supercomplaints”
    • strict time limits for companies to act
    • redress mechanisms for users
  • “The scope to use the regulator’s findings in any claim against a company in the courts on grounds of negligence or breach of contract. And, if the regulator has found a breach of the statutory duty of care, that decision and the evidence that has led to it will be available to the individual to use in any private legal action.”
  • “To inform its reports and to guide its regulatory action, the regulator will have the power to require annual reports from companies” covering a broad and detailed range of aspects of what they do in this area, as well as a power for the regulator to make specific enquiries and compel the disclosure of research about harms.

The regulator will be required to follow the following regulatory principles:

  • A duty for the regulator to pay “due regard” to innovation and to protect users’ rights online, including the fundamental right to freedom of expression.
  • A duty for the regulator to respect proportionality.
  • “A risk-based approach”

Enforcement and sanctions

  • “Substantial fines” for non-compliance
  • ISP blocking could be used to prevent user access to foreign online services that decline to obey the new UK regulation
  • Another option for enforcement against non-compliant foreign entities would be disruption of their business activities (e.g. by getting payment service providers to withhold service, removing them from search engine results, or preventing links on social media)
  • The White Paper includes a consultation question on personal liability for senior managers of businesses that fail to discharge their duty of care, including possibly civil fines or even personal criminal liability
  • The government is also consulting on imposing a requirement on companies that offer online services to nominate a representative person who is
    within the UK or the EEA, and so reachable for enforcement action and personal liability

Fulfilling a duty of care

Essentially, the actions a company would have to undertake to be considered to have discharged its duty of care will be determined through Codes of Practice established by the regulator. The White Paper sets out a range of topics a Code of Practice should cover, broken out by type of harm (terrorism, CSAM, disinformation, harassment, etc.).

Malaysian penalty for “fake news”: 10 years in jail

By Malcolm Hutty | Content Issues, International, News

The Malaysian government has brought forward a bill in Parliament that sets the penalty for publishing so-called “fake news” online at up to ten years in jail plus a fine of 500,000 MYR (about £90,000), Reuters reports.

[Photo: Kuala Lumpur, capital of Malaysia]

“The proposed Act seeks to safeguard the public against the proliferation of fake news whilst ensuring the right to freedom of speech and expression under the Federal Constitution is respected,” the government said in the bill.

The bill gives a broad definition to fake news, covering “news, information, data and reports which is or are wholly or partly false”. It seeks to apply the law extra-territorially, to anything published on the Internet provided Malaysia or Malaysians are affected by the article.

“Fake news” has become an increasingly popular target of political attack since Donald Trump popularised the term in his battles with CNN and other major broadcasters. In the UK, a Parliamentary Select Committee recently held its first-ever hearings in Washington DC on the subject, summoning social media platforms to be lambasted for failing to suppress alleged “fake news”. The Prime Minister’s office established a new unit to counter fake news in January.

So far, however, no UK government Minister has suggested jailing people for writing something on the Internet that isn’t right.

Council of Europe publishes guidelines for Internet intermediaries

By Malcolm Hutty | International, News

The Council of Europe has published a Recommendation to Member States on the roles and responsibilities of Internet intermediaries. The Recommendation declares that access to the Internet is a precondition for the ability effectively to exercise fundamental human rights, and seeks to protect users by calling for greater transparency, fairness and due process when interfering with content. The Recommendation also calls for greater respect for user privacy.

The Recommendation’s key provisions aimed at governments include:

  • Public authorities should only make “requests, demands or other actions addressed to internet intermediaries that interfere with human rights and fundamental freedoms” when prescribed by law. This means they should avoid asking intermediaries to remove content under their terms of service, or to make their terms of service more restrictive.
  • Legislation giving powers to public authorities to interfere with Internet content should clearly define the scope of those powers and available discretion, to protect against arbitrary application.
  • When internet intermediaries restrict access to third-party content based on a State order, State authorities should ensure that effective redress mechanisms are made available and adhere to applicable procedural safeguards.
  • When intermediaries remove content based on their own terms and conditions of service, this should not be considered a form of control that makes them liable for the third-party content for which they provide access.
  • Member States should consider introducing laws to prevent vexatious lawsuits designed to suppress users’ free expression, whether by targeting the user or the intermediary. In the US, these are known as “anti-SLAPP laws”.

The Recommendation’s provisions aimed at service providers include:

  • A “plain language” requirement for terms of service.
  • A call to include outside stakeholders in the process of drafting terms of service.
  • Transparency about how and when restrictions on content are applied, including detailed information on the use of algorithmic and automated means.
  • Transparency reporting.
  • Effective remedies and complaints mechanisms for users who wish to dispute restriction of their service or content: “all remedies should allow for an impartial and independent review of the alleged violation [of users’ rights to expression]. These should – depending on the violation in question – result in inquiry, explanation, reply, correction, apology, deletion, reconnection or compensation”.

The Council of Europe is an intergovernmental body entirely separate from the European Union. With 47 member states, it seeks to promote democracy, human rights and the rule of law, including by monitoring adherence to the rulings of the European Court of Human Rights. Its Recommendations are not legally binding on Member States, but are very influential in the development of national policy and of the policy and law of the European Union.