U.K. Sets Out Law to Prosecute Bosses in Big Tech Crackdown

(Bloomberg) — The U.K. is introducing long-awaited and sweeping legal proposals to force internet companies to remove illegal content from their platforms, giving regulator Ofcom power to impose massive fines and prosecute executives personally for failures to comply.

Although its central aims have been in circulation since 2019 and it has gone through numerous revisions, the Online Safety Bill will be presented to lawmakers in full this Thursday, after a first formal draft was published in May.

The bill is intended to make technology companies more accountable for removing illegal material from their platforms. This includes content that promotes terrorism or suicide, revenge pornography, and child sexual abuse material. Harmful and adult content is also covered by the bill.

Also part of the Online Safety Bill:

  • A requirement for age verification on all websites that host pornography
  • A measure to combat anonymous trolling, meaning abuse and unwanted contact on social media
  • The criminalization of so-called cyber-flashing
  • A requirement for companies to report child sexual abuse material to the U.K.’s National Crime Agency
  • A right for users to appeal to platforms if they believe their posts have been taken down unfairly
  • An exemption for news content and journalism, DCMS said

Companies will be required to show how they proactively tackle the spread of such content. Ofcom will be given the power to enter offices and inspect data and equipment to gather evidence, and it will be a criminal offence to obstruct an investigator, the Department for Digital, Culture, Media and Sport said in a statement.

In addition to senior executives being made liable for how their companies comply with the law, businesses could face fines of as much as 10% of their annual global revenue for breaching regulations.

“The U.K. government is certainly hoping that the online safety bill will set a global standard,” said Linklaters partner Ben Packer in a statement. “As many platforms will want to maintain a broadly consistent user experience globally, that may end up being the case.”

CBI Chief Policy Director Matthew Fell said the legislation was “necessary,” but that it “in its current form raises some red flags, including extending the scope to legal but harmful content. Not only will this deter investment at a time when our country needs it most but will fail to deliver on the aims of this legislation.”

After being presented to lawmakers, the bill is likely to undergo several months of further revisions and votes before gaining royal assent, at which point it becomes law.

Big Tech businesses like Facebook-owner Meta Platforms Inc. are already required to take down illegal content after it has been reported to them. But governments want them to act more quickly. 

The government is giving Ofcom the responsibility to scrutinize and challenge algorithms and systems inside big technology companies that can propagate harm online — rather than asking officials to chase and litigate on individual bad pieces of content.

The biggest platforms and their apps will be classed as “Category One” and also have to clamp down on legal-but-harmful content, the specifics of which will be added by lawmakers later. 

©2022 Bloomberg L.P.
