Social media companies could face fines of billions of pounds for failing in their duty of care to children and adults.
Under the UK’s long-awaited and long-debated Online Harms Bill, strict guidelines are being introduced covering harmful material such as child sexual abuse imagery, terrorist content, anti-vaccination conspiracy theories and more.
Regulator Ofcom will have the power to fine platforms up to £18 million or 10 per cent of global annual turnover, whichever is the higher. In the case of Facebook, which reported revenue of $71 billion last year, that could mean a fine of more than $7 billion.
And while the government has stopped short of introducing criminal penalties for individual senior managers, it says it could still do so if organizations fail to abide properly by the new rules.
The legislation will apply to all organizations that host user-generated content accessible by users in the UK, or that allow users to privately or publicly interact on the internet, such as search engines, social media or messaging platforms, dating apps and video games that have chat services.
The bill is being broadly welcomed.
“The long-awaited online harms bill is a once-in-a-generation opportunity to tackle these dangerous elements of the internet which have real-world consequences,” says shadow secretary of state for digital, culture, media and sport Jo Stevens.
“We need the government to take this seriously whether it is hate speech, disinformation or self-harm content. The internet should be a safe place for everyone and this legislation must be ambitious in its scope.”
And, says Anne Longfield, children’s commissioner for England, “The signs are that this regulation will have teeth, including strong sanctions for companies found to be in breach of their duties, and a requirement on messaging apps to use technology to identify child abuse and exploitation material when directed to by the regulator.”
However, she added, “Much will rest on the detail behind these announcements, which we will be looking at closely.”
And some believe it does not go far enough. Online scams aren’t included, for example, and many campaigners, including the NSPCC, have argued for criminal sanctions against managers.
Meanwhile, warns Will Moy, chief executive of Full Fact, “Parliament needs to scrutinise these proposed laws very carefully. They must be balanced with the protection of people’s freedom of speech, and tackling misinformation shouldn’t just be handed over to a government-appointed regulator.”
The government says the bill will be presented to parliament next year, though it may not pass into law until 2022.