Online platforms must begin putting measures in place to protect people in the UK from criminal activity, under the Online Safety Act, says Ofcom, the UK watchdog tasked under the 2023 Act with setting and enforcing the rules.
The watchdog says that platforms must now take appropriate measures to remove illegal material quickly once they become aware of it, and to reduce the risk of ‘priority’ criminal content appearing. Ofcom says it will be assessing platforms’ compliance and launching what it calls ‘targeted enforcement action’. Platforms had until March 16 to carry out a ‘suitable and sufficient illegal harms risk assessment’.
What’s covered
Services in scope of the Online Safety Act include search engines and ‘user-to-user’ services: social media or video-sharing platforms, messaging, gaming and dating apps, forums and file-sharing sites. The Act lists over 130 ‘priority offences’, and tech firms must assess and mitigate the risk of these occurring on their platforms. Ofcom splits the priority offences into 17 categories:
Terrorism
Harassment, stalking, threats and abuse offences
Coercive and controlling behaviour
Hate offences
Intimate image abuse
Extreme pornography
Child sexual exploitation and abuse
Sexual exploitation of adults
Unlawful immigration
Human trafficking
Fraud and financial offences
Proceeds of crime
Assisting or encouraging suicide
Drugs and psychoactive substances
Weapons offences (knives, firearms, and other weapons)
Foreign interference
Animal welfare
Suzanne Cater, Enforcement Director at Ofcom, says: “Child sexual abuse is utterly sickening and file storage and sharing services are too often used to share this horrific material. Ofcom’s first priority is to make sure that sites and apps take the necessary steps to stop it being hosted or shared. Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action.”
The consumer protection and product testing body Which? has complained in general of a ‘flood of scams‘ online. In particular, it has complained of a lack of proper identity checks on property owners listing holiday homes on Booking.com, meaning the platform can carry listings for accommodation that does not exist, which people nevertheless pay for. Rocio Concha, Which? Director of Policy and Advocacy, said: “It’s really worrying that so many scams are slipping through the net on Booking.com.
“The illegal harms codes coming into effect on March 17 will require platforms to do more to prevent user-generated fraud but there are several simple changes that Booking.com could make now to tighten its security and close loopholes on its site which are being exploited by scammers.
“Ofcom should take note of these findings as the codes come into force. If these issues persist, Ofcom must make use of its new powers and not hesitate to take action against Booking.com and other platforms who are failing to prevent fraudsters from targeting and scamming their customers.”