The UK Government has outlined its plans to introduce new rules for tech firms in its Online Harms White Paper, with the aim of making the UK a safer place to be online. The safety of children is at the heart of the measures.
The new regulations will apply to any company in the world that hosts user-generated content online accessible to people in the UK, or that enables UK users to interact privately or publicly with others online.
All companies will need to take appropriate steps to address illegal content and activity such as terrorism and child sexual abuse. They will also be required to assess whether children are likely to access their services and, if so, provide additional protections for them. This could mean using age verification solutions to ensure children cannot access platforms that are unsuitable for them.
The scope includes social media, video sharing and instant messaging platforms, online forums, dating apps, commercial pornography websites, as well as online marketplaces, peer-to-peer services, consumer cloud storage sites and video games which allow online interaction. Search engines will also be subject to the new regulations.
Tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography. The white paper divides online platforms into two categories:
- Category 1 services comprise the companies with the largest online presences and high-risk features, such as Facebook, TikTok, Instagram and Twitter. These companies will need to assess the risk of legal content or activity on their services that carries “a reasonably foreseeable risk of causing significant physical or psychological harm to adults”. They will then need to set out in their terms and conditions what types of “legal but harmful” content are acceptable on their platforms, and enforce those terms transparently and consistently. All companies will need mechanisms so that people can easily report harmful content or activity and appeal the takedown of content. Category 1 companies will also be required to publish transparency reports on the steps they are taking to tackle online harms.
- Category 2 services are platforms that host dating services or pornography, along with private messaging apps. Fewer than three per cent of UK businesses will fall within the scope of the legislation, and the vast majority of companies in scope will be Category 2 services.
Ofcom is now confirmed as the regulator, with the power to fine companies that fail in their duty of care. Fines can reach £18 million or ten per cent of annual global turnover, whichever is higher. Ofcom will also have the power to block non-compliant services from being accessed in the UK, and the government will reserve the power to hold senior managers liable.
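To make the “whichever is higher” rule concrete, here is a short Python sketch of the fine ceiling (the £500m turnover figure is an illustrative number of our own, not from the white paper):

```python
def maximum_fine(annual_global_turnover_gbp: float) -> float:
    """Maximum penalty under the proposed rules: the greater of
    a fixed £18 million or ten per cent of annual global turnover."""
    return max(18_000_000, 0.10 * annual_global_turnover_gbp)

# A firm with £500m in global turnover faces a ceiling of £50m,
# since ten per cent of turnover exceeds the £18m floor.
print(f"£{maximum_fine(500_000_000):,.0f}")  # £50,000,000
```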
Beyond simply complying with the new laws proposed by the UK, AgeGO recommends that commercial pornography sites and adult dating platforms begin using an age verification solution to ensure that users of their platforms are 18 or over. This acts as an additional safety measure, not only stopping underage users from accessing adult content but also helping these businesses demonstrate compliance.
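As a rough illustration of what such a gate might look like on the server side, the Python sketch below is hypothetical: the `AgeVerificationClient`, `VerificationResult` and `allow_adult_content` names are our own placeholders rather than AgeGO's actual API. The point is simply the fail-closed control flow, where no confirmed over-18 check means no access.

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    verified: bool    # has the user completed an age check at all?
    is_over_18: bool  # outcome of that check

class AgeVerificationClient:
    """Hypothetical client. A real integration would call the chosen
    provider's API over HTTPS; a placeholder lookup table stands in
    for that call here."""
    def __init__(self, verified_tokens: dict[str, bool]):
        self._verified_tokens = verified_tokens  # session token -> is_over_18

    def check(self, session_token: str) -> VerificationResult:
        if session_token in self._verified_tokens:
            return VerificationResult(True, self._verified_tokens[session_token])
        return VerificationResult(False, False)

def allow_adult_content(client: AgeVerificationClient, session_token: str) -> bool:
    """Deny by default: grant access only when a completed check
    confirms the user is 18 or over."""
    try:
        result = client.check(session_token)
    except Exception:
        return False  # fail closed if the verification service is unreachable
    return result.verified and result.is_over_18

# Demo: one verified adult, one verified minor, one unknown visitor.
client = AgeVerificationClient({"tok-adult": True, "tok-minor": False})
assert allow_adult_content(client, "tok-adult") is True
assert allow_adult_content(client, "tok-minor") is False
assert allow_adult_content(client, "tok-unknown") is False
```

Failing closed is the important design choice here: an outage at the verification provider should lock the door rather than leave it open.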