The Online Safety Act 2023 aims to protect UK citizens from harmful content found online, but how does it aim to achieve this? Here’s everything you need to know about the act, including how to comply with it if your service is regulated by it.

Sort your terms of service for the Online Safety Act

Terms and Conditions Generator

What is the United Kingdom’s Online Safety Act 2023?

The Online Safety Act 2023 is an act of the Parliament of the United Kingdom created to protect users, especially children, from the onslaught of harmful or illegal content they face online. It imposes new requirements for online service providers, holding them accountable for reducing the amount of harmful content shared on their platforms. And, it grants the government the power to introduce new codes of practice that enhance online safety for UK citizens.

Who does the Online Safety Act apply to?

The Online Safety Act applies to specific user-to-user services and search services, which the act refers to as ‘regulated services’. These are user-to-user services and search services that don’t meet the criteria for exemption (see Schedule 1 of the act) and that have links to the United Kingdom, meaning at least one of the following applies:

  1. The service has a significant number of United Kingdom users;
  2. United Kingdom users form one of the target markets for the service (or the only target market); or
  3. The service is capable of being used in the United Kingdom by individuals, and there are reasonable grounds to believe that there is a material risk of significant harm to individuals in the United Kingdom arising from content present and likely to be encountered on the service.

Find out if the Online Safety Act applies to your online service

Which online services are exempt?

Online services are exempt when communication is limited to email, SMS, MMS, or one-to-one voice calls, and/or users can only interact with business-created content through comments, ratings, or reactions – not with other users’ content.

What is a user-to-user service?

According to the Online Safety Act, a user-to-user service is “an internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”. In other words, any service that hosts user-generated content.

What is a search service?

A search service is any internet service that is, or includes, a search engine and is not a user-to-user service. A search engine is a service that allows users to search other websites or databases, rather than just its own.

What does the Online Safety Act do?

The Online Safety Act imposes new duties of care on regulated services and gives the UK government significant regulatory powers over online content. The goal is to ensure that all services within the scope of the act are doing what they can to protect their users from illegal or harmful content shared on their platforms.

What is a duty of care?

A duty of care is a legal obligation imposed on individuals and companies, requiring them to meet a standard of reasonable care to keep others safe from harm.

What powers were given to the relevant Secretary of State?

The Online Safety Act granted the relevant Secretary of State the power to set strategic priorities for online safety and to direct Ofcom to modify its codes of practice, but not to micromanage how Ofcom puts those codes into practice.

What are the duties of care imposed on regulated services?

In the case of the Online Safety Act, there are several duties of care imposed on all regulated services, with additional duties of care imposed on regulated services likely to be accessed by children.

Duties of care for all regulated services

All providers of regulated services have the following duties:

  1. The duty to perform an illegal content risk assessment
  2. The duty to implement systems and processes that
    • minimize the presence of illegal content
    • minimize the length of time for which illegal content is present
    • minimize the dissemination of illegal content
    • minimize the time between becoming aware of illegal content and taking down said content
  3. The duty to protect users’ legal rights to privacy and freedom of expression
  4. The duty to implement systems to handle and respond to user complaints
    • Making it easy for their users to report harmful content
    • Taking appropriate action in response to complaints
  5. The duty to keep records of how they perform their duties of care

Duties of care for regulated services likely to be accessed by children

Providers of regulated services that are likely to be accessed by children have the following additional duties:

Children’s risk assessment duties

Regulated services likely to be accessed by children have the duty to:

  1. Conduct regular children’s risk assessments
  2. Take appropriate steps to reduce risks identified for different age groups
  3. Address harmful content impacts based on children’s age groups

Duties to protect children’s online safety

Regulated services likely to be accessed by children have the duty to create and maintain systems that:

    1. Prevent children from seeing harmful content
    2. Protect specific age groups from other harmful content they’re deemed at risk of encountering

Duty to have a clear and accessible Terms of Service

Regulated services likely to be accessed by children have the duty to clearly explain, within their terms and conditions, how they will:

    • Prevent children from seeing primary priority harmful content
    • Protect at-risk age groups from priority harmful content
    • Protect at-risk age groups from other identified harmful content

These terms must be easy to understand and be applied consistently.

Use our Terms and Conditions Generator to create your terms of service.

When do you need to meet the requirements of the Online Safety Act?

According to Ofcom’s deadlines for online safety compliance, if you are classified as a regulated service under the act, you must carry out an illegal content risk assessment by March 2025, when the Illegal Harms Codes of Practice come into force.

Learn how to perform an illegal content risk assessment.

What is the penalty for non-compliance with the Online Safety Act?

Ofcom is empowered to enforce the provisions of the Online Safety Act. Non-compliant platforms may face substantial fines of up to £18 million or 10% of their global revenue, whichever is higher. Additionally, senior managers could be held criminally liable if they fail to ensure their organization complies with information requests from Ofcom.

How to perform an illegal content risk assessment

Step 1: Educate yourself on the types of illegal content that you’re looking for

Examine the 17 types of priority illegal content and determine which ones could appear on your service.

Step 2: Perform a risk assessment on your service

Evaluate how likely illegal content could appear and what impact it would have, using evidence from user reports and service features.

Step 3: Decide on safety measures and record outcomes

Choose and put in place safety measures that match your risk level, then document everything you’ve implemented.

Step 4: Monitor and update risk assessment when required

Track how well your safety measures work, watch for new risks, and review your assessment yearly or before major service changes.
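To make the record-keeping in steps 2–4 concrete, here is a minimal sketch of an internal risk register in Python. The numeric likelihood × impact scoring and the field names are our own assumptions for illustration; Ofcom’s guidance uses qualitative risk levels and does not prescribe any particular format or formula.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical three-level scale; Ofcom's guidance describes
# qualitative levels rather than a numeric formula.
LEVELS = {"low": 1, "medium": 2, "high": 3}

@dataclass
class RiskEntry:
    content_kind: str              # e.g. one of the priority illegal content kinds
    likelihood: str                # "low" | "medium" | "high"
    impact: str                    # "low" | "medium" | "high"
    mitigations: list = field(default_factory=list)  # safety measures in place

    def score(self) -> int:
        # Simple likelihood x impact product, for internal prioritisation only
        return LEVELS[self.likelihood] * LEVELS[self.impact]

@dataclass
class RiskAssessment:
    completed_on: date
    entries: list

    def next_review(self) -> date:
        # Review at least yearly, or sooner before major service changes
        return self.completed_on + timedelta(days=365)

    def highest_risks(self) -> list:
        # Highest-scoring entries first, to focus mitigation effort
        return sorted(self.entries, key=lambda e: e.score(), reverse=True)

# Example: record two assessed risks and their mitigations
assessment = RiskAssessment(
    completed_on=date(2025, 3, 1),
    entries=[
        RiskEntry("fraud", "low", "high", mitigations=["user reporting flow"]),
        RiskEntry("terrorism content", "high", "high",
                  mitigations=["hash matching", "moderator review"]),
    ],
)
```

A structured record like this makes it straightforward to document what was assessed, what measures were chosen, and when the assessment is next due for review.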

The deadline for completing your first illegal content risk assessment is March 16, 2025. For more guidance, refer to Ofcom’s Risk Assessment Guidance and Risk Profiles.