
The Online Safety Act 2023 aims to protect UK citizens from harmful content found online, but how does it aim to achieve this? Here’s everything you need to know about the act, including how to comply with it if your service is regulated by it.
Sort your terms of service for the Online Safety Act
Terms and Conditions Generator

The Online Safety Act 2023 is an act of the Parliament of the United Kingdom created to protect users, especially children, from the onslaught of harmful or illegal content they face online. It imposes new requirements on online service providers, holding them accountable for reducing the amount of harmful content shared on their platforms. It also grants the government the power to introduce new codes of practice that enhance online safety for UK citizens.
The Online Safety Act applies to specific user-to-user services and search services, which it refers to as ‘regulated services’. These are user-to-user services and search services that don’t meet the criteria for exemption (see Schedule 1 of the act) and that have links with the United Kingdom, meaning they have a significant number of UK users, the UK is one of their target markets, or they are capable of being used in the UK and pose a material risk of significant harm to UK users.
Find out if the Online Safety Act applies to your online service
Online services are exempt when communication is limited to email, SMS, MMS, or one-to-one live voice calls, or when users can only interact with business-created content through comments, ratings, or reactions – not with other users’ content.
According to the Online Safety Act, a user-to-user service is “an internet service by means of which content that is generated by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user, or other users, of the service”. In other words, any service that hosts User Generated Content.
A search service is any internet service that is, or includes, a search engine and is not a user-to-user service. A search engine is a site that allows users to search other websites or databases, rather than just its own content.
The Online Safety Act imposes new duties of care on the regulated services it applies to and gives the UK government significant regulatory powers over online content. The goal is to ensure that all services within the scope of the act are doing what they can to protect their users from illegal or harmful content shared on their platforms.
A duty of care is a legal obligation imposed on individuals and companies, requiring them to meet a standard of reasonable care to keep others safe from harm.
The Online Safety Act granted the relevant Secretary of State the power to direct and strategize with Ofcom to adapt codes of practice for online safety, but not to micromanage how Ofcom puts these codes into practice.
In the case of the Online Safety Act, there are several duties of care imposed on all regulated services, with additional duties of care imposed on regulated services likely to be accessed by children.
All providers of regulated services have the following duties:
Providers of regulated services that are likely to be accessed by children have additional duties:
Regulated services likely to be accessed by children have the duty to:
Regulated services likely to be accessed by children have the duty to create and maintain systems that:
Regulated services likely to be accessed by children have the duty to clearly explain within their terms and conditions, how they will:
These terms must be easy to understand and be applied consistently.
Use our Terms and Conditions Generator to create your terms of service.
According to Ofcom’s deadlines for online safety compliance, if you are classified as a regulated service under the act, you must carry out an illegal content risk assessment by March 2025, when the Illegal Harms Codes of Practice come into force.
Learn how to perform an illegal content risk assessment.
Ofcom is empowered to enforce the provisions of the Online Safety Act. Non-compliant platforms may face substantial fines of up to £18 million or 10% of their global revenue, whichever is higher. Additionally, senior managers could be held criminally liable if they fail to ensure their organization follows information requests from Ofcom.
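To make the “whichever is higher” rule concrete, here is a minimal sketch in Python. The function name and the revenue figures are illustrative, not from the act or Ofcom’s guidance:

```python
def max_penalty(global_revenue_gbp: float) -> float:
    # Maximum fine under the Online Safety Act: the greater of the
    # £18 million fixed cap or 10% of qualifying worldwide revenue.
    # (Function name and simplification are ours, for illustration only.)
    return max(18_000_000, 0.10 * global_revenue_gbp)

# A service with £500m global revenue: 10% (£50m) exceeds £18m, so £50m applies.
print(max_penalty(500_000_000))
# A smaller service with £100m global revenue: 10% (£10m) is below the floor, so £18m applies.
print(max_penalty(100_000_000))
```

In other words, the £18 million figure acts as a floor on the maximum fine; for any service with more than £180 million in global revenue, the 10% figure governs.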
1. Examine the 17 types of priority illegal content and determine which ones could appear on your service.
2. Evaluate how likely illegal content is to appear and what impact it would have, using evidence from user reports and service features.
3. Choose and put in place safety measures that match your risk level, then document everything you’ve implemented.
4. Track how well your safety measures work, watch for new risks, and review your assessment yearly or before major service changes.
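If you want to track these four steps systematically, a simple record structure can help. This is an illustrative sketch only; the field names and values are hypothetical and are not taken from Ofcom’s guidance:

```python
from dataclasses import dataclass, field

@dataclass
class IllegalContentRiskAssessment:
    """Illustrative record mirroring the four assessment steps above."""
    harms_identified: list = field(default_factory=list)     # step 1: relevant priority illegal content types
    risk_ratings: dict = field(default_factory=dict)         # step 2: likelihood and impact per harm
    measures_documented: list = field(default_factory=list)  # step 3: safety measures implemented
    last_reviewed: str = ""                                  # step 4: date of last review

# Hypothetical usage: record one identified harm and its mitigation.
assessment = IllegalContentRiskAssessment()
assessment.harms_identified.append("fraud")
assessment.risk_ratings["fraud"] = "medium likelihood, high impact"
assessment.measures_documented.append("user reporting flow")
assessment.last_reviewed = "2025-03-16"
```

Keeping a record like this in one place makes the yearly review (step 4) straightforward: you can see at a glance which harms were assessed, what was done about them, and when the assessment was last revisited.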
The deadline for completing your first risk assessment is March 16, 2025. For more guidance, refer to Ofcom’s Risk Assessment Guidance and Risk Profiles.