Tackling Online Abuse In Sport: The UK's Online Safety Act 2023

After years in the making, the Online Safety Act (the “OSA”) became law on receiving Royal Assent on 26 October 2023 (as discussed in our blog here).
Amidst the proliferation of social media use, there has been a worrying increase in the levels of abuse faced by players, athletes, officials, managers, coaches and other individuals connected with the sports industry. Notably, for the 2022/2023 football season, Kick it Out, an organisation which campaigns against discrimination in football, reported a 279% increase in reports of online abuse.[1] To give a sense of the scale of, and spikes in, the abuse that those connected with the sports industry can face: abusive comments on Bruno Fernandes’ social media pages increased 3,000% on the day of a penalty miss, and he continued to receive hateful messages every hour for the following two weeks.[2]
Given this context, it comes as little surprise that sports organisations such as the English Football Association (“FA”), Kick it Out, the English Football League (“EFL”), the Premier League and the Professional Footballers’ Association (“PFA”) worked closely with the Government on the OSA to help tackle discrimination against individuals online.[3] Whilst online abuse of footballers often hits the headlines given the sport’s widespread popularity, the issues are unfortunately commonplace across the sporting landscape. The recent Rugby World Cup highlighted not only abuse targeted at players, such as Tom Curry in the wake of England’s semi-final against South Africa[4], but also abuse of, and threats to, a number of officials[5]. A further example is the reported online threats which female tennis players have faced on social media from gamblers.[6]
It is hoped that, if enforced effectively, the OSA could be a significant step forward in improving the online protection of those in sport, as well as of society more broadly. However, as the joint statement issued by the FA warns, there is still a lot to be done before real change is likely to be seen.[7] Despite the OSA being over 250 pages long, the majority of its substantive provisions are not yet in force and require implementation via codes and secondary legislation.
Aims of the OSA
The primary focus of the OSA (as set out in the Government’s Guide here) is to protect both children and adults online by imposing requirements on providers who host user-generated content or facilitate interaction between their users (such as social media platforms), and on search engines, to prevent and remove illegal content on their services. Larger providers are placed under additional obligations to remove content which breaches their own terms and conditions and to provide users with tools allowing them greater control over both the content they see and the other users they interact with.
For children, the focus outlined in the Guide is on: (i) removing illegal content quickly; (ii) implementing access requirements including age checking measures; and (iii) increasing the transparency of risks posed to children on certain social media platforms by publishing risk assessments.
The protection of adults takes a “triple shield” approach which includes: (i) preventing services being used for illegal activity; (ii) imposing obligations on the most high-risk service providers to remove content banned under their own terms and conditions; and (iii) giving users greater control over the content they see and engage with.[8]
Broadly, a key focus of the OSA is on removing “illegal” content. This focus would not cover content which is merely offensive; however, as explained below in this article, illegal content for the purposes of the Act covers material falling within the scope of a broad range of public order offences. With respect to content which is offensive but not illegal, the OSA instead imposes duties on providers to empower users with controls designed to prevent such content being viewable by them.
Whilst these are the aims set out in the Government’s Guide, the OSA is highly granular and contains many more obligations and duties for service providers to comply with.
Who does the OSA apply to?
Broadly speaking, the OSA applies to providers which have “links with the UK” and which provide: (i) user-to-user (“U2U”) services, being internet services through which users can generate, upload or share content that may be encountered by other users of the service; or (ii) search services, being services which include a search engine.
For the purposes of the OSA, providers will be considered to “have links with the UK” if they either: (i) have a significant number of UK users; or (ii) have the UK as one of their target markets (or as their only market). A service will also have the requisite links if it is capable of being used in the UK by individuals and there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK arising from its content.
Significantly, providers need not be based in, or have any physical establishment in, the UK to fall within the scope of the OSA.
Certain U2U and search services will be exempt from the OSA (as set out in Schedule 1 of the OSA). These include, among other things, pure email, SMS or MMS services where those messages are the only user-generated content enabled by the service. Additionally, U2U or search services which are internal business resources are also exempt, provided they meet the criteria for the exemption.
Furthermore, “limited functionality services”, where users are only able to comment on content published by the provider itself (such as comments under a story on a news website), fall outside the scope of the OSA. The OSA also includes specific safeguards for news publisher content and wider journalistic content shared via regulated services, designed to ensure that the OSA does not inadvertently hamper a free press in the UK.
It is likely that a wide range of platforms (on which information will be shared and users can interact with other users) will fall within the ambit of the OSA in some capacity and not just the social media giants and search engines. Consequently, companies who operate such platforms should consider if, how and to what extent they may be affected by the OSA.
Obligations and duties
All providers regulated by the OSA and not otherwise exempt (“Regulated Providers”) are subject to a base level of obligations under the OSA. However, the OSA adopts a tiered approach: the level of obligations to which each Regulated Provider is subject will depend on which (if any) specified category (1, 2A or 2B) applies to it. Category 1 is expected to capture a very small number of the highest-risk platforms, and such providers will have the most onerous obligations imposed on them to protect the individual users of their platforms. Although it is not yet known exactly which platforms will be designated Category 1, they are expected to include the well-known social media platforms[9]. Ofcom is mandated by the OSA to publish a register of all categorised services once the thresholds for each category have been set in secondary legislation; this register is expected to be published by the end of 2024[10]. It is expected, however, that many thousands of businesses will be affected by the OSA in some way.
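By way of illustration only, the tiering exercise might look something like the sketch below. The `Service` type, the user-number thresholds and the recommender-functionality test are all placeholders of ours, since the actual criteria will only be fixed by the forthcoming secondary legislation:

```python
from dataclasses import dataclass

@dataclass
class Service:
    """Attributes a provider might assess once the real thresholds are known."""
    is_u2u: bool
    is_search: bool
    monthly_uk_users: int
    uses_content_recommender: bool  # functionality tests are expected alongside user numbers

# Placeholder figures only -- the actual thresholds will be set by
# secondary legislation and were not known at the time of writing.
CAT_1_USERS = 10_000_000
CAT_2A_USERS = 5_000_000
CAT_2B_USERS = 1_000_000

def provisional_category(s: Service) -> str:
    """Illustrative tiering check; base-level duties apply whatever the outcome."""
    if s.is_u2u and s.monthly_uk_users >= CAT_1_USERS and s.uses_content_recommender:
        return "Category 1"   # most onerous duties, e.g. user empowerment tools
    if s.is_search and s.monthly_uk_users >= CAT_2A_USERS:
        return "Category 2A"  # large search services
    if s.is_u2u and s.monthly_uk_users >= CAT_2B_USERS:
        return "Category 2B"
    return "Uncategorised"    # still subject to the base-level obligations
```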
The specific obligations and duties to be imposed on Regulated Providers are subject to further consultation and regulation, but they broadly cover: illegal content duties and risk assessments; content reporting; complaints procedures; freedom of expression and privacy duties; record-keeping and review; and children’s risk assessment and protection duties.
The obligations in relation to illegal content have the potential to provide an increased level of protection to individuals in sport who are victims of online abuse, since providers will be required, amongst other things, to undertake risk assessments and put in place processes to: (i) improve user safety; and (ii) reduce the occasions on which users encounter illegal content.
Under the OSA, “illegal content” means content consisting of words, images, speech or sounds whose use, possession, viewing, accessing, publication or dissemination amounts to a relevant offence.[11] A number of “priority offences” are listed in the OSA, together with a catch-all provision which brings within its remit any offence where the victim (or intended victim) is an individual or individuals. The priority offences expressly listed in the OSA include, but are not limited to: offences relating to terrorism; child sexual exploitation and abuse; threats to kill; fraud; and public order offences.[12]
Of particular relevance to the present-day sports industry, and in light of the examples of online abuse mentioned at the start of this blog, are the priority offences relating to the Public Order Act 1986, the Protection from Harassment Act 1997 and the Crime and Disorder Act 1998 (and equivalent legislation in Northern Ireland and Scotland). Notably, the OSA will capture “illegal content” amounting to fear or provocation of violence, the use of threatening words or behaviour or the display of threatening written material, harassment, stalking, and racially or religiously aggravated public order or harassment offences. During the drafting of the Online Safety Bill, the FA welcomed the inclusion of hate crime as illegal content.[13]
Additionally, Category 1 providers will be subject to duties to empower the adult users of their services. For example, such providers must include features that allow users to increase their control over seeing abusive content targeting race, religion, sex or gender, sexual orientation, disability or gender reassignment[14]. In addition, Category 1 providers are required to offer all adult users the option to verify their identity and to filter out “non-verified users”. It is up to the provider how it “verifies” users’ identities; for example, this may be achieved via authentication tools or by requiring a user to provide ID when creating an account, and Ofcom must publish guidance on how providers can fulfil this duty. If the filter is activated, it should prevent non-verified users from interacting with the user’s content and reduce the likelihood of the user viewing content which non-verified users generate on the service. Such controls are intended to enable users to limit the threatening and abusive content to which they are exposed, since it is hoped that the removal of anonymity will act as a deterrent to internet trolls.
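To make the mechanics concrete, here is a minimal sketch of how a platform might honour the non-verified-user filter described above. The OSA does not prescribe any particular implementation, and the `User` and `Post` types here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class User:
    handle: str
    is_verified: bool          # verified by whatever means the provider chooses
    filter_non_verified: bool  # the user-empowerment toggle, off by default

@dataclass
class Post:
    author: User
    text: str

def visible_feed(viewer: User, feed: list[Post]) -> list[Post]:
    """Hide content generated by non-verified users when the viewer has the filter on."""
    if not viewer.filter_non_verified:
        return feed
    return [post for post in feed if post.author.is_verified]

def may_interact(actor: User, target: User) -> bool:
    """Block replies/mentions from non-verified users to a user who has opted in."""
    return actor.is_verified or not target.filter_non_verified

# e.g. an anonymous account cannot reach a player who has enabled the filter:
troll = User("anon123", is_verified=False, filter_non_verified=False)
player = User("player9", is_verified=True, filter_non_verified=True)
assert not may_interact(troll, player)
```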
Enforcement
One immediate result of the enactment of the OSA is the appointment of Ofcom as the independent regulator responsible for its enforcement. The joint statement released by the FA called for the Government to ensure Ofcom has “sufficient powers to hold social media companies to account”.[15]
There are various consequences under the OSA for Regulated Providers that fail to comply. These include fines of up to the greater of £18 million or 10% of a provider’s annual global turnover. There is also potential criminal liability for Regulated Providers and/or senior managers in certain instances. In the most extreme cases, Ofcom can, with the agreement of the court, require payment providers, advertisers and internet service providers to stop working with a particular Regulated Provider, preventing it from generating money or being accessed from the UK.
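The “greater of” formulation is worth making explicit: for large providers it is the 10% limb, not the £18 million figure, that sets the ceiling. A short sketch with illustrative turnover figures:

```python
def maximum_fine_gbp(annual_global_turnover_gbp: float) -> float:
    """Statutory cap: the greater of £18m or 10% of annual global turnover."""
    return max(18_000_000, 0.10 * annual_global_turnover_gbp)

# A provider turning over £500m faces a cap of £50m, not £18m:
assert maximum_fine_gbp(500_000_000) == 50_000_000
# For a £100m-turnover provider the £18m floor applies (10% would only be £10m):
assert maximum_fine_gbp(100_000_000) == 18_000_000
```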
Ofcom is under various consultation obligations following commencement of the OSA. It has split these obligations into three phases relating to: (i) illegal content duties; (ii) child safety, pornography and the protection of women and girls; and (iii) transparency, user empowerment and additional duties on categorised services. Each phase will introduce consultations, codes and guidance. The first consultation, relating to illegal content duties, opened on 9 November 2023 and considers how U2U services and search services should approach their new duties relating to illegal content. The timing of this consultation is in accordance with Ofcom’s implementation roadmap, which can be viewed here.
Conclusions
Currently, there is limited detail on the specific new obligations with which Regulated Providers must comply: the detailed guidance, codes and secondary legislation remain subject to further consultation and must be put in place before the OSA is fully in force. However, amidst the current political climate, this may not be a straightforward process. A UK general election is due before 28 January 2025, which could delay Ofcom’s roadmap. Despite this, there is hope that the OSA will, in time, be looked upon as a significant legislative step forward in protecting individuals online. The impact of the OSA, and how far it protects those in the UK sporting sphere from online abuse, will depend on the manner and level of enforcement of its provisions by Ofcom.
In the meantime, those across the sports industry are well advised to follow Ofcom’s consultations (and consider responding where relevant), to track developments in the secondary legislation, and to conduct appropriate risk assessments tailored to their business. Stay tuned for our follow-ups as the implementation of the OSA progresses.