Tackling Online Abuse In Sport: The UK’s Online Safety Act 2023
Friday, December 1, 2023

After years in the making, the Online Safety Act (the “OSA”) became law on receiving Royal Assent on 26 October 2023 (as discussed in our blog here).

Amidst the proliferation of social media use, there has been a worrying increase in the levels of abuse faced by players, athletes, officials, managers, coaches and other individuals connected with the sports industry. Notably, Kick It Out, an organisation which campaigns against discrimination in football, reported a 279% increase in reports of online abuse for the 2022/23 football season.[1] To give a sense of the scale of, and spikes in, the abuse that those connected with the sports industry can face: abusive comments on Bruno Fernandes’ social media pages increased by 3,000% on the day of a penalty miss, and he continued to receive hateful messages every hour for two weeks afterwards.[2]

Given this context, it comes as little surprise that sports organisations such as the English Football Association (“FA”), Kick It Out, the English Football League (“EFL”), the Premier League and the Professional Footballers’ Association (“PFA”) worked closely with the Government on the OSA to help tackle discrimination against individuals online.[3] Whilst online abuse of footballers often hits the headlines given the sport’s widespread popularity, the issues are unfortunately commonplace across the sporting landscape. The recent Rugby World Cup highlighted incidents of abuse targeted not only at players, such as Tom Curry in the wake of England’s semi-final against South Africa,[4] but also abuse of, and threats to, a number of officials.[5] A further example includes reports of female tennis players facing online threats on social media from gamblers.[6]

If enforced effectively, the OSA could be a significant step forward in improving the online protection of those in sport, as well as society more broadly. However, as the joint statement issued by the FA warns, there is still much to be done before real change is likely to be seen.[7] Despite the OSA running to over 250 pages, the majority of its substantive provisions are not yet in force and require implementation via codes of practice and secondary legislation.

Aims of the OSA

The primary focus of the OSA (as set out in the Government’s Guide here) is to protect both children and adults online by imposing requirements on providers which host user-generated content or facilitate interactions between their users (such as social media platforms), and on search engines, to prevent and remove illegal content on their services. Larger providers are subject to additional obligations to remove content which breaches their own terms and conditions and to provide users with tools giving them greater control over both the content they see and the other users they interact with.

For children, the focus outlined in the Guide is on: (i) removing illegal content quickly; (ii) implementing access requirements, including age-checking measures; and (iii) increasing the transparency of the risks posed to children on certain social media platforms by publishing risk assessments.

The protection of adults takes a “triple shield” approach which includes: (i) preventing services being used for illegal activity; (ii) imposing obligations on the most high-risk service providers to remove content banned under their own terms and conditions; and (iii) giving users greater control over the content they see and engage with.[8]

Broadly, a key focus of the OSA is the removal of “illegal” content. Whilst this focus would not cover content which is merely offensive, illegal content for the purposes of the OSA covers material falling within the scope of a broad range of offences, including public order offences, as explained below. For content which is offensive but not illegal, the OSA imposes duties on providers to give users controls designed to prevent such content being viewable by them.

Whilst these are the aims set out in the Government’s Guide, the OSA is highly granular and contains many more obligations and duties for service providers to comply with.

Who does the OSA apply to?

Broadly speaking, the OSA applies to providers which have “links with the UK” and:

  • host user-generated content (known as user-to-user (“U2U”) services);
  • facilitate private and public online interactions between users;
  • provide search engines (search services); or
  • deliver any service which publishes pornographic content.

For the purposes of the OSA, providers will be considered to “have links with the UK” if they either:

  1. have a significant number of UK users or have the UK as a target market; or
  2. operate services capable of being used in the UK where there is a material risk of significant harm to UK users arising from user-generated content and/or search content (as applicable).

Significantly, providers need not be based in, or have any physical establishment in, the UK to fall within the scope of the OSA. A simplified sketch of this two-limb test appears below.
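
For readers who find it easier to parse the “links with the UK” test as logic, the following is a minimal, illustrative sketch in Python. The function and parameter names are our own labels, not statutory terms, and the wording of the OSA itself governs in practice.

    def has_links_with_uk(
        significant_uk_users: bool,
        uk_target_market: bool,
        usable_in_uk: bool,
        material_risk_to_uk_users: bool,
    ) -> bool:
        """Illustrative rendering of the OSA's two-limb "links with the UK" test."""
        limb_1 = significant_uk_users or uk_target_market      # limb 1: UK user base or target market
        limb_2 = usable_in_uk and material_risk_to_uk_users    # limb 2: usable in UK plus material risk
        return limb_1 or limb_2

    # A provider with no UK user base is still in scope if its service is
    # usable in the UK and poses a material risk of significant harm:
    print(has_links_with_uk(False, False, True, True))  # True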

Certain U2U and search services will be exempt from the OSA (as set out in Schedule 1 of the OSA). These include, among other things, pure email, SMS or MMS services where those messages are the only user-generated content enabled by the service. Additionally, U2U or search services which are an internal business resource are also exempt, provided they meet the criteria for the exemption.

Furthermore, “limited functionality services”, where users are only able to comment on content published by the provider itself (such as comments under a story published on a news website), fall outside the scope of the OSA. The OSA also includes specific safeguards for news publisher content and wider journalistic content shared via regulated services, designed to ensure that the OSA does not inadvertently hamper a free press in the UK.

It is likely that a wide range of platforms on which information is shared and users can interact with other users will fall within the ambit of the OSA in some capacity, not just the social media giants and search engines. Consequently, companies which operate such platforms should consider whether, how and to what extent they may be affected by the OSA.

Obligations and duties

All providers regulated by the OSA and not otherwise exempt (“Regulated Providers”) are subject to a base level of obligations under the OSA. However, the OSA adopts a tiered approach: the level of obligations to which each Regulated Provider is subject will depend on which (if any) specified category (1, 2A or 2B) applies to that Regulated Provider. Category 1 is expected to capture a very small number of the highest-risk platforms, and such service providers will have the most onerous obligations imposed on them to protect the individual users of their platforms. Although it is not yet known exactly which platforms will be categorised as Category 1, they are expected to include the well-known social media platforms.[9] Ofcom is mandated by the OSA to publish a register of all categorised services once the thresholds for each category have been set out in secondary legislation; this register is expected to be published by the end of 2024.[10] It is expected, however, that many thousands of businesses will be affected by the OSA in some way.

The specific obligations and duties to be imposed on Regulated Providers are subject to further consultation and regulation, but broadly include illegal content duties and risk assessments, content reporting, complaints procedures, freedom of expression and privacy duties, record-keeping and review obligations, and children’s risk assessment and protection duties.

The obligations in relation to illegal content have the potential to provide an increased level of protection to individuals in sport who are victims of online abuse, since providers will be required, amongst other things, to undertake risk assessments and put in place processes to: (i) improve user safety; and (ii) reduce the occasions on which users encounter illegal content.

Under the OSA, “illegal content” consists of words, images, speech or sounds the use, possession, viewing, accessing, publication or dissemination of which amounts to an offence specified in other legislation (i.e. a priority offence).[11] There are a number of “priority offences” listed in the OSA, as well as a catch-all provision which brings within the remit of the OSA any other offence where the victim (or intended victim) is an individual or individuals. The priority offences expressly listed in the OSA include, but are not limited to: offences relating to terrorism; child sexual exploitation and abuse; threats to kill; fraud; and public order offences.[12]

Of particular relevance to the present-day sports industry, and in light of the examples of online abuse mentioned at the start of this blog, are the priority offences relating to the Public Order Act 1986, the Protection from Harassment Act 1997 and the Crime and Disorder Act 1998 (and equivalent legislation in Northern Ireland and Scotland). Notably, the OSA will capture “illegal content” amounting to fear or provocation of violence, the use of threatening words or behaviour or the display of threatening written material, harassment, stalking, and racially or religiously aggravated public order or harassment offences. During the drafting of the Online Safety Bill, the FA welcomed the inclusion of hate crime as illegal content.[13]

Additionally, Category 1 providers will be subject to duties to empower the adult users of their services. For example, such providers must include features that allow adult users to increase the control they have over seeing abusive content that targets race, religion, sex or gender, sexual orientation, disability or gender reassignment.[14] In addition, Category 1 providers are required to offer all adult users the option to verify their identity and to filter out “non-verified users”. It is up to each provider how it “verifies” users’ identities; for example, this may be achieved via authentication tools or by requiring a user to provide ID when creating an account. Ofcom must publish guidance on how providers can fulfil this duty. If this filter is activated, it should prevent non-verified users from interacting with the user’s content and reduce the likelihood of the user viewing content which non-verified users generate on the service. Such controls are intended to enable users to limit the threatening and abusive content to which they are exposed, since it is hoped that the removal of anonymity will act as a deterrent to internet trolls.

Enforcement

One immediate result of the enactment of the OSA is the appointment of Ofcom as the independent regulator responsible for enforcing the OSA. The joint statement released by the FA called for the Government to ensure Ofcom has “sufficient powers to hold social media companies to account”.[15]

There are various consequences under the OSA for Regulated Providers that fail to comply. These include fines of up to the greater of £18 million or 10% of a provider’s annual global turnover. There is also potential criminal liability for Regulated Providers and/or senior managers in certain instances. In the most extreme cases, Ofcom, with the agreement of the court, can require payment providers, advertisers and internet service providers to stop working with a particular Regulated Provider, preventing it from generating money or being accessed from the UK.
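
As a quick illustration of how the penalty cap described above operates, here is a minimal arithmetic sketch in Python. The function name is our own, and the assumption that “annual global turnover” maps directly onto the statutory measure of qualifying revenue is for illustration only.

    def max_osa_fine(annual_global_turnover_gbp: float) -> float:
        # The greater of the £18m floor and 10% of annual global turnover,
        # per the cap described above (illustrative only).
        STATUTORY_FLOOR_GBP = 18_000_000
        return max(STATUTORY_FLOOR_GBP, 0.10 * annual_global_turnover_gbp)

    # A provider with £100m turnover faces the £18m floor (10% = £10m < £18m);
    # one with £1bn turnover faces a £100m cap.
    print(f"£{max_osa_fine(100_000_000):,.0f}")    # £18,000,000
    print(f"£{max_osa_fine(1_000_000_000):,.0f}")  # £100,000,000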

Ofcom is under various consultation obligations following commencement of the OSA. It has split these obligations into three phases relating to: (i) illegal content duties; (ii) child safety, pornography and the protection of women and girls; and (iii) transparency, user empowerment and additional duties on categorised services. Each phase will introduce consultations, codes and guidance. The first consultation, relating to illegal content duties, opened on 9 November 2023 and considers how U2U services and search services should approach their new duties relating to illegal content. The timing of this consultation is in accordance with Ofcom’s implementation roadmap, which can be viewed here.

Conclusions

Currently, there is limited detail on the specific new obligations with which Regulated Providers must comply, given that the detailed guidance, codes and secondary legislation are subject to further consultation and must be put in place before the OSA is fully implemented. However, amidst the current political climate, this may not be a straightforward process: a UK general election is due prior to 28 January 2025, which could delay Ofcom’s roadmap. Despite this, there is hope that the OSA will, in time, be looked upon as a significant legislative step forward in protecting individuals online. The impact of the OSA, and how far it protects those in the UK sporting sphere from online abuse, will depend on the manner and level of enforcement of its provisions by Ofcom.

In the meantime, those across the sports industry are well advised to follow Ofcom’s consultations (and consider responding to them where relevant), to continue tracking developments in the secondary legislation, and to conduct appropriate risk assessments tailored to their business. Stay tuned for our follow-ups as the implementation of the OSA progresses.


[1] https://www.kickitout.org/reporting-statistics

[2] Crisp (a Kroll business), “Online Abuse in Sports” (sportspromedia.com)

[3] https://www.thefa.com/news/2023/oct/26/online-safety-act

[4] “Tom Curry targeted with ‘disgusting’ abuse in South Africa race row” (telegraph.co.uk)

[5] “World Rugby to take action against fans over World Cup referee abuse” (telegraph.co.uk)

[6] “Judy Murray: Female tennis players are facing death threats from gamblers on social media” (Sky News)

[7] https://www.thefa.com/news/2023/oct/26/online-safety-act

[8] https://www.gov.uk/guidance/a-guide-to-the-online-safety-bill

[9] See further discussion here: https://www.iptechblog.com/2023/10/uk-online-safety-act-becomes-law/

[10] https://www.ofcom.org.uk/online-safety/information-for-industry/roadmap-to-regulation

[11] Online Safety Act 2023, section 59(2), (3) and (4)

[12] Online Safety Act 2023, section 59(7), Schedule 7

[13] https://www.thefa.com/news/2022/apr/18/fa-welcomes-online-safety-bill-20220418

[14] Online Safety Act 2023, section 15(2) and section 16(4)

[15] https://www.thefa.com/news/2023/oct/26/online-safety-act
