English football has proposed a set of recommendations to further strengthen the Online Safety Bill and ensure victims of online abuse are adequately protected.
For some time, football, from the grassroots to the elite, has been impacted by vile online hate and discrimination. The language used is debasing, and often threatening and illegal. Similarly, emojis and memes are used to peddle legal but harmful messages. The victims are not only football players, but also their families, referees, coaches, administrators, pundits, fans and others in the game. The abuse is not virtual: it is real. It is directed at real people who are real victims.
Despite discussions with Twitter, Facebook and other social media companies for several years now, sadly there has been no real sign of significant proactive change that addresses the problem. The Online Safety Bill represents the greatest opportunity so far to regulate social media companies and enforce real change.
English football has created a list of proposals to strengthen the draft Online Safety Bill.
These proposals were presented by former footballer and presenter Rio Ferdinand alongside Kick It Out Chair Sanjay Bhandari and FA director Edleen John, on behalf of English football at the Joint Select Committee on Thursday 9 September.
OUR RECOMMENDED CHANGES TO THE ONLINE SAFETY BILL
1. Protection for groups identified as at risk under existing legislation:
Parliament has afforded certain groups statutory protection from discrimination, under the Equality Act 2010. We are calling on the Government to ensure that the same protection is afforded to these groups online.
2. Ofcom should be given powers on ‘legal but harmful’ content:
At present, the Online Safety Bill appears to give Ofcom enforcement powers on illegal harms - but on legal harms (i.e. "content that is harmful to adults") there is no duty of care and Ofcom has no tools to intervene.
3. The reach of anonymous accounts should be part of the Codes of Practice:
Social media companies can verify their users' identities, and indeed already do so when confirming the authenticity of public figures' accounts. Some form of verification could be required for all accounts, with verification data only shared if required by law enforcement.
4. Discrimination and hate speech should be the subject of specific codes of practice:
Discrimination and hate speech should be the subject of specific codes of practice (as is the case for child sexual exploitation and terrorism content) – to signify the seriousness of this abuse, and to clarify the standards and best practice that social media platforms will be expected to adopt.
5. Discrimination and hate speech should be ‘priority illegal content’:
Discrimination and hate speech should be categorised as “priority illegal content” in the Online Safety Bill, to put an increased obligation on service providers to take positive actions to minimise the presence of such content on their platforms.
6. Obligation on providers to manage the risks of content which is harmful to adults:
There should be a specific obligation on providers to specify in their terms of service how they will mitigate and manage the risks of content which is harmful to adults, including racism and/or hate speech.
7. The Secretary of State should have enforcement powers:
There should be a statutory power to enable the Secretary of State for Digital, Culture, Media and Sport to clearly specify content that is harmful to adults in secondary legislation, as well as mandatory minimum steps that should be taken to deal with such harmful content.
8. Comments on news publishers’ platforms should be included within the scope of the Bill:
Exemptions in relation to news publisher-operated social media platforms and comments made in response to newspaper articles create a loophole whereby discrimination and hate speech are left out of the scope of the Bill - and leave an online environment where discriminatory comments and emojis can continue to be published, normalised and shared. We believe that this exemption should be removed, at least in relation to discrimination and hate speech.
9. Transparency reporting requirements should be defined by the Bill:
We would like increased certainty in the Online Safety Bill in relation to the transparency reports that services will be required to submit and/or publish. For example - how effective would a football banning order for online abuse be if the data transparency reporting showed that 70% of abusers are overseas and never attend English football games? We’re clear that would require a different policy intervention.
10. Social media companies should be required to assist the authorities with their criminal investigations:
Part of the safety duties should be to ensure that illegal content is always passed promptly to the appropriate authorities.
STRENGTHENING THE PROSECUTION SYSTEM
We also recommend that the Home Office should consider including football as a specific priority in a well-resourced Hate Crime Unit. Local police forces across the country and the Crown Prosecution Service would work hand-in-hand with football, the public authorities and social media companies to provide a proactive, joined up approach to address online discrimination against any protected characteristics as specified under the Equality Act 2010.
This would send a strong message about the intentions and direction of public policy, resource allocation and priorities in addressing this important issue - with the national game helping to change attitudes and behaviours across the whole of society.