Crossing The Line: Seven In Ten Premier League Footballers Face Twitter Abuse
As the new season warms up for kick-off, Ofcom reveals the scale of personal attacks suffered by Premier League footballers every day on Twitter, and sets out what must be done collectively to tackle the issue.
Ofcom, which is preparing to regulate tech giants under new Online Safety laws, teamed up with The Alan Turing Institute to analyse more than 2.3 million tweets directed at Premier League footballers over the first five months of the 2021/22 season. 
For the study, researchers developed a new machine-learning tool that automatically assesses whether tweets are abusive. A team of experts also manually reviewed a random sample of 3,000 tweets.
What we found
• The vast majority of fans use social media responsibly. Of the manually reviewed random sample of 3,000 tweets, 57% were positive towards players, 27% were neutral and 12.5% were critical; the remaining 3.5% were abusive. Similarly, of the 2.3 million tweets analysed with the machine-learning tool, 2.6% contained abuse.
• Hundreds of abusive tweets are sent to footballers every day. While the proportion of abusive tweets might be low, this still amounts to nearly 60,000 abusive tweets directed towards Premier League players in just the first half of the season – an average of 362 every day, equivalent to one every four minutes. Around one in twelve personal attacks (8.6%) targeted a victim’s protected characteristic, such as their race or gender.
• Seven in every ten Premier League players are targeted. Over the period, 68% of players (418 out of 618) received at least one abusive tweet, and one in fourteen (7%) received abuse every day.
• A handful of players face a barrage of abuse. We recorded which footballers were targeted, and found that half of all abuse towards Premier League players was directed at just twelve individuals. These players each received an average of 15 abusive tweets every day.
We also asked the public about their experiences of players being targeted online through a separate poll. More than a quarter of teens and adults who go online (27%) saw online abuse directed at a footballer last season. This increases to more than a third of fans who follow football (37%) – and is higher still among fans of the women’s game (42%).
Among those who came across abuse, more than half (51%) said they found the content extremely offensive, yet a significant proportion (30%) took no action in response. Only around one in four (26%) used flagging or reporting tools to alert the platform to the abusive content, or marked it as junk.
Ofcom is holding an event today (2 August) to discuss these findings. Hosted by broadcast journalist and BT Sport presenter, Jules Breach, the event will hear from presenter and former England player Gary Lineker; Manchester United player Aoife Mannion; Professional Footballers’ Association Chief Executive Maheta Molango; and Kick It Out Chair Sanjay Bhandari.
What needs to be done
Kevin Bakhurst, Ofcom’s Group Director for Broadcasting and Online Content, said: “These findings shed light on a dark side to the beautiful game. Online abuse has no place in sport, nor in wider society, and tackling it requires a team effort.
“Social media firms needn’t wait for new laws to make their sites and apps safer for users. When we become the regulator for online safety, tech companies will have to be really open about the steps they’re taking to protect users. We will expect them to design their services with safety in mind.
“Supporters can also play a positive role in protecting the game they love. Our research shows the vast majority of online fans behave responsibly, and as the new season kicks off we’re asking them to report unacceptable, abusive posts whenever they see them.”
Dr Bertie Vidgen, lead author of the report and Head of Online Safety at The Alan Turing Institute, said: “These stark findings uncover the extent to which footballers are subjected to vile abuse across social media. Prominent players receive messages from thousands of accounts daily on some platforms, and it wouldn’t have been possible to find all the abuse without these innovative AI techniques.
“While tackling online abuse is difficult, we can’t leave it unchallenged. More must be done to stop the worst forms of content to ensure that players can do their job without being subjected to abuse.”
What will online safety laws mean?
The UK is set to introduce new laws aimed at making online users safer, while preserving freedom of expression. The Online Safety Bill will introduce rules for sites and apps such as social media, search engines and messaging platforms – as well as other services that people use to share content online.
The Bill does not give Ofcom a role in handling complaints about individual pieces of content. The Government recognises – and we agree – that the sheer volume of online content would make that impractical. Rather than focusing on the symptoms of online harm, we will tackle the causes – by ensuring companies design their services with safety in mind from the start. We will examine whether companies are doing enough to protect their users from illegal content, as well as content that is harmful to children.