PFA Charity Study Finds 43% Of Premier League Players Studied Received Racist Abuse

"Football – and social media platforms – need to step up"

Words by Jacob Davey
Oct 22, 2020

A PFA Charity report has revealed significant blind spots in the fight against online abuse, with 43% of the Premier League players who featured in the study experiencing targeted and explicitly racist abuse.

The PFA Charity’s study, conducted by Signify Group and supported by Kick It Out, examined 44 high-profile current and former players from across the top divisions of English football, using machine-learning systems to analyse messages sent publicly via Twitter.

In the six weeks following Project Restart, 825,515 tweets were directed at the selected players, of which 3,000 were explicitly abusive; 56% of the discriminatory abuse identified was racist.

The study also found that Twitter’s algorithms were not effectively blocking or taking down racially abusive posts sent using emojis, which accounted for 29% of the racist messages players received. The use of emojis remains a significant blind spot in tackling online abuse.

England and Manchester City forward Raheem Sterling said: “I don’t know how many times I need to say this, but football and the social media platforms need to step up, show real leadership and take proper action in tackling online abuse. The technology is there to make a difference, but I’m increasingly questioning if there is the will.”

The findings in this report provide clear evidence that cannot be ignored by the footballing bodies and social media platforms involved in regulating online abuse. You can read the report in full here; please note that it contains abusive and racist language.

