Twitter and Instagram are exploring ways to stop racist messages or emojis from being posted in the first place, amid a spike in abuse directed towards footballers after the Euro 2020 final.
The racist vitriol aimed at England players who missed penalties in the final, some of which is still visible on the platforms, led to difficult questions for bosses from both firms as they gave evidence before MPs in Westminster.
Former footballer Anton Ferdinand was among the ex-pros also giving evidence to the Home Affairs Select Committee and said social media companies have repeatedly failed to take meaningful action.
"What are the social media companies waiting for?" the former West Ham, Sunderland and QPR defender asked.
"Are they waiting for a high-profile footballer to kill themselves? Or a member of their family to commit suicide - is that what they are waiting for? Because if they are waiting for that it is too late… let's deal with the issue now."
Director of public policy for Instagram, Tara Hopkins, told the MPs that the site removed 15,000 abusive comments in the two days after the Euro final and has now identified strings of emojis that are regularly used to racially abuse players.
She said Instagram's machine-learning technology now includes these emoji strings in the material it flags for removal.
The chair of the committee, Yvette Cooper, produced messages that showed clear racist abuse still visible on Instagram that was posted several weeks ago.
She said to the Instagram boss: "Everything you have said to me just looks like utter garbage compared to seeing these posts."
"I'm sorry that those posts are still up and they should be removed," Ms Hopkins said.
Anton Ferdinand backed calls for more identification checks on people when they open a social media account, and urged platforms to explore how technology could be deployed to identify the context of a tweet before it is sent - a more sophisticated version of predictive text.
"If the context of your tweet is wrong you shouldn't be able to tweet that message," the former footballer said.
Katy Minshall, the head of UK public policy for Twitter, said: "The burden shouldn't be on victims of abuse to report those tweets to us - we have got to a place where 95% of the abusive tweets we were taking down were detected through machine learning.
"I think that we have to think far more about the mechanisms where it is possible to send such tweets in the first place."
She said the firm was introducing a feature that can detect potentially racist content before it is posted and prompt the sender to reconsider what they are sending.
Ms Minshall said that trials had shown that a third of tweets that were questioned by Twitter were then not posted by users.
"I think we have to do far more to prevent it being possible to happen with such ease," she added.
She conceded that the site still needs to do more to tackle the abuse, particularly when it has been reported to them.
"There isn't an excuse when someone has reported something and it breaks our rules and we haven't taken it down and haven't acted appropriately," she said.
Simone Pound, director of equality, diversity and inclusion at the Professional Footballers' Association (PFA), told the committee that more than half of the accounts that sent abuse during the Euro 2020 tournament were UK-based, and that a third of the abuse was in emoji form.
She said: "We are creating a really toxic culture that we need to address because it is going to impact everybody... as well as taking down the posts we need to get to the people sending them.
"They (Twitter and Instagram) are not doing enough - they are multi-billion (pound) companies... and they haven't been able to come up with solutions… what we need to see as quickly as it needs to be done."
Last Thursday, England players were booed while they took the knee in a World Cup qualifier against Hungary, and Raheem Sterling and Jude Bellingham were both taunted with monkey noises from Hungarian fans in the stands in Budapest.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email email@example.com in the UK. In the US, call the Samaritans branch in your area or 1 (800) 273-TALK