In May 2019, Facebook invited the organizing bodies of English soccer to its London offices off Regent’s Park. On the agenda: what to do about the growing racist abuse on the social network against Black soccer players.
Image: Matt Williams/The New York Times
At the meeting, Facebook gave representatives from four of England’s main soccer organizations — the Football Association, the Premier League, the English Football League and the Professional Footballers’ Association — what they felt was a brushoff, two people with knowledge of the conversation said. Company executives told the group that they had many issues to deal with, including content about terrorism and child sex abuse.
A few months later, Facebook provided soccer representatives with an athlete safety guide, including directions on how players could shield themselves from bigotry using its tools. The message was clear: It was up to the players and the clubs to protect themselves online.
The interactions were the start of what became a more than two-year campaign by English soccer to pressure Facebook and other social media companies to rein in online hate speech against their players. Soccer officials have since met numerous times with the platforms, sent an open letter calling for change and organized social media boycotts. Facebook’s employees have joined in, demanding that it do more to stop the harassment.
The pressure intensified after the European Championship last month, when three of England’s Black players were subjected to torrents of racial epithets on social media for missing penalty kicks in the final game’s decisive shootout. Prince William condemned the hate, and British Prime Minister Boris Johnson threatened regulation and fines for companies that continued to permit racist abuse. Inside Facebook, the incident was escalated to a “Site Event 1,” the equivalent of a companywide five-alarm fire.
Yet as the Premier League, England’s top division, opens its season Friday, soccer officials said that the social media companies — especially Facebook, the largest — hadn’t taken the issue seriously enough and that players were again steeling themselves for online hate.
Social media companies aren’t doing enough “because the pain hasn’t become enough for them,” said Sanjay Bhandari, the chair of Kick It Out, an organization that supports equality in soccer.
This season, Facebook is trying again. Its Instagram photo-sharing app rolled out new features Wednesday to make racist material harder to view, according to a blog post. Among them, one will let users hide potentially harassing comments and messages from accounts that either don’t follow or recently followed them.
“The unfortunate reality is that tackling racism on social media, much like tackling racism in society, is complex,” Karina Newton, Instagram’s global head of public policy, said in a statement. “We’ve made important strides, many of which have been driven by our discussions with groups being targeted with abuse, like the U.K. football community.”
But Facebook executives also privately acknowledge that racist speech against English soccer players is likely to continue. “No one thing will fix this challenge overnight,” Steve Hatch, Facebook’s director for Britain and Ireland, wrote last month in an internal note that The Times reviewed.
Some players appear resigned to the abuse. Four days after the European Championship final, Bukayo Saka, 19, one of the Black players who missed penalty kicks for England, posted on Twitter and Instagram that the “powerful platforms are not doing enough to stop these messages” and called it a “sad reality.”
Much of the racist abuse in English soccer has been directed at Black superstars in the Premier League, such as Raheem Sterling and Marcus Rashford. About 30% of players in the Premier League are Black, Bhandari said.
Tensions ratcheted up last year after the police killing of George Floyd in Minneapolis. When the Premier League restarted in June 2020 after a 100-day coronavirus hiatus, athletes from all 20 clubs began each match by taking a knee.
That has stoked more online abuse. In January, Rashford used Twitter to call out “humanity and social media at its worst” for the bigoted messages he had received. Two of his Manchester United teammates, who are also Black, were targeted on Instagram with monkey emojis — which are meant to dehumanize — after a loss.
Inside Facebook, employees took note of the surge in racist speech. In one internal forum meant for flagging negative press to the communications department, one employee started cataloging articles about English soccer players who had been abused on Facebook’s platforms. By February, the list had grown to about 20 different news clips in a single month, according to a company document seen by The Times.
English soccer organizations continued meeting with Facebook. This year, organizers also brought Twitter into the conversations, forming what became known as the Online Hate Working Group.
But soccer officials grew frustrated at the lack of progress, they said. There was no indication that Facebook’s and Twitter’s top leaders were aware of the abuse, said Edleen John, who heads international relations and corporate affairs for the Football Association, England’s governing body for the sport. She and others began discussing writing an open letter to Mark Zuckerberg and Jack Dorsey, the chief executives of Facebook and Twitter.
“Why don’t we try to communicate and get meetings with individuals right at the top of the organization and see if that will make change?” said John, who is also the director of equality, diversity and inclusion at the English federation, explaining the thinking.
In February, the chief executives of the Premier League, the Football Association and other groups published a 580-word letter to Zuckerberg and Dorsey, accusing them of “inaction” against racial abuse. They demanded that the companies block racist and discriminatory content before it was sent or posted. They also pushed for user identity verification so offenders could be rooted out.
But, John said, “we didn’t get a response” from Zuckerberg or Dorsey. In April, English soccer organizations, players and brands held a four-day boycott of social media.
In April, Facebook announced a privacy setting called Hidden Words to automatically filter out messages and comments containing offensive words, phrases and emojis. Those comments cannot then be easily seen by the account user and will be hidden from those who follow the account. A month later, Instagram also began a test that allowed a slice of its users in the United States, South Africa, Brazil, Australia and Britain to flag “racist language or activity,” according to documents reviewed by The Times.
The test generated hundreds of reports. One internal spreadsheet outlining the results included a tab titled “Dehumanization_Monkey/Primate.” It had more than 30 examples of comments using bigoted terms and emojis of monkeys, gorillas and bananas in connection with Black people.
‘The Onus Is on Them’
In the hours after England lost the European Championship final to Italy on July 11, racist comments against the players who missed penalty kicks — Saka, Rashford and Jadon Sancho — escalated. That set off a “site event” at Facebook, eventually triggering the kind of emergency associated with a major system outage of the site.
Facebook employees rushed to internal forums to say they had reported monkey emojis or other degrading stereotypes. Some workers asked if they could volunteer to help sort through content or moderate comments for high-profile accounts.
“We get this stream of utter bile every match, and it’s even worse when someone black misses,” one employee wrote on an internal forum.
But the employees’ reports of racist speech were often met with automated messages saying the posts did not violate the company’s guidelines. Executives also provided talking points to employees that said Facebook had worked “swiftly to remove comments and accounts directing abuse at England’s footballers.”
In one internal comment, Jerry Newman, Facebook’s director of sports partnerships for Europe, the Middle East and Africa, reminded workers that the company had introduced the Hidden Words feature so users could filter out offensive words or symbols. It was the players’ responsibility to use the feature, he wrote.
“Ultimately the onus is on them to go into Instagram and input which emojis/words they don’t want to feature,” Newman said.
Other Facebook executives said monkey emojis were not typically used negatively. If the company filtered certain terms out for everyone, they added, people might miss important messages.
Adam Mosseri, Instagram’s chief executive, later said the platform could have done better, tweeting in response to a BBC reporter that the app “mistakenly” marked some of the racist comments as “benign.”
Facebook also defended itself in a blog post. The company said it had removed 25 million pieces of hate content in the first three months of the year, while Instagram took down 6.3 million pieces, or 93%, before a user reported it.
©2021 New York Times News Service