Physical attacks track spikes in hate speech on Twitter, researchers say

SAN FRANCISCO — Earlier this month, the FBI charged a Florida man with making detailed threats online to murder 100 gay men. He had previously called for black people to be killed and said he planned to fire tear gas at a synagogue, according to the criminal complaint.

Suspect Sean Michael Albert, who said he was joking, has found more to like on Twitter since Elon Musk took over. The last 11 tweets he liked before his arrest came from either Musk or Andrew Tate, the kickboxer accused of human trafficking whom Musk recently reinstated on the platform.

There is no evidence that what Albert saw on Twitter inspired him to post his own messages, which court documents say were posted on Discord, and his attorney did not respond to a request for comment. But former employees and online researchers say physical attacks in the United States have followed Twitter spikes in certain categories of hate speech, including anti-Semitic and anti-gay slurs and rhetoric.

New research to be published later this month by the misinformation tracker Network Contagion Research Institute suggests a link between real-world incidents and variations of the word "groomer," often aimed at gay people to suggest they are adults determined to seduce children. Although polls indicate that a significant minority of the population believes otherwise, gay people are no more likely than straight people to be predators.

Before Musk, Twitter classified the word “groomer” as hate speech. But usage started to spike soon after Musk announced he would buy the platform, and it has spiked several times since, often after real-life incidents like the fatal shooting at a Colorado gay club.

"Over the past three to four months we have seen an increase in anti-LGBTQ incidents, and you can see a statistical correlation between these actual incidents and the increased use of the term 'groomer' on Twitter," said Alexander Reid Ross, a Network Contagion analyst who shared the findings with The Washington Post. He did not say that use of the term had led to violence.

The second-biggest spike in tweets with the word "groomer" came right after Musk took over Twitter. The largest, at more than 4,000 in one day, came in late November, shortly before a record seven daily anti-gay attacks were recorded by the Armed Conflict Location and Event Data Project, a nonprofit that tracks political violence around the world, Ross said.

While hateful language preceding the incidents raises the possibility that it inspired violence, the language of hate also matters afterward, experts say. Calling victims of violence "groomers" "fuels this highly pressurized and toxic discourse that condemns victims and thereby justifies further activity," Ross said.

Musk plays a role not just by relaxing speech policies and reducing moderation staff, but through his personal choices about whom he engages with, researchers say.

Recent anti-Semitic incidents have included direct references to the rapper Ye, who tweeted that he would go "death con 3 On JEWISH PEOPLE" after Musk welcomed him back to Twitter following his suspension from Instagram.

Even after Musk suspended Ye again, tweets referencing Jewish “privilege” or “supremacy” increased, according to Joel Finkelstein, director of Network Contagion and senior fellow at Rutgers University.

An assailant who attacked a man in New York’s Central Park last month shouted “Kanye 2024” accompanied by anti-Semitic comments, police said. In November, vandals spray-painted “Kanye was rite” and swastikas on headstones at a Jewish cemetery in Illinois.

Hate crimes against Jews in New York rose from nine in September to 45 in November, to account for more than half of bias incidents in the city, according to New York Police Department statistics.

White nationalists and some Black Americans sometimes amplified each other, Finkelstein said. Neo-Nazi groups posted memes on image boards casting Ye as a heroic new Hitler, while Cynthia McKinney of Georgia, a Black Green Party activist who served six terms in Congress, tweeted that 2022 was "the year of #TheNoticing, the year gaslighting finally started failing!" The hashtag, driven by die-hard anti-Semites on Twitter and the 4chan image board, refers to a purported awakening to the notion that Jews hold influential positions. McKinney did not respond to requests for comment.

Finkelstein has seen the same patterns before, including during the conflict between Israel and Hamas in May 2021. A team of analysts from Network Contagion and elsewhere reviewed 6 billion tweets and Reddit posts and recently found that the volume of tweets using human rights language was a better predictor of both street protests and anti-Semitic incidents in the United States than the actual fighting in the Middle East was.

"We found that alongside the fighting there is a massive increase in words like colonialism and apartheid, and then there are incidents," said Aviva Klompas, chief executive of Boundless, a nonprofit group that also worked on the study. "Then you see the long tail of this weaponized language, and the incidents keep happening."

Twitter keeps both legitimate debate and hate alive longer and spreads them more widely, Finkelstein said: "The wars that are going on around the world are also being fought online, and social media has become the weapon that moves a local conflict into a clash of civilizations."

Along with firing most members of Twitter's trust and safety team and disbanding its external safety advisory board, Musk has reinstated accounts that stoke extremism and tweeted an image of Pepe the Frog, a mascot of the alt-right.

He also went out of his way to criticize former trust and safety chief Yoel Roth, who resigned after the November midterms and criticized Musk's habit of making content rules on the fly. Musk took an old Roth tweet linking to an article about a court ruling involving a teacher who had sex with an 18-year-old student and added "that explains a lot," amid a push to portray himself as a staunch enemy of child sexual abuse imagery and Roth as someone who had let it slide on Twitter, prompting hordes of Twitter users to call Roth a "groomer."

Although Musk has said "incitement to violence" remains grounds for suspension, suggestions that Roth should be killed remained on the site after being flagged by a longtime researcher who uses the Twitter account @conspirator0.

Roth fled his home as accounts tweeted images of him, including one of a man feeding another into a wood chipper, with the words "how are you going to get rid of stupid" and a plural epithet. Other images included an uncaptioned firing squad and containers of bullets, one labeled "box of pills that cure paedophilia."

Some of the tweets and accounts were deleted a day later, the researcher said. But similar replies remained up. Roth put his house up for sale and moved out, according to a person in contact with him. He declined to comment.

Musk’s new chief trust and safety officer, Ella Irwin, did not respond to an email seeking comment.

Roth was also singled out in tweets from the influential @LibsofTikTok account, which is run by activist Chaya Raichik and has 1.7 million followers.

The account has long crusaded against medical treatment for young transgender people in hospitals. A focus on Boston Children's Hospital in August preceded threats against doctors there, while a Wisconsin school it targeted over an investigation into the bullying of a transgender student temporarily closed in May because of bomb threats and harassment.

Its spotlights have also been followed by Proud Boys and other violent groups protesting at marches and other events.

The Task Force Butler Institute, a nonprofit counter-extremism group, found last month that 281 LibsofTikTok tweets between April and November mentioned a specific event, place or person. In 66 of those cases, reports of digital harassment or real-world incidents, including death threats and bomb threats, followed. On several occasions, organizers canceled events in response.

Prior to Musk's takeover, complaints about LibsofTikTok sometimes resulted in the removal of individual tweets or week-long suspensions, including twice in September.

After Musk's takeover, there have been no such suspensions, and he has personally interacted with the account, convincing some that reporting it is pointless. "It's no use," said activist Erin Reed, who follows the account closely. Asked to comment, Raichik responded by accusing The Post of inciting violence against her.

The account's favorite topics are drag shows and book readings, especially those open to minors. In November, it spotlighted an upcoming performance at the Sunrise Theater in Moore County, N.C.

Minutes into the December 3 show, the lights went out. Two electrical substations had been shot at and disabled, leaving 40,000 people without power for days. The FBI is investigating and has declined to say whether it believes the blackout was aimed at the show.

Rumman Chowdhury, Twitter’s former director for machine learning ethics and accountability, said escalating hate speech and violence were predictable results of Musk’s decisions but still deeply upsetting.

"It is certainly very shocking. It's very sad to see this thing that so many of us have loved and built being decimated piece by piece," she told The Post. "It's very hard to see where this is heading and how serious it is getting."

Cat Zakrzewski contributed reporting.
