Social networks struggle to crack down on ‘incel’ movement

Despite years of increasingly strict moderation by the major social networks, the “incel” community remains as influential as it was in 2014, when a 22-year-old Englishman killed six people on the streets of Isla Vista, California, out of misogyny.

The murders were a chilling parallel to the shootings in Plymouth last week. Both killers were active on social media, posting widely about their misogyny and their feelings of hopelessness over their lack of sexual success.

But in the years since 2014, all the major social networks have moved against the movement. Reddit, once home to some of the internet’s largest incel communities, has spent much of the past two years enforcing policies that were previously only loosely applied.

Subreddits such as r/incels and r/theblackpill have been banned for violating “site rules regarding violent content”. The latter was a rallying point for individuals who described themselves as “blackpilled”, a philosophy loosely associated with the incel community in which members describe themselves as having awakened to the true misery of modern life.

In other communities that could easily cross the line into violent extremism, volunteer moderators work hard to keep the conversation from veering into dark places. The ForeverAlone subreddit, for example, is “a place where people who have been alone most of their lives can come and talk about their issues”. The subreddit’s 10 volunteer moderators are unpaid, but they enforce a set of rules that include “be polite, friendly, and welcoming” and a strict ban on “any incel references, slang, or inferences”.

The Plymouth shooter’s Reddit account was suspended on Wednesday, just hours before the attack, again for violating the site’s content policy. A Reddit spokesperson said, “We take these matters very seriously. Our investigation is ongoing.”

Other platforms have been slower to act. YouTube, where the shooter had an account and regularly posted vlog-style videos, also removed his account, though not until Friday, citing the platform’s “offline behaviour” policy. That policy is relatively new: in 2019, YouTube was criticised for not removing content from users such as Tommy Robinson, who were careful to post only videos that stayed within the platform’s rules, even as their off-platform behaviour went far beyond what the service would allow.

“Our thoughts are with those affected by this horrific incident,” a YouTube spokesperson said. “We have strict policies in place to ensure that our platform is not used to incite violence. In addition, we have longstanding policies that prevent those responsible for such attacks from having a YouTube channel, and we have terminated his channel from our platform.”

On Facebook, the incel movement is not banned outright. Only a small handful of designated “hateful ideologies” face blanket bans, including white supremacy and Nazism. Many groups have been banned as “hate organisations”, but that designation does not apply to a leaderless movement. Instead, the site’s restrictions on hate speech largely apply: content that attacks people on the basis of their sex or gender, as well as any content that incites violence, is prohibited.

Despite the actions of the large social networks, the incel community remains influential on the internet. Sites with lax or non-existent moderation policies, such as 4chan and 8kun, host large incel communities, and smaller, dedicated forums are able to set their own moderation policies.
