Facebook Asking Users if They’re Worried that Friends are ‘Becoming Extremists’


Social media giant Facebook appears to be taking further steps to rein in the extremist content that has become increasingly common on social media over the last several years.

Some users of the service are being asked if they are worried that certain friends are becoming extremists, according to a Facebook spokesperson. Users are also being notified that they may have been exposed to extremist content, according to screenshots being shared on Twitter.

One of the prompts, shared on Twitter, asks “Are you concerned that someone you know is becoming an extremist? We care about preventing extremism on Facebook. Others in your situation have received confidential support.”

Another alert reads “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”

CNET reports that the links in these prompts lead to support resources.

Social media companies with enormous user bases, such as Facebook and Twitter, have for years been under pressure to remove extremist content from their platforms, in the hope that extremist posters can be stopped online before their violence spills over into the real world.

Facebook has intensified its focus on this objective amid increased scrutiny of the role its platform played in the buildup to the US Capitol riot in January, which led to the deaths of five people and left around 140 others injured.

Facebook’s Redirect Initiative

These anti-extremism prompts are part of Facebook’s Redirect Initiative, which aims to combat violence on the platform by redirecting users who search for hate- or violence-related terms towards educational resources and outreach support groups.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” said a Facebook spokesperson in a statement.

“We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.”

Facebook has said it removed more content in the fourth quarter for violating the platform’s rules against hate speech, harassment, nudity and other offensive content. The company took down 26.9 million pieces of hate speech content, up from 22.1 million in the third quarter.

By Luis Monzon
