It's Time to Fix Terrorism on Facebook, Twitter, YouTube, Say Lawmakers


A woman holds a tablet displaying WhatsApp's logo in front of a screen showing the Facebook logo in this photo illustration taken in Prague. REUTERS/David W Cerny

Social media firms including Facebook, Twitter and YouTube are "consciously failing" to stop their sites from being used to promote terrorism and recruit extremists, U.K. lawmakers claimed in a report released on Thursday.

The Commons Home Affairs Select Committee, which is made up of British Members of Parliament (MPs), said that U.S. platforms have become the "vehicle of choice in spreading propaganda" and urged the technology giants to do more to remove extremist content.


"These companies are hiding behind their supranational legal status to pass the parcel of responsibility and refusing to act responsibly in case they damage their brands," the report said.

"If they continue to fail to tackle this issue and allow their platforms to become the 'Wild West' of the internet, then it will erode their reputation as responsible operators."

The lawmakers' accusations come after British authorities made a number of attempts to get Twitter posts and YouTube videos by radical Muslim preacher Anjem Choudary taken offline. Choudary was found guilty by a U.K. court last week of supporting Islamic State.

Social media companies have been taking steps to combat extremist material. A Twitter spokesperson noted that the company has suspended 235,000 accounts related to the promotion of terrorism since February.

Google told MPs that it runs a "trusted flagger" program that lets approved users highlight content they have concerns about; flagged content is then reviewed by YouTube staff. The report said Google claimed an accuracy rate of 90 percent for trusted flaggers. Facebook and Twitter told MPs that they did not have similar schemes but "did have arrangements with government agencies," according to the report.

"We take our role in combatting the spread of extremist material very seriously. We remove content that incites violence, terminate accounts run by terrorist organisations and respond to legal requests to remove content that breaks UK law. We'll continue to work with Government and law enforcement authorities to explore what more can be done to tackle radicalisation," YouTube spokesperson told CNBC in an email.

Simon Milner, Facebook's director of policy in the U.K., said the social network deals "swiftly and robustly" with reports of terrorism-related content.

"In the rare instances that we identify accounts or material as terrorist, we'll also look for and remove relevant associated accounts and content," Milner said.

"Online extremism can only be tackled with a strong partnership between policymakers, civil society, academia and companies. For years we have been working closely with experts to support counter speech initiatives, encouraging people to use Facebook and other online platforms to condemn terrorist activity and to offer moderate voices in response to extremist ones."


Still, lawmakers said that the companies' methods of rooting out extremist content are insufficient.

"It is therefore alarming that these companies have teams of only a few hundred employees to monitor networks of billions of accounts and that Twitter does not even proactively report extremist content to law enforcement agencies," MPs said.

To address this, the report recommends that social media firms be required to publish quarterly statistics showing how many sites and accounts they have taken down, and for what reason. Facebook and Twitter should implement a trusted flagger system like the one Google operates for YouTube, and all three companies should be willing to extend such schemes to smaller community organizations to help flag terrorist material, the MPs said. The lawmakers also called for closer cooperation between tech firms and law enforcement agencies.
