YouTube Plans New Steps to Fight Terrorism Content on Its Platform

Angelica Greene
June 21, 2017

Google will be expanding its collaboration with groups that specialize in counter-extremism to help identify content that might be used to recruit and radicalize people, the company said in a blog post.

Google also intends to train new "content classifiers" in order to more quickly pinpoint videos created by terrorists without cleansing YouTube or other platforms of legitimate content.

This means the tech giant will, for example, take a tougher position on videos containing supremacist or inflammatory religious content, "even if they do not clearly violate its policies", says Reuters. These videos will now "appear behind a warning" and will not be "monetized, recommended or eligible for comments or user endorsements".

"We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints", Kent Walker, Google's general counsel, said in a recent interview. Google's new measures come weeks after the deadly terror attack in London, after which British Prime Minister Theresa May called for new regulations on Internet companies. Google has, for some time now, been using image-matching technology to prevent people from reloading content that was previously flagged and removed.

Additionally, Google is increasing the number of independent experts involved in its "Trusted Flagger" program of partners that help find inappropriate content. After seven people were killed and 48 injured in an attack in London this month, United Kingdom officials focused on sites seen as enabling extremists to recruit followers, coordinate attacks and spread propaganda.

"In previous deployments of this system, potential recruits have clicked through on the ads at an unusually high rate, and watched over half a million minutes of video content that debunks terrorist recruiting messages", Walker said.


In December, YouTube began working with Facebook, Microsoft, and Twitter to share data with the goal of reducing the spread of terrorist content online.

"While we and others have worked for years to identify and remove content that violates our policies. we, as an industry, must acknowledge that more needs to be done".

"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all".

To expand its counter-extremism efforts, Google will serve more targeted online advertisements that redirect viewers searching for extremist material to anti-terrorism videos. "Google and YouTube are committed to being part of the solution".

Meanwhile, Facebook also recently said it's using artificial intelligence, in partnership with human expertise, to keep terrorist content from groups such as ISIS and Al Qaeda off the platform.
