Google is intensifying its campaign against online extremism, saying it will put more resources toward identifying and removing videos related to terrorism and hate groups.
The renewed efforts arrive in the wake of violent attacks in the U.S. and elsewhere. A van struck a crowd of people outside a London mosque Sunday; it was the second time this month a vehicle was used as a weapon in that city, and it came less than a week after a gunman attacked GOP lawmakers on a baseball field.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now,” Google said in a blog post.
Anti-hate groups like the Southern Poverty Law Center have been critical of Google and social media sites, saying that they have done little to curtail hate groups online.
Facebook and Google “have done little to counter the use of their platforms to spread hateful, false ‘information,’ from conspiracy theories accusing various minority groups of plotting against America to websites promoting Holocaust denial and false ‘facts’ about Islam, LGBT people, women, Mexicans and others,” the organization said in a report earlier this year.
Google, along with other companies such as Facebook, Microsoft and Twitter, recently agreed to create an international forum to share and develop technology, support smaller businesses and speed up their joint efforts against online terrorism.
On the day of the most recent London attack, Google said it would escalate those efforts.
Google will nearly double the number of independent experts it uses to flag problematic content and expand its work with counter-extremist groups to help identify content that may be used to radicalize and recruit extremists.
It will also train more people to identify and remove extremist and terrorism-related content faster.
Google said it will also take a tougher stance on videos that don’t clearly violate its policies, such as those containing inflammatory religious or supremacist content. Such videos may still appear, but with a warning.
It is also devoting more engineering resources to identifying extremist videos and teaming with Jigsaw to use targeted online advertising to reach potential ISIS recruits and steer them toward anti-terrorism videos.
(AP)