Following in Facebook's footsteps, Google has announced that it will also use AI to better tackle terrorist content.
"Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all," said Kent Walker, Google's general counsel at Google.
Walker said that Google and YouTube are committed to being part of the solution, working with governments, law enforcement and civil society groups to tackle violent extremism online. "There should be no place for terrorist content on our services," he added.
To strengthen its crackdown on such content, Google has pledged further investment in its machine learning technology, improving its ability to automatically detect and remove terrorist content while keeping legitimate videos, such as a BBC News report, online.