YouTube, the Google-owned video-sharing platform, is stepping up its efforts to fight extremism and terrorism online, following its parent company’s announcement of a campaign to create a safer internet.
YouTube’s policy on content promoting extremist viewpoints hasn’t always been clear, but Google has announced that a change is in order.
The company has pledged to take additional steps to curb extremist content on its platform and has identified four measures it will implement on YouTube to this end.
“We have invested in systems that use content-based signals to help identify new videos for removal. And we have developed partnerships with expert groups, counter-extremism agencies, and other technology companies to help inform and strengthen our efforts,” said Kent Walker, Google’s General Counsel.
In addition to human content reviewers, the company has deployed image-matching technology developed by its own engineers to help prevent identified terrorist content from reappearing on the website.
Google has also collaborated with various counter-extremism agencies and expert groups, both to inform the public and to strengthen its efforts against the growing menace.
“While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done,” Walker added.
The tech titan is also partnering with other industry heavyweights, including Facebook, Microsoft and Twitter, to tackle terrorism online.
Twitter has been making frequent policy changes to tackle cyberbullying and content supporting extremism, while Facebook’s artificial intelligence systems have also joined the fight against online terrorism.
4 Ways YouTube Will Fight Extremist Content
“Collectively, these changes will make a difference. And we’ll keep working on the problem until we get the balance right. Extremists and terrorists seek to attack and erode not just security, but also our values. We must not let them.”
- The company will incorporate technology to quickly identify and remove extremist and terrorism-related videos while distinguishing them from informative news reportage.
- Since technology needs human intervention to work at optimum efficiency, the company will also increase the number of independent experts in YouTube’s Trusted Flagger programme. These experts will assist the automated systems in separating violent propaganda from newsworthy or religious videos.
- Videos that do not directly violate YouTube’s policy but contain ‘inflammatory religious or supremacist content’ will appear behind a warning and will be ineligible for monetisation, comments, user endorsements and recommendations.
- YouTube will step up its counter-radicalisation efforts by implementing the ‘Redirect Method’ across Europe, targeting potential ISIS recruits via online advertising and showing them videos that discourage joining ISIS and similar groups.
“Together, we can build lasting solutions that address the threats to our security and our freedoms. It is a sweeping and complex challenge. We are committed to playing our part,” Walker concluded.
This industry-wide push to tackle one of the real world’s largest threats, one that is spreading quickly online as well, is a welcome and much-needed move.
Since an internet-connected world is an inevitable future, making it a secure place with no room for those seeking to propagate extremism is the right way forward.