Kent Walker, Google's top lawyer, wrote in an op-ed Sunday that the company was taking additional steps to tackle violent extremism online.
"While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now," he wrote in the Financial Times, adding that YouTube (owned by Google) was also taking the same steps.
Lawmakers have blamed the internet for allowing extremism to spread unchecked following recent terrorist attacks in the United Kingdom.
Walker said both companies would take four additional steps to remove content that violates their policies:
- Increase its use of technology to identify extremist and terrorism-related videos, devoting more engineering resources to "apply our most advanced machine learning research to train new 'content classifiers' to help us more quickly identify and remove such content."
- Increase the number of independent experts in YouTube's Trusted Flagger program.
- Place a warning before videos that "contain inflammatory religious or supremacist content" and stop monetizing them.
- Expand its role in counter-radicalization efforts by using targeted ads to reach potential ISIS recruits and redirect them to anti-terrorist videos.
"Collectively, these changes will make a difference," wrote Walker. "And we will keep working on the problem until we get the balance right. Extremists and terrorists seek to attack and erode not just our security, but also our values; the very things that make our societies open and free. We must not let them."