Washington, May 3: YouTube is one of the most popular video platforms in the world, but it can also host violent or manipulative content. To limit such content, Google manually reviewed more than a million suspected "terrorist videos" on YouTube in the first three months of 2019.
Google disclosed in an April 24 letter made public on Thursday that the manual review found 90,000 videos violated its terrorism policy.
In March, following the live-streaming on social media of a mass shooting in New Zealand, the chair of the U.S. House Committee on Homeland Security insisted that the top executives of Google, Facebook Inc, Twitter Inc, and Microsoft Corp do a better job of removing violent political content. After a briefing in March, Representative Max Rose, who chairs a subcommittee on intelligence and counter-terrorism, asked the four companies in an April 10 letter to reveal their budgets for counter-terrorism programs and the number of people working solely on such programs.
Rose said in a statement that Facebook has not responded and that the other firms did not fully or directly answer his questions.
Twitter said in an April 24 letter to Rose that "putting a dollar amount on our broader efforts is a complex request". Twitter said a "substantial portion" of its 4,100-person global workforce is involved in reviewing content.
Google said in its letter it has more than 10,000 people working across the company on content review.