CALIFORNIA, U.S. - In a bid to keep violent and sexual scenarios out of content watched by kids, YouTube is clamping down on disturbing videos aimed at children on its service.
The genre reportedly places family-friendly characters in violent and sexual scenarios.
According to recent reports, the videos are evading filters on the YouTube Kids app.
In August this year, YouTube enforced a policy that restricted creators from monetizing videos that make "inappropriate use of family friendly characters."
YouTube’s latest step will automatically block such content from its kids app.
YouTube's director of policy Juniper Downs said in a statement, “We're in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged. Age-restricted content is automatically not allowed in YouTube Kids."
As part of the move, the platform will add a human element to the policing process.
So far, Google has relied on its algorithms to filter inappropriate content from its kids app.
Google is now hoping that the age-gate feature, combined with its existing safeguards, will stop the disturbing clips from reaching children.
The internet giant said that the new changes will go live within a few weeks.
YouTube’s latest safeguard will also impact its main service, where age-restricted videos are only accessible to signed-in users aged 18 and over.