YouTube, the world's largest video-sharing site, has had a broken system for reporting sexualised comments on children's videos for more than a year.
Volunteer moderators say there could be up to 100,000 predatory accounts leaving indecent comments on videos.
A BBC Trending investigation discovered a flaw in the tool that enables the public to report abuse.
YouTube has procedures that can automatically block exploitative and illegal videos.
However, it also relies on users to report illegal content and material that goes against its guidelines.
YouTube's trusted flagger programme, a group that includes law enforcement agencies and charities, identifies comments directed at children.
Some of the comments are shocking and disturbing, including adults' phone numbers and inappropriate video requests.
This is the kind of material that should be removed immediately under YouTube's rules and, in many cases, reported to the authorities.
The site says it has a zero-tolerance policy against any form of grooming or child endangerment.
However, there is a constant worry that those who groom young people are using social media sites such as YouTube to reach them.
YouTube has come under pressure because of the recurrence of illegal and inappropriate content on its platform.
The company has announced additional measures to crack down on disturbing videos.
These include stricter enforcement, terminating channels that could endanger children, and removing potentially harmful ads from certain videos.
A range of guidelines is available to help protect children online, including becoming more aware of what children are doing on the internet, paying close attention to comments made on their videos, and keeping track of popular content posted by children.
In the UK, online grooming can be reported to the Child Exploitation and Online Protection Command (CEOP).