In an effort to curb problematic content, YouTube removed more than 58 million videos and 224 million comments for policy violations in the last quarter.
The move comes amid pressure from interest groups and governments, including the United States, the European Union and several Asian countries, to take down extremist and hateful content on social media that could potentially incite violence.
The EU has proposed hefty fines for online services that fail to remove extremist material within an hour of being directed to do so by authorities.
An official at the Ministry of Home Affairs said social media firms had agreed to respond to requests to remove objectionable content within 36 hours.
Automated detection tools help YouTube quickly identify spam, extremist content and nudity.
But automated technology cannot reliably detect videos containing hateful rhetoric or dangerous behaviour, so it falls to users to report problematic videos or comments. As a result, such content may be viewed widely before it is removed.
Parent company Google added thousands of moderators this year and is expanding its moderation staff to 10,000 people, though pre-screening every video remains unfeasible.
About 1.67 million channels were removed, accounting for 50.2 million of the deleted videos. Nearly 80 per cent of the channel deletions were related to spam uploads, 13 per cent involved nudity and 4.5 per cent concerned child safety.
A further 7.8 million individual videos were removed for policy violations.