By PTI
NEW DELHI: YouTube on Monday said it had not detected child sexual abuse material on its platform despite multiple investigations and had not received evidence of such content on the video streaming platform from regulators.
The statement from a YouTube spokesperson came after the government issued notices to social media platforms, including YouTube, X (formerly Twitter) and Telegram, earlier this month asking them to take down child sexual abuse material from their platforms in India.
In a statement, a YouTube spokesperson said: “We have a long history of successfully fighting child exploitation on YouTube. Based on multiple thorough investigations, we did not detect CSAM on our platform, nor did we receive examples or evidence of CSAM on YouTube from regulators.”
The video platform owned by Google further said that “no form of content that endangers minors is allowed on YouTube, and we will continue to heavily invest in the teams and technologies that detect, remove and deter the spread of this content.”
“We are committed to working with all collaborators in the industry-wide fight to stop the spread of child sexual abuse material (CSAM),” the YouTube spokesperson added in an e-mail statement.
YouTube has submitted its formal response on the issue. In the second quarter of 2023, YouTube removed over 94,000 channels and over 2.5 million videos for violations of its child safety policy.
According to YouTube, in India it shows a warning at the top of search results for specific search queries related to CSAM. The warning states that child sexual abuse imagery is illegal and links to the national cybercrime reporting portal.
The government, on October 6, said notices have been issued to social media platforms X (formerly Twitter), YouTube and Telegram to remove child sexual abuse material from their platforms in India.
Minister of State for Electronics and IT Rajeev Chandrasekhar had warned that if social media intermediaries did not act swiftly, their safe harbour status under Section 79 of the IT Act would be withdrawn, meaning the platforms could be directly prosecuted under applicable laws and rules even if the content was not uploaded by them.
“Ministry of Electronics and IT has issued notices to social media intermediaries X, YouTube and Telegram, warning them to remove Child Sexual Abuse Material (CSAM) from their platforms on the Indian internet. The notices served to these platforms emphasise the importance of prompt and permanent removal or disabling of access to any CSAM on their platforms,” the statement by the government on October 6 said.
The notices also called for the implementation of proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent the dissemination of CSAM in the future.