New Delhi: The government on Friday issued notices to social media companies X (formerly known as Twitter), YouTube and Telegram, warning them to remove child sexual exploitation content from their platforms or face legal action.
“We have sent notices to X, YouTube and Telegram to ensure that there is no Child Sexual Abuse Material (CSAM) on their platforms. The government is committed to building a safe and secure internet under the IT laws,” Minister of State for Skill Development and Entrepreneurship and Electronics and IT Rajeev Chandrasekhar said in a statement on Friday.
“The IT Rules under the IT Act place strict expectations on social media intermediaries not to allow offensive or harmful posts on their platforms. If they do not act swiftly, their safe harbor under Section 79 of the IT Act will be withdrawn and they will have to face the consequences under Indian law.”
In a statement to Mint, a YouTube spokesperson said, “We have a zero-tolerance policy on child sexual exploitation content. We do not allow any content that puts minors at risk. We have invested heavily in the technology and teams needed to fight child sexual abuse and exploitation online, and we act quickly to remove such content as soon as possible. In Q2 2023, we removed more than 94,000 channels and more than 2.5 million videos for violating our child safety policies. We will continue to work with experts inside and outside of YouTube to provide the best protections for minors and families.”
Questions sent to Telegram and X were not immediately answered.
The ministry said the notices specify that non-compliance with the requirements will be considered a violation of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021. It warned the three social media intermediaries that any delay in complying with the notices would result in the withdrawal of their safe harbor protection under Section 79 of the IT Act, which currently shields them from legal liability.
The notices sent to these platforms also emphasize the importance of immediately and permanently removing or disabling access to any CSAM on their platforms. They also call for the implementation of proactive measures such as content moderation algorithms and reporting mechanisms to prevent the spread of CSAM in the future.
Some industry executives expressed surprise at the issuance of the notices, saying no immediate compliance issues had been flagged. A senior policy consultant who works with a number of big tech firms said, “We have not yet been able to determine whether a social enterprise was behind triggering this notice, which is often the case. Most of the tech platforms named in Meity’s notice on Friday have been in active discussions with Meity through multiple consultations, and it is not yet clear what the reason may be.”
A senior lawyer who advises global tech firms on IT policy compliance, requesting anonymity because he represents clients in that space, added, “There is no immediate compliance issue that should have triggered this, because the firms named in the notice have actively worked with Meity on a number of issues, including discussions on CSAM itself during the consultation period for the DPDP Act. There have also been no legal triggers in the immediate time frame, and we will have to wait and see how this turns out.”
The Information Technology (IT) Act, 2000 provides the legal framework for dealing with obscene material, including CSAM. Sections 66E, 67, 67A and 67B of the IT Act impose severe penalties and fines for the online transmission of obscene or sexually explicit material.