Act swiftly on CSAM or lose safe harbour

The Ministry of Electronics and Information Technology (MeitY) issued a directive on Friday, instructing social media intermediaries X (formerly known as Twitter), Telegram, and YouTube to proactively remove any child sexual abuse material (CSAM) from their platforms. The ministry warned that any delay in compliance could result in the loss of ‘safe harbour’ immunity.


The ministry emphasised the “prompt and permanent removal” of, or “disabling of access” to, any CSAM posted by users on these platforms. It also called for proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent the dissemination of CSAM in the future.


As previously reported, law enforcement agencies have observed an increase in cybercrimes against children, with a growing use of artificial intelligence to manipulate images, audio, and video to create harmful content. The ministry did not clarify why the notices were sent specifically to these three platforms.


Rajeev Chandrasekhar, Minister of State for Electronics and Information Technology (IT), stated, “We have issued notices to X, YouTube, and Telegram to ensure that no CSAM exists on their platforms. The government is committed to establishing a safe and trusted internet under the IT Rules. The IT Rules, as outlined in the IT Act, establish strict expectations for social media intermediaries not to permit criminal or harmful content on their platforms. Failure to act promptly will result in the withdrawal of their safe harbour under Section 79 of the IT Act, with legal consequences under Indian law.”


Section 79 of the IT Act, 2000, states that an intermediary shall not be held liable for any third-party information, data, or communication links made available or hosted by it. This provision offers a ‘safe harbour’, or immunity, to online platforms against legal action concerning illegal content shared on their platforms. However, this immunity is subject to certain obligations prescribed under the IT Rules.


The ministry also noted that non-compliance with its directives would be considered a violation of Rule 3(1)(b) and Rule 4(4) of the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.


These rules require online intermediaries to remove any user-generated content that is obscene, pornographic, paedophilic, or harmful to children.


According to the due diligence requirements outlined in the rules, platforms must take down unlawful content within 36 hours of receiving a court order or a directive from a government agency.


The IT Act, 2000, also provides a legal framework for addressing pornographic content, including CSAM. Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.
