‘Remove child sexual abuse material’: India warns X, YouTube, Telegram


The Ministry of Electronics and Information Technology (MeitY) on Friday issued a notice to social media platforms, including Telegram, X (formerly Twitter), and YouTube, to remove child sexual abuse material (CSAM) from their platforms in India.


The notice also calls for proactive measures to take down CSAM and emphasises its prompt and permanent removal.


Minister of state for electronics and IT Rajeev Chandrasekhar has been a vocal advocate of removing such harmful content from the Indian internet, an approach that has become part of the ministry’s policy vision.

MeitY also said that “delay in complying with the notices will result in the withdrawal of their safe harbour protection under Section 79 of the IT Act”.

“There will be ZERO tolerance for criminal & harmful content on Indian Internet. ITRules under the ITAct clearly lays down the expectation from Intermediaries: They cannot host criminal & harmful content like CSAM,” Chandrasekhar wrote on Twitter.


It was not immediately clear what prompted the government to issue a warning to the three platforms.

The notices also state that non-compliance with these requirements will be deemed a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021.

The Information Technology (IT) Act, 2000, provides the legal framework for addressing pornographic content, including CSAM.

Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.

However, safe harbour protection does not automatically disappear. Whether safe harbour has been lost is determined by the courts, not by the executive, such as the IT ministry, and for the courts to make that assessment, someone first has to file a case.


Rule 3(1)(b) requires all intermediaries, irrespective of their size, to make “reasonable efforts” to ensure that their platform does not contain content that is “obscene, pornographic, paedophilic” or “harmful to child” amongst other things.

Rule 4(4) requires significant social media intermediaries, that is, social media platforms with more than 50 lakh (5 million) users in India, to “endeavour to deploy” technology-based solutions to proactively identify and take down CSAM.

When these rules were notified in 2021, multiple legal experts pointed out that language such as “reasonable efforts” and “shall endeavour to” creates ambiguity about who determines whether reasonable efforts have been made and whether a platform has endeavoured to take down CSAM.

Twitter (now X) uses PhotoDNA to detect and take down CSAM. PhotoDNA, developed by Microsoft, computes a perceptual hash (a compact digital fingerprint that survives resizing and minor edits) of an image and matches it against a database of hashes of known CSAM.
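PhotoDNA itself is proprietary and available only to vetted organisations, so the sketch below is not PhotoDNA; it only illustrates the general perceptual-hash matching idea, using the open-source Python library imagehash, with a made-up hash value and threshold standing in for a real database of known material.

import imagehash
from PIL import Image

# Hypothetical set of perceptual hashes of known material (hex value is made up).
KNOWN_HASHES = {imagehash.hex_to_hash("d1d1b1a1e1c1f101")}

# Hamming-distance threshold: a small distance tolerates resizing and minor edits.
MAX_DISTANCE = 5

def matches_known_material(path: str) -> bool:
    """Return True if the image's perceptual hash is close to any known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MAX_DISTANCE for known in KNOWN_HASHES)

# Example: flag an uploaded file for review if it matches a known hash.
if matches_known_material("upload.jpg"):
    print("match found - escalate for review")

In practice, platforms compare uploads against very large hash databases maintained by organisations such as NCMEC rather than an in-memory set, but the matching step works on the same principle.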


However, a New York Times report published in February said that after Elon Musk took over Twitter (now X), CSAM increased on the platform because he had fired the teams experienced in dealing with the problem. In July, Musk reinstated the far-right account “Dom Lucre”, which had earlier been suspended for posting CSAM.

YouTube uses its in-house technology, CSAI Match, which works similarly for videos. HT reached out to Google and Telegram for comment.

“We have a zero-tolerance policy on child sexual abuse material. No form of content that endangers minors is acceptable to us. We have heavily invested in the technology and teams to fight child sexual abuse and exploitation online and take swift action to remove it as quickly as possible. In Q2 2023, we removed over 94,000 channels and over 2.5 million videos for violations of our child safety policies. We will continue to work with experts inside and outside of YouTube to provide minors and families with the best protections possible,” a YouTube spokesperson said in response to the ministry’s notice.


Telegram spokesperson Remi Vaughn responded saying, “Child abuse materials are explicitly forbidden by Telegram’s terms of service. Telegram’s moderators actively patrol public parts of the platform and accept user reports in order to remove content that breaches our terms. In the case of child abuse content, we publish daily reports about our efforts here: t.me/stopca. More than 48,000 groups and channels were removed over the month of September.”

According to the Telegram channel Vaughn cited, 10,312 groups and channels were banned in the first six days of October. Despite the requirement under the IT Rules, 2021, Telegram does not publish an official monthly transparency report, making the channel one of the very few ways in which such information can be obtained.


Source: https://www.hindustantimes.com/india-news/remove-child-sexual-abuse-material-india-warns-x-youtube-telegram-101696652523819-amp.html

Publication date: 2023-10-07 09:00:00

Copyright for syndicated content belongs to the linked source.
