New Delhi | Updated: Feb 18, 2026 03:23 AM IST
Raising concerns over India’s recently notified social media rules which prescribe strict content takedown timelines, social media giant Meta, which operates platforms like Facebook, Instagram and WhatsApp, said Tuesday that the norms might be “challenging” to comply with from an operational standpoint.
Rob Sherman, Meta's vice president of policy and deputy chief privacy officer, told reporters on the sidelines of the ongoing India-AI Impact Summit: "Operationally, three hours (the takedown window) is going to be really challenging."
According to Sherman, the government had not consulted with the industry before notifying the rules.
“Traditionally, the Indian government has been quite consultative when it comes to these things. This is an example where I think we are concerned that had they come to us and talked to us about it, we would have talked about some of the operational challenges,” he said.
Last week, the IT Ministry notified amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. One of the changes is that social media platforms must now remove problematic content within two to three hours, down from the earlier 24-36 hours. Industry executives had earlier said that the new timeline is the shortest takedown window prescribed by any government in the world.
Although the rules have set alarm bells ringing within tech companies, government officials have said that there were sufficient discussions with the industry, and that the timeline was compressed following feedback from affected people, who complained that platforms did little to quickly curb the virality of harmful content.
The requirement to take down content more quickly applies not just to AI-generated content but to a wide range of content the law deems unlawful. Platforms must now remove non-consensual intimate imagery within two hours, as opposed to 24 hours earlier, and other forms of unlawful content within three hours, down from an earlier requirement to act within 36 hours.
Meta's Sherman said that when social media companies receive takedown notices from the government, it takes them time to investigate and validate the flagged content, and three hours might not be enough for that.
“Whenever we get the request from the government (to take down content), we will have to look into it, we will have to investigate it and validate it ourselves. And so that’s just something that takes some amount of time, particularly if there’s something that we need to look into. That’s often not possible to turn around in three hours,” he said.
Sherman’s comments come amid increasing scrutiny over social media platforms globally. On Tuesday, IT Minister Ashwini Vaishnaw, addressing reporters separately, said many countries were considering banning social media platforms for children. States like Andhra Pradesh and Goa have indicated that such a measure is needed.
“This is something which has now been accepted by many countries that age-based regulation has to be there. It was part of our DPDP (Digital Personal Data Protection) Act when we created this age-based differentiation on the content which is accessible to young people,” Vaishnaw said.
Responding to a question from The Indian Express on India’s data protection law potentially opening up space for localisation of more types of personal data, Sherman said India’s current localisation requirements typically focus on “specific kinds of information that have national security implications”.
He said strict localisation requirements would be logistically difficult for platforms such as WhatsApp, Instagram and Facebook because they are designed for cross-border communication, which requires data to be stored in multiple locations across the world.
© The Indian Express Pvt Ltd

