Summary
- India has ordered that platforms must label AI content and embed identifiers, while banning label tampering.
- The government has proposed a 3-hour window to remove flagged deepfakes or AI content.
- Automated tools are to be deployed to block exploitative content, alongside user warnings every three months.
Deepfakes and AI voice-over videos have flooded the internet and are more prominent on Indian social media than ever. The Indian government wants to address this and has issued a strict directive to social media platforms, including Meta and YouTube, to clearly label all AI-generated content.
Social media in India could get an AI label soon
In a move to curb the spread of misinformation and deepfakes, the Indian government has set a mandatory 3-hour deadline for platforms to take down flagged AI-generated content or deepfakes following a government notification or court order.

According to the official order, platforms must ensure that synthetic content, such as AI voiceovers or AI-generated videos, carries an identifier that cannot be removed once applied. On top of that, platforms have also been ordered to use automated detection tools to block the circulation of illegal, deceptive or sexually exploitative AI content.
In addition, platforms must warn users at least once every three months about the consequences of posting derogatory AI content and the legal penalties for AI misuse.
These directions build upon the existing draft amendments to the IT Rules 2021 and target social media companies with over 50 lakh (5 million) registered users. While social media firms have already introduced features to disclose synthetically altered content, this latest order formalises the requirement.
All in all, this is a timely enforcement measure that should keep users from being swayed by misinformation and discourage them from sharing it. It should also bring a higher level of transparency across platforms, safeguarding user trust and minimising the harm caused by deepfakes and unchecked synthetic content.