Daily Breach


India Notifies Landmark IT Rules Amendment, 2026, to Regulate Deepfakes and AI-Generated Content

Introduction

The Government of India has formally notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, marking a significant regulatory shift in how synthetically generated information, including deepfakes and AI-generated media, is governed in the country. The notification was issued by the Ministry of Electronics and Information Technology on 10 February 2026 and will come into force on 20 February 2026.

Background / Context

Rapid advancements in generative AI have led to an exponential rise in synthetic audio, visual, and audio-visual content. While these technologies drive innovation, they also introduce serious risks such as misinformation, identity manipulation, election interference, non-consensual imagery, and large-scale social engineering.

The 2026 amendment builds upon the IT Rules, 2021, to directly address these emerging threats by bringing deepfakes and similar AI-generated content within a clear statutory framework.

Key Regulatory Changes at a Glance

1. Statutory Recognition of Synthetic Content

For the first time, the rules formally define “synthetically generated information” as audio, visual, or audio-visual content that is artificially or algorithmically created or altered in a manner that makes it appear real or indistinguishable from authentic individuals or real-world events.

Importantly, the definition excludes good-faith editing, accessibility enhancements, and routine formatting that do not misrepresent reality.

2. Mandatory Labelling and Provenance

Intermediaries facilitating the creation or dissemination of synthetic content must now ensure that such content is:

  • Clearly and prominently labelled as synthetically generated
  • Accompanied by visible or audible disclosures
  • Embedded with permanent metadata or technical provenance markers, including unique identifiers, wherever technically feasible

Together, these requirements aim to ensure that users are immediately aware they are viewing synthetic content and that such content remains traceable to its origin.
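The rules require a unique identifier and machine-readable provenance but do not prescribe a specific schema. The sketch below illustrates, using only the Python standard library, what a minimal provenance record attached to a piece of synthetic media might look like; the field names and structure are purely illustrative assumptions, not a mandated format.

```python
import json
import uuid
from datetime import datetime, timezone

def make_provenance_record(tool_name, content_id=None):
    """Build an illustrative provenance record for synthetic media.

    The schema here is hypothetical: the amended rules call for a
    unique identifier and embedded provenance markers, but leave the
    concrete format to implementers.
    """
    return {
        "synthetic": True,                               # explicit disclosure flag
        "identifier": content_id or str(uuid.uuid4()),   # unique identifier
        "generator": tool_name,                          # tool that produced the content
        "created_at": datetime.now(timezone.utc).isoformat(),  # creation timestamp
    }

record = make_provenance_record("example-image-generator")
print(json.dumps(record, indent=2))
```

In practice such a record would be embedded in the media file itself (for example as image metadata) or bound to it via a signed manifest, so the marker survives redistribution.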

3. Enhanced Due Diligence Obligations

Platforms are required to deploy reasonable and appropriate technical measures, including automated tools, to prevent the creation or circulation of unlawful synthetic content. This includes content that:

  • Falsely depicts real individuals or events in a deceptive manner
  • Contains non-consensual intimate imagery or child sexual abuse material
  • Results in false electronic records or documents
  • Facilitates offences involving explosives, arms, or other serious criminal activity

4. Stricter Responsibilities for Significant Social Media Intermediaries

Significant social media intermediaries must now:

  • Obtain user declarations on whether uploaded content is synthetically generated
  • Verify such declarations using proportionate technical measures
  • Ensure that synthetic content is not published without appropriate labelling

Failure to comply may be treated as a lack of due diligence under the rules.
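The declare-then-verify obligation can be pictured as a simple decision flow: trust an affirmative declaration and label the content, but cross-check a "not synthetic" declaration with an automated detector. The sketch below is a hypothetical policy illustration; the `detector_score` input and the 0.8 threshold are assumptions standing in for whatever "proportionate technical measures" a platform actually deploys.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    declared_synthetic: bool  # user's declaration at upload time
    detector_score: float     # hypothetical classifier confidence in [0, 1]

def label_decision(upload, threshold=0.8):
    """Decide whether an upload must carry a 'synthetically generated' label.

    Hypothetical policy: an affirmative declaration is taken at face
    value; a negative declaration is cross-checked against an automated
    detector, and a contradiction is flagged for review.
    """
    if upload.declared_synthetic:
        return "label"                 # user declared synthetic: label it
    if upload.detector_score >= threshold:
        return "label-and-flag"        # declaration contradicted by detector
    return "publish-unlabelled"        # declaration consistent with detector

print(label_decision(Upload(True, 0.1)))    # → label
print(label_decision(Upload(False, 0.95)))  # → label-and-flag
print(label_decision(Upload(False, 0.05)))  # → publish-unlabelled
```

The "label-and-flag" branch matters for compliance: publishing synthetic content without a label despite detector evidence could be treated as the lack of due diligence the rules penalise.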

5. Accelerated Response Timelines

The amendments significantly shorten compliance and takedown timelines, reinforcing the expectation of near real-time action against unlawful or harmful synthetic content.

Impact / Scope

These changes have wide-ranging implications for:

  • Social media platforms
  • AI model providers
  • Content hosting services
  • Digital publishers and influencers

Organizations operating in India must reassess content moderation systems, AI governance frameworks, metadata handling practices, and user disclosure mechanisms to remain compliant.

Outlook

As generative AI adoption accelerates, these rules are likely to serve as a foundation for further AI-specific legislation in India. Organizations should expect increased scrutiny, evolving compliance standards, and potential alignment with global AI governance frameworks in the near future.

Sources

Adv. Rohan Talreja

About the Author

An Advocate with a professional focus on cyber law, information security, and data protection. His work centres on data protection compliance under India’s Digital Personal Data Protection Act, 2023 and the EU General Data Protection Regulation (GDPR), along with cybersecurity governance and legal risk management. He has experience in contract compliance and the drafting and negotiation of commercial, master service, and vendor agreements. His writing explores the intersection of cyber law and technical cybersecurity, particularly in areas of data privacy, cybercrime, regulatory compliance, and emerging cyber risks, offering practical and policy-oriented insights.
