Aishani Partners
The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026 – Salient Features

I. Introduction
On 10 February 2026, the Ministry of Electronics and Information Technology notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Amendment Rules, 2026, further amending the Rules.
The Amendment came into effect on 20 February 2026. It amends Rules 2, 3, 4 and 7 of the Rules and introduces a comprehensive regulatory framework governing “synthetically generated information”, in response to the proliferation of artificial intelligence-generated and deepfake content, while excluding good-faith edits from the definition. With these definitions and provisions, India joins the European Union, the State of California (USA) and China in developing regulations addressing illegal deepfakes and AI-generated content.
II. Recognition of Synthetically Generated Information (“SGI”)
The Amendment introduces new statutory definitions and imposes compliance requirements on social media intermediaries relating to the disclosure and labelling of synthetic content.[i] The key definitions introduced by the Amendment are:
- “audio, video or audio-visual information” includes any audio and/or visual content, including but not limited to recordings or images, whether created or modified using a computer resource.
- “synthetically generated information” or SGI is defined as any “audio-visual” material that is artificially or algorithmically created or altered using computer resources so as to appear authentic or indistinguishable from real persons or events, and includes deepfakes. The definition excludes any audio-visual material that results from routine or good-faith editing, formatting, educational or research preparation, or technical enhancements that do not materially alter the underlying content so as to give the impression that the material is authentic or “real-life like”. The definition, by implication, also excludes text-based material that is created artificially.
III. Additional due diligence obligations for intermediaries
Rule 3 of the Rules has been amended to impose due diligence obligations on all intermediaries[ii], including:
- informing users, at least once every three months, of the consequences of contravening the rules and regulations, terms of use or privacy policy of the website/application, including the intermediary’s right to terminate or suspend user accounts, liabilities under specific statutes and criminal liabilities for offences.
Additionally, any intermediary that allows users to create, generate, modify, alter, publish, transmit or disseminate SGI is obliged to:
- inform users that the creation, generation, modification, alteration, publication or transmission of SGI in violation of applicable law, or in the commission of an offence under criminal laws, will oblige the intermediary to:
- disable access to or remove such information;
- suspend or terminate the user account;
- identify such user and disclose such identity to the victim of such contravention;
- where such violation constitutes an offence under criminal law, report such violation to the appropriate authority under applicable law.
Such actions may be triggered when an intermediary becomes aware of a contravention, whether of its own accord, through a complaint by a victim, or through “actual knowledge” received from an appropriate authority.
The Amendment also introduces a new Rule 3(3), imposing compliance duties on intermediaries that enable the creation or dissemination of SGI. Such intermediaries are required to deploy reasonable and appropriate technical measures, including automated tools, to prevent the generation or circulation of SGI that violates applicable law, particularly content involving sexual exploitation, non-consensual intimate imagery, impersonation, false electronic records, or material relating to explosives or unlawful activities.
Where the SGI is lawful, intermediaries must ensure that it is clearly and prominently labelled as synthetically generated, and accompanied by metadata or other appropriate technical provenance (origin tracing) mechanisms. Intermediaries are further prohibited from allowing the removal or alteration of such labels or embedded identifiers.[iii]
Additionally, under new Rule 4(1A), Significant Social Media Intermediaries (“SSMIs”) must require their users to declare whether uploaded content is synthetically generated, verify such declarations through appropriate technical measures, and ensure that approved SGI is displayed with a clear disclosure prior to publication. SSMIs must ensure that no SGI is published without a label, whether identified through the user’s declaration or through the SSMI’s own verification.
Rule 4(4) now mandates that SSMIs deploy appropriate technical measures, including automated tools or other suitable mechanisms, to detect information depicting serious offences such as rape, child sexual abuse or conduct, as well as any duplicates of information already taken down by the intermediary. This obligation was earlier directory in nature and has now been made mandatory. SSMIs are also required to notify users attempting to access such information that it has been identified as violative of law.[iv]
IV. Time limit for Intermediaries to take action
- The time limit available to social media intermediaries to take action upon receipt of actual knowledge of the commission of an act violative of applicable law, or of an offence, has been significantly reduced from 36 (thirty six) hours to 3 (three) hours.[v]
- Further, the time limit for a Grievance Officer appointed by the intermediary to resolve a complaint received from a user or victim has been reduced from 15 (fifteen) days to 7 (seven) days.[vi]
- Further, the time limit for the Grievance Officer appointed by an intermediary to resolve a complaint related to a request for takedown of an illegal or offensive content has been reduced from 72 (seventy two) hours to 36 (thirty six) hours.
- The time limit for an intermediary to remove or disable access to content involving nudity, sexual acts, privacy violations, impersonation, or artificially morphed images, upon receipt of a complaint, has been reduced from 24 (twenty four) hours to 2 (two) hours.
Contributed by Aditi Verma Thakur and Akash Sajan
[i] IT Intermediary Rules r. 2(1)(ca).
[ii] ibid r. 3(1)(c).
[iii] ibid r. 3(3).
[iv] ibid r. 4.
[v] ibid r. 3(1)(d).
[vi] ibid r. 3(2).
