Amendment to IT (Intermediary Guidelines & Digital Media Ethics Code) Rules, 2025 

Posted On - 28 November, 2025 • By - King Stubb & Kasiva

The Ministry of Electronics and Information Technology, through its notification dated 21 October 2025, has introduced the Information Technology (Intermediary Guidelines & Digital Media Ethics Code) Amendment Rules, 2025. This update comes at a time when digitally altered videos, images and voice clips have started resembling authentic content so closely that most users cannot tell the difference. With such material circulating rapidly online, the Government has signalled that the earlier light-touch regulatory approach is no longer sufficient. The amendments formally recognise “synthetically generated information”, meaning content created or modified through automated or algorithmic processes to imitate genuine material, and place a direct responsibility on intermediaries to ensure that users are not misled by it.

Under the revised framework, any platform that permits users to create, upload, modify or share synthetic content must now adopt unambiguous and visible labelling practices. For visual content, this means embedding a permanent marker or indicator showing that the material is synthetic. For audio content, a clear verbal disclaimer must be placed within the first 10% of its duration so that listeners are alerted upfront.
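The 10%-of-duration requirement for audio is a simple arithmetic rule, and a minimal sketch can illustrate how a platform's ingestion pipeline might check it. The function names below are hypothetical, and reading the requirement as the disclaimer *beginning* within the opening window is an assumption about how the Rule will be interpreted, not something the text settles.

```python
def disclaimer_window_seconds(duration_seconds: float) -> float:
    """Length of the opening window (first 10% of the clip's duration)
    within which the Rules require the verbal disclaimer for
    synthetic audio to be placed."""
    if duration_seconds <= 0:
        raise ValueError("duration must be positive")
    return duration_seconds * 0.10


def disclaimer_is_compliant(disclaimer_start: float,
                            duration_seconds: float) -> bool:
    """True if the disclaimer begins within the first 10% of the clip.
    (Assumption: compliance is measured from the disclaimer's start
    time; the Rules themselves do not spell this out.)"""
    return 0 <= disclaimer_start <= disclaimer_window_seconds(duration_seconds)


# For a 5-minute (300-second) clip, the disclaimer would have to
# begin within the first 30 seconds.
```

A real pipeline would pull the duration from the audio container's metadata and locate the disclaimer via transcription; the check itself reduces to this comparison.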

The obligations are even stricter for significant social media intermediaries. Such platforms must now require users to actively declare whether the material they are uploading is synthetic. This declaration cannot simply be taken at face value: the Rules obligate platforms to verify the user’s disclosure through appropriate technical checks. Additionally, intermediaries must ensure that any label placed on synthetic content is easily identifiable and understandable for an ordinary user. Failure to detect undeclared synthetic content, or inaction against misleading uploads, may now be treated as a violation of the intermediary’s due-diligence responsibilities.
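The declare-then-verify flow described above can be sketched as a small decision function. Everything here is an assumption for illustration: the field names, the idea of a classifier score, and the threshold value are not prescribed by the Rules, which leave the choice of "appropriate technical checks" to the platform.

```python
from dataclasses import dataclass


@dataclass
class Upload:
    content_id: str
    declared_synthetic: bool   # the user's own declaration at upload time
    detector_score: float      # output of a hypothetical synthetic-media
                               # classifier, scaled 0.0 (authentic) to 1.0


# Assumed tuning value for illustration only; not taken from the Rules.
SYNTHETIC_THRESHOLD = 0.8


def moderation_action(upload: Upload) -> str:
    """Sketch of the workflow the Rules describe: the platform does not
    take the user's declaration at face value, but cross-checks it with
    its own detection tooling."""
    detected = upload.detector_score >= SYNTHETIC_THRESHOLD
    if upload.declared_synthetic:
        # Declared synthetic: label it, regardless of the detector.
        return "label_as_synthetic"
    if detected:
        # Undeclared but detected: label anyway and flag for human
        # review, since inaction may breach due-diligence obligations.
        return "label_and_flag_for_review"
    return "publish_unlabelled"
```

In practice the "detector" would be one or more provenance or classification checks, and flagged items would feed the platform's existing moderation queue.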

For employers, especially organisations operating digital platforms, creator tools, user-generated content ecosystems or services that enable content modification, this is not a compliance requirement that can be deferred. It calls for a comprehensive review of the platform’s content lifecycle, from user onboarding to publishing, flagging, moderation and archival. Companies will need to examine whether their current content-verification systems are adequate, strengthen tools that detect altered or manipulated material, update user agreements and policies, and ensure that robust audit logs exist to demonstrate compliance if a regulator seeks verification.

The broader regulatory message is unmistakable: as synthetic content becomes easier to produce and faster to spread, intermediaries are expected to anticipate the risks and take proactive steps to protect users from deception. For businesses operating in the digital or platform economy, adherence to these Rules is far more than a routine regulatory formality. It serves as a safeguard against reputational harm, erosion of user trust and increased legal vulnerability in an environment where the distinction between authentic and altered content is narrowing at an unprecedented pace.