The Crumbling Shield of Safe Harbor: A Legal Critique of Intermediary Immunity under Indian Law

Posted On - 19 June, 2025 • By - Jidesh Kumar

Introduction

The legal framework surrounding digital intermediaries in India is undergoing a significant transformation. Once safeguarded by the relatively broad protective ambit of Section 79 of the Information Technology Act, 2000 (“IT Act”), intermediaries such as online marketplaces, social media platforms, and messaging services now find themselves increasingly vulnerable to litigation, regulatory scrutiny, and criminal exposure. With the evolution of technology, platforms have shifted from being passive conduits of information to active facilitators of transactions, curators of content, and aggregators of profit.

This article argues that the notion of “safe harbor” under Indian law is no longer tenable in its current form. The scope of immunity granted by Section 79 is riddled with definitional gaps, vague standards, and a compliance structure that is more performative than substantive. More importantly, the application of criminal provisions under the Bharatiya Nyaya Sanhita, 2023 (“BNS”) has opened a new frontier for classifying intermediaries as abettors, conspirators, or recipients of proceeds of crime.

Section 79 IT Act: Statutory Immunity and Its Weaknesses

Section 79(1) provides that an intermediary shall not be liable for any third-party information, data, or communication link made available or hosted by it. However, Section 79(3) immediately qualifies this exemption by removing protection in cases where the intermediary:

  • Has conspired, abetted, aided, or induced the unlawful act;
  • Has actual knowledge and fails to act expeditiously; or
  • Fails to observe due diligence as prescribed.

At first glance, the section appears to offer a balanced mechanism—immunity conditioned upon neutrality and compliance. But closer analysis reveals that the very terms governing the exceptions are legally undefined and dangerously vague.

Due Diligence: A Hollow Concept

The term “due diligence” is nowhere defined in the IT Act. Instead, it is broadly referenced in Section 79(2)(c) and then left to be operationalized by subordinate legislation—specifically, the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (“IT Rules, 2021”).

These Rules require intermediaries to:

  • Publish terms of service prohibiting illegal content;
  • Appoint grievance officers and compliance personnel;
  • Remove unlawful content within 36 hours of receiving a court order or notification from the appropriate government agency;
  • Ensure traceability of the first originator of information (for significant social media intermediaries);
  • Publish monthly compliance reports (for significant social media intermediaries).

Yet these obligations are largely procedural: they do not mandate proactive deterrence or technical safeguards such as content filtering or algorithmic monitoring. In the absence of a legislative definition, Indian courts would logically fall back on a test of reasonableness. But even that test is undefined in the intermediary context, leaving a compliance regime that is largely subjective and inconsistently enforced.

Legal Uncertainty and Litigability: Because due diligence lacks statutory or judicial definition, it is fertile ground for litigation. What qualifies as adequate due diligence? Is issuing a warning enough? Must the intermediary suspend accounts? Must it conduct IP verification? The absence of answers makes every enforcement action turn on judicial interpretation, creating uncertainty for platforms and opportunity for plaintiffs.

Undefined Terms: Conspired, Abetted, Aided, Induced

Section 79(3)(a) states that safe harbor does not apply where the intermediary has “conspired or abetted or aided or induced the commission of the unlawful act.” These terms—particularly “induced” and “abetted”—are borrowed from criminal law, yet remain undefined within the IT Act.

This opens up significant room for challenge:

  • Does promoting a listing amount to inducement?
  • Is algorithmic amplification a form of aiding?
  • Does monetization through commissions or ad revenue constitute abetment?

The use of such terms without statutory definition exposes the entire immunity regime to constitutional scrutiny under Article 14 (equality before the law) and Article 21 (personal liberty, judicially read to require fair and non-arbitrary procedure). Furthermore, since these terms have criminal law analogs, the BNS provides a more aggressive framework to hold intermediaries liable.

Judicial Interpretations and the Erosion of Neutrality

A. Shreya Singhal v. Union of India (2015): In this landmark judgment, the Supreme Court held that intermediaries are not obliged to act until they receive a court order or a government directive, reading down “actual knowledge” under Section 79(3)(b) and thus rejecting a “notice and takedown” model driven by private complaints. However, the Court’s interpretation does not protect intermediaries from liability where there is actual knowledge and failure to act, or where the platform actively participates in content dissemination.

B. Christian Louboutin SAS v. Nakul Bajaj (2018): The Delhi High Court held that an e-commerce platform which exercises control over product listings, logistics, warranties, and customer engagement cannot claim safe harbor. The Court emphasized that where a platform goes beyond mere hosting to facilitating and monetizing, it becomes an active participant, thereby forfeiting its immunity.

C. MySpace Inc. v. Super Cassettes Industries Ltd. (2016): The Delhi High Court observed that platforms cannot ignore repeated takedown requests and continue to host infringing content. The Court introduced the idea of “constructive knowledge”—where repeated violations can imply awareness, even if formal notice is absent. The doctrine of willful blindness was implicitly endorsed.

Criminal Exposure Under Bharatiya Nyaya Sanhita, 2023

The BNS introduces several provisions that can be directly applied to digital intermediaries engaging in—or tolerating—illegal activities:

  • Section 45 – Abetment: Any person who instigates, aids, or intentionally facilitates the commission of an offense is liable as an abettor. This can reach platforms that knowingly allow infringers to operate, promote such content, or fail to remove it despite knowledge.
  • Section 61 – Criminal Conspiracy: When two or more persons agree to commit an illegal act, or a legal act by illegal means, they are liable for conspiracy. Platforms that share revenue with counterfeit sellers or ignore patterns of illegality may be alleged to be in tacit conspiracy.
  • Section 317 – Dishonestly Receiving Stolen Property: Commissions or fees earned from the sale of infringing or counterfeit goods may be characterized as proceeds of crime. Where a platform is aware, or should reasonably be aware, of the illegality, it may be treated as a receiver of stolen property.

Constructive Knowledge and Willful Blindness

Modern platforms operate using advanced analytics, content scanning algorithms, and behavioral tracking. It is implausible to claim ignorance of infringing or criminal activity when the tools to detect and act upon it are readily available.

The doctrine of constructive knowledge—knowledge a party is deemed to have because it ought to have known—coupled with willful blindness—the deliberate avoidance of actual knowledge—creates a compelling basis for denying safe harbor to platforms that fail to act. Courts are increasingly receptive to these doctrines, especially when platforms continue to profit from illegal activities despite prior warnings or litigation.

Policy Failure: Toothless Due Diligence

The IT Rules, 2021 offer no real deterrent. There is no obligation to:

  • Proactively filter or block infringing content;
  • Suspend repeat violators;
  • Impose penalties on users who breach terms;
  • Publicly disclose enforcement outcomes.

In effect, intermediaries can maintain a formal facade of compliance while doing nothing to prevent recurring illegality. This not only negates the intent of Section 79 but also amounts to aiding the normalization of digital illegality.

A Call for Aggressive Enforcement

In light of the above, regulators, plaintiffs, and courts must:

  • Treat procedural compliance as insufficient without substantive action.
  • Invoke the BNS provisions on abetment, criminal conspiracy, and receipt of stolen property to initiate criminal proceedings.
  • Demand discovery of internal moderation and enforcement data during litigation.
  • Reject the defense of neutrality when platforms exercise commercial and editorial control.
  • Encourage constitutional scrutiny of vague terms and subjective standards in Section 79.

Conclusion

The time has come to abandon the fiction that intermediaries are passive actors. Today’s platforms are multi-layered commercial ecosystems that influence content visibility, consumer behavior, and transaction legitimacy. Section 79, as currently drafted and applied, is inadequate, outdated, and dangerously permissive.

With key terms like “due diligence,” “abetment,” and “inducement” left undefined, and with increasing reliance on vague standards like “reasonableness,” the law leaves too much to post facto judicial discretion. This is a legal vacuum that must be filled through litigation, regulatory amendment, and criminal prosecution.

The age of safe harbor is over. The age of platform accountability—backed by civil damages, criminal liability, and public transparency—has decisively begun.