Indian Government Says X Not Immune Under IT Act for Unlawful Social Media Posts

The Indian government has informed the Karnataka High Court that the platform X, owned by Elon Musk, cannot claim unrestricted legal immunity under the 'safe harbour' provision of the Information Technology Act when it hosts unlawful content. The Centre stressed that promoting or tolerating harmful material under the guise of free speech poses a significant threat to democracy, public order, and social cohesion.

In its written submission to the court, the government clarified that the right to freedom of speech under Article 19(1)(a) of the Constitution is not absolute. The Solicitor General stated that constitutional jurisprudence has consistently maintained a distinction between speech that promotes meaningful democratic discourse and speech that undermines societal peace and stability or infringes on individual rights. According to the submission, X was attempting to evade accountability by wrongly interpreting the 'safe harbour' clause as an unconditional right.

The safe harbour provision shields intermediaries from liability for third-party content, provided they follow due legal process and act on notices regarding unlawful content. The government emphasized that this provision is neither a blanket protection nor a constitutional guarantee; it is a statutory privilege granted on specific terms. If platforms fail to comply with their statutory obligations, they forfeit this privilege.

X had earlier approached the Karnataka High Court seeking to restrain Indian authorities from taking coercive action against it, arguing that blocking orders were being issued in violation of the IT Act and Article 14 of the Constitution, which guarantees equality before the law. However, the Centre responded that such platforms cannot assert rights while ignoring their duties and responsibilities, especially when their algorithms actively curate and promote certain types of content.

The government warned that the rapid spread of unlawful information on social media could seriously disrupt democratic institutions. With the ability to instantly reach millions across languages and geographies, platforms like X hold immense influence over public discourse and perception. This influence, the Centre said, brings with it a responsibility that is different from traditional media, and thus, requires a tailored regulatory approach.

The submission also noted that intermediaries use amplification technologies to increase the visibility of content. Algorithms are not neutral tools; they shape narratives, boost polarising material, and can influence public sentiment. The government argued that such an active role in content curation demands stricter oversight and a higher standard of accountability.

Additionally, the Centre pointed out that when social media platforms claim safe harbour, they must demonstrate adherence to due diligence standards. This includes timely removal of content flagged as illegal by the appropriate legal authorities. The government further criticized the idea of treating safe harbour as an automatic entitlement, highlighting that the framework was designed to encourage responsible content moderation, not blanket immunity.

The implications of this case go beyond one company or one platform. The government’s stance could influence how digital platforms function in India going forward. In a global context, it echoes discussions in other jurisdictions about limiting the scope of legal protections offered to social media companies, especially when they fail to curb the spread of hate speech, disinformation, or unlawful material.

According to the government's argument, X's position in court attempts to separate platform responsibility from platform functionality, which is fundamentally flawed. The act of selecting, promoting, or suppressing content via algorithms is an editorial decision that should attract a corresponding legal duty. The government highlighted that the influence of such platforms, especially when weaponised for misinformation or incitement, requires a framework in which immunity cannot be used as a shield.

The Centre reiterated its commitment to free expression and emphasized that its intention is not to stifle speech but to ensure that rights are balanced with responsibilities. It reminded the court that freedom of speech does not extend to encouraging violence, communal unrest, or criminal activity, and that no platform should be allowed to become a medium for such harm.

If upheld, the government’s position could redefine how intermediary protections are interpreted and enforced in India. It sets the stage for more robust laws on content moderation and intermediary liability in the evolving landscape of digital communications. This case is likely to become a landmark in India’s digital regulatory framework, shaping how accountability is shared between users, platforms, and the state. As technology continues to reshape how people communicate, the law too must evolve to protect both fundamental rights and societal peace.
