Pavel Durov's Arrest & Free Speech: Legal Standpoint

The arrest of Pavel Durov, CEO of Telegram, has sent shockwaves through the social media ecosystem across the world. Durov was arrested in France on August 25, in connection with illegal activities conducted on his messaging platform.

Durov spent five days in custody and was released on bail of five million euros.

Through this conversation with Vikram Jeet Singh, Partner, BTG Advaya, and Gaurav Sahay, Practice Head (Technology & General Corporate), Fox Mandal & Associates, let us take a deep dive into the legal aspects of this arrest and its consequences for the social media world.

What legal precedents could this arrest set for other CEOs and platforms that prioritise free speech over content moderation?

Vikram Jeet Singh, BTGA: Given the arrest was made under French law, there is limited guidance for the Indian market. India has a standalone IT Act that provides for ‘intermediary safe harbour’ for online platforms, provided certain compliances are carried out. Similar protections can be found in other countries’ laws. The arrest in France indicates that certain crimes are treated much more seriously (this one was to do with a failure to crack down on child pornography, as per reports). At the same time, regulators may be running out of patience (and alternatives) for regulating the big platforms, short of applying criminal sanction.

Gaurav Sahay, Fox Mandal & Associates: The arrest of Telegram's founder by the French government could well set a significant legal precedent in the context of a trend towards greater accountability. Tech platforms currently operate under "safe harbour" provisions, which protect them from liability for user-generated content. This case could challenge those protections, particularly in jurisdictions where governments are pushing for more stringent content regulation.

The case could lead to a shift in regulatory approaches, with governments taking a more aggressive stance on content moderation. The arrest might also encourage other governments to take similar actions, creating a ripple effect resulting in tech companies re-evaluating their content policies and cooperation with law enforcement in different regions. This could have a chilling effect with CEOs becoming more cautious, potentially leading to more aggressive content moderation policies to avoid legal repercussions. This could spark debates about the balance between free speech and responsible content regulation.

To what extent is Telegram responsible for the illegal activities conducted on its platform, given its content moderation policies?

Vikram Jeet Singh, BTGA: All online intermediaries are protected from liability for the third-party content they host, provided they abide by due diligence rules and content moderation guidelines. A failure to comply with such measures makes the platform liable for the content as if it had originated that content itself. The Telegram app is, at its core, a messaging app and not a content platform such as Meta or YouTube. Even so, it faces liability for illegal content that its users share on its platform, for its failure to prevent them from doing so.

Gaurav Sahay, Fox Mandal & Associates: Telegram’s responsibility for illegal activities conducted on its platform is a nuanced issue. While its current moderation policies provide some safeguards, the platform’s focus on privacy and encryption complicates its ability to actively prevent illegal activities. The extent of its responsibility can be assessed from its design and intent, reflected in features like end-to-end encryption in its "Secret Chats." This makes it difficult for the platform to monitor or moderate content in these chats, potentially limiting Telegram's direct responsibility for the illegal activities conducted on the platform. Telegram allows users to create public channels and groups, which can reach large audiences. The platform does implement some moderation, such as removing terrorist-related content or illegal activities when they are reported. However, the scale of Telegram’s operations makes it difficult to actively monitor all content, which is seen as a failure to take sufficient preventive measures. As a consequence, Telegram has faced bans or legal actions in countries like Russia and Iran for not complying with local regulations. Telegram walks a fine line between adhering to local laws and maintaining its commitment to privacy and free speech. Its decisions to resist or comply with these laws could influence perceptions of its responsibility for illegal activities conducted on the platform.

How does Telegram's content moderation compare to other messaging apps like WhatsApp and Signal, particularly in terms of preventing child pornography?

Vikram Jeet Singh, BTGA: Difficult to answer in brief, without a deep dive into policy terms. It appears that Telegram did not comply with child protection norms. Reports indicate that this failure to co-operate with law enforcement, not only on child sexual content but also on drug trafficking and fraud, led to the investigation and arrest. Reportedly, Telegram has long operated as a low-moderation service, given its roots. A complicating factor is that Telegram claims to be ‘end to end encrypted’, which calls into question how content moderation can effectively work on it.

Gaurav Sahay, Fox Mandal & Associates: Telegram's content moderation practices differ from those of other messaging apps like WhatsApp and Signal, particularly in how they balance privacy, encryption, and the prevention of illegal activities such as child pornography and extremism.

Telegram offers end-to-end encryption only in its "Secret Chats." Regular chats and group chats are encrypted server-side, which means Telegram has access to the content if needed, but the platform's main focus on privacy limits its ability to proactively scan for illegal activities. WhatsApp provides end-to-end encryption by default for all chats, which means that neither WhatsApp nor any third party can access the content of the messages. This strong encryption makes it difficult for WhatsApp to monitor content for illegal activities directly, so WhatsApp relies heavily on user reports and metadata to detect potential issues. Signal is known for its stringent privacy policies and offers end-to-end encryption for all communications. Like WhatsApp, Signal cannot access the content of messages, making proactive moderation nearly impossible. Signal’s focus on minimal data collection further limits its ability to monitor illegal activities.

What changes, if any, could Telegram implement to strike a balance between privacy and security?

Vikram Jeet Singh, BTGA: The issue here is not so much free speech as crime prevention. There is no argument that child sexual content or fraudulent content like deepfakes are protected by free speech laws. There is consensus on regulating harmful content; the trick is to not err too much on the other side, such that it affects freedom of speech. (That is very much easier said than done, admittedly, as platforms have found out over the past decade.) One avenue open for intermediaries may be to engage with NGOs and industry groups to moderate harmful content.

Gaurav Sahay, Fox Mandal & Associates: There are several changes Telegram could consider implementing to enhance security and prevent illegal activities while maintaining its commitment to free speech:

Telegram could implement more intuitive and accessible reporting tools, introduce AI and machine learning algorithms to analyse behavioural patterns, and adopt stricter moderation policies for public channels and groups. Expanding partnerships could help Telegram access more comprehensive databases of illegal content and improve its ability to detect and remove such material. It could also regularly publish reports on content moderation actions, detailing how it handles illegal activities while protecting free speech. Without compromising encryption, Telegram could consider monitoring metadata to detect unusual patterns that might indicate illegal activity. Finally, introducing community-led moderation for larger groups and channels could help Telegram manage content more effectively.

How might this incident influence future government policies regarding digital privacy?

Vikram Jeet Singh, BTGA: Even considering that this is a specific case under French law, there are signs that regulators are running out of patience with platforms. The ‘collaborative’ approach, where platforms mostly ‘self-regulate’, has not worked over the past decade. New laws, such as the UK’s Online Safety Act 2023 and India’s proposed Digital India Act, are being framed with the intent to stringently regulate online speech and illegal activity. The pressure on platforms to comply will ramp up in the coming years, and the arrest of a platform’s management executive is an important precedent.

Gaurav Sahay, Fox Mandal & Associates: The incident could act as a catalyst for more stringent government policies on digital privacy and content moderation. Governments may push for greater access to encrypted communications, more proactive content moderation, and increased accountability for platform operators.

This could lead to calls for the implementation of backdoors in encrypted messaging apps and for limits on the anonymity such apps provide, requiring platforms to collect more user data or enforce stricter identity verification processes. Governments could mandate the implementation of proactive content moderation strategies or pre-emptive takedown measures. The incident could lead to policies that hold platform operators more accountable for illegal content on their platforms. This might include fines, criminal penalties, or other legal consequences for failing to remove or report illegal activities in a timely manner. Countries may seek to harmonize their regulations to ensure that platforms cannot exploit jurisdictional differences to avoid compliance, leading to the creation of international frameworks for content moderation and privacy protection. Safe harbour provisions, which protect platforms from liability for user-generated content, might come under scrutiny; such protections might become conditional on platforms implementing effective content moderation policies. Governments could require companies to demonstrate that they are taking active steps to prevent illegal activities in order to qualify for liability protections.

This incident could shift public opinion in favour of more stringent content moderation, particularly if illegal activities on platforms like Telegram are linked to real-world harm.

What are the potential long-term consequences for Telegram if it continues with its laissez-faire approach to content hosting?

Vikram Jeet Singh, BTGA: Telegram may continue to face liability, escalating into criminal sanction. It is important to remember that penal laws are always available to law enforcement as a tool, and it may be that this incident emboldens them to start using these tools more often. For example, the abetment of pornography is an offence that can be made out against platforms that do not take requisite content moderation measures. India’s new penal code also punishes anyone who “publishes false or misleading information, jeopardising the sovereignty, unity and integrity or security of India”. Laws around online behaviour will continue to tighten, and enforcement will ramp up.

Gaurav Sahay, Fox Mandal & Associates: If Telegram continues with its laissez-faire approach to content hosting, there could be several long-term consequences for the platform. These consequences could affect its user base and legal standing, and attract bans and restrictions. Telegram could face increasing legal actions, including fines and lawsuits, and could eventually be forced to implement stronger moderation practices in response to regulatory pressure. If Telegram becomes widely associated with illegal activities, companies and organizations may be reluctant to associate with it, leading to a loss of partnerships and advertising opportunities. This could impact the platform’s ability to generate revenue and sustain its operations.

Any other point you would like to highlight?

Vikram Jeet Singh, BTGA: With the caveat again that this is primarily a French law matter, in the future this arrest may be seen as a watershed in enforcing online liability laws against platforms. As noted, most other attempts at regulation have failed over the past 10 years. Now, law enforcement and regulators may feel they have no choice but to apply criminal laws to individuals running these platforms, to make enforcement effective. India tried a similar tack by requiring a local officer to be liable for compliance under its IT rules. It remains to be seen if this devolves into a cross-border prosecution of management figures.

Gaurav Sahay, Fox Mandal & Associates: We should also look at the incident from a geopolitical perspective. Surprisingly, Russia has raised a protest, referring to the CEO of Telegram as its citizen. In a world heavily polarised between the Western powers on one side and Russia and China on the other, the connection to the ongoing war in Ukraine, which has intensified lately, cannot be overlooked.

Krishnendra Joshi

BW Reporters: Krishnendra has six years of experience in content and copywriting. He realised the value of persuasive writing while working with LawSikho. Writing a few marketing emails taught him that the right wording creates the right impact. Reading The Boron Letters by advertising legend Gary Halbert inspired him to keep learning about the craft of writing. He does not restrict himself to legal content writing alone. He has written content for clients in the SaaS industry and the personal development industry. He believes in writing for multiple niches to enhance his creativity and train his writing muscle.
