Twelve days after his arrest in France, Telegram CEO Pavel Durov has broken his silence with a 600-word statement that paints his legal troubles as a byproduct of the platform’s rapid growth and mounting regulatory pressures.
Durov, detained on allegations that his platform facilitated criminal activity, including the distribution of child sexual abuse material, defended himself by claiming that Telegram is experiencing “growing pains”—a phrase he used to underscore the challenges of policing an ever-expanding user base.
His statement comes as part of an ongoing clash between global tech platforms and regulators, particularly in Europe, where authorities have ramped up efforts to hold companies accountable for the illegal activities that occur on their platforms. Durov’s arrest is the latest episode in this struggle, as authorities seek to impose stricter accountability on tech CEOs.
While Durov expressed shock over the legal action, citing what he believes was a miscommunication between French authorities and Telegram, he also admitted that “policing Telegram has become harder” as the platform’s user base has surged to 950 million. He went on to emphasize his personal commitment to “significantly improving things in this regard.”
In his full statement, Durov detailed the context of his arrest, offering a three-pronged defense. First, he noted that Telegram has an official representative in the EU to handle law enforcement requests, and its contact information is publicly available. Second, he highlighted that French authorities had multiple ways to contact him, given his French citizenship and regular visits to the country. Lastly, he argued that holding a CEO responsible for third-party activities on a platform is “a misguided approach.”
Durov argued that charging him personally is an example of authorities overreaching.
“Building technology is hard enough as it is. No innovator will ever build new tools if they know they can be personally held responsible for potential abuse of those tools,” he said.
This friction between tech innovators and regulators over privacy and security is not new, and Durov’s case exemplifies the complexity of balancing user rights with law enforcement needs. The Telegram CEO acknowledged this challenge in his statement, explaining how difficult it is to “reconcile privacy laws with law enforcement requirements” while ensuring a platform operates consistently across various legal systems.
As Durov explained, Telegram has had to make difficult decisions in the past when confronted with demands from authoritarian regimes. He highlighted Telegram’s refusal to hand over “encryption keys” to Russian authorities, leading to the platform being banned in Russia, as well as its refusal to block channels of peaceful protesters in Iran, which resulted in a similar ban. He reiterated that Telegram’s goal is not profit-driven, but rather focused on “defending the basic rights of people, particularly in places where these rights are violated.”
However, Durov did not shy away from admitting Telegram’s shortcomings. He said: “Even the fact that authorities could be confused about where to send requests is something that we should improve.”
Yet he firmly rejected claims that Telegram is an “anarchic paradise” for criminals, stressing that the platform takes down millions of harmful posts and channels daily and maintains transparent processes, including daily transparency reports and direct hotlines with NGOs for urgent moderation requests.
Read His Full Statement Below:
Thanks everyone for your support and love!
Last month I got interviewed by police for 4 days after arriving in Paris. I was told I may be personally responsible for other people’s illegal use of Telegram because the French authorities didn’t receive responses from Telegram.
This was surprising for several reasons:
1. Telegram has an official representative in the EU that accepts and replies to EU requests. Its email address has been publicly available for anyone in the EU who googles “Telegram EU address for law enforcement”.
2. The French authorities had numerous ways to reach me to request assistance. As a French citizen, I was a frequent guest at the French consulate in Dubai. A while ago, when asked, I personally helped them establish a hotline with Telegram to deal with the threat of terrorism in France.
3. If a country is unhappy with an internet service, the established practice is to start legal action against the service itself. Using laws from the pre-smartphone era to charge a CEO with crimes committed by third parties on the platform he manages is a misguided approach. Building technology is hard enough as it is. No innovator will ever build new tools if they know they can be personally held responsible for the potential abuse of those tools.
Establishing the right balance between privacy and security is not easy. You have to reconcile privacy laws with law enforcement requirements, and local laws with EU laws. You have to take into account technological limitations. As a platform, you want your processes to be consistent globally, while also ensuring they are not abused in countries with weak rule of law. We’ve been committed to engaging with regulators to find the right balance. Yes, we stand by our principles: our experience is shaped by our mission to protect our users in authoritarian regimes. But we’ve always been open to dialogue.
Sometimes we can’t agree with a country’s regulator on the right balance between privacy and security. In those cases, we are ready to leave that country. We’ve done it many times. When Russia demanded we hand over “encryption keys” to enable surveillance, we refused — and Telegram got banned in Russia. When Iran demanded we block channels of peaceful protesters, we refused — and Telegram got banned in Iran. We are prepared to leave markets that aren’t compatible with our principles because we are not doing this for money. We are driven by the intention to bring good and defend the basic rights of people, particularly in places where these rights are violated.
All of that does not mean Telegram is perfect. Even the fact that authorities could be confused about where to send requests is something that we should improve. But the claims in some media that Telegram is some sort of anarchic paradise are absolutely untrue. We take down millions of harmful posts and channels every day. We publish daily transparency reports (like this or this). We have direct hotlines with NGOs to process urgent moderation requests faster.
However, we hear voices saying that it’s not enough. Telegram’s abrupt increase in user count to 950M caused growing pains that made it easier for criminals to abuse our platform. That’s why I made it my personal goal to ensure we significantly improve things in this regard. We’ve already started that process internally, and I will share more details on our progress with you very soon.
I hope that the events of August will result in making Telegram — and the social networking industry as a whole — safer and stronger. Thanks again for your love and memes.