Pavel Durov, the CEO and founder of messaging app Telegram, was detained in Paris on Saturday as part of an ongoing French investigation into financial and cybercrimes. On Monday, French officials said he remains under arrest, although he has not been charged with any crime.
French President Emmanuel Macron denied that the arrest was politically motivated. Durov holds French and United Arab Emirates citizenship but is originally from Russia; France has strongly criticized Russia's invasion of Ukraine and imposed sanctions on its economy.
Details on what exactly led to the arrest are limited. However, according to French prosecutors, Durov is being detained as part of a larger French investigation. The New York Times reported that prosecutors said they were looking for an "unidentified person" who they believe committed an extensive list of crimes, apparently with the help of Telegram, including distribution of child sexual abuse materials, money laundering, and drug trafficking. The Washington Post reports that French police have indicated "child sex crimes" are an area of particular focus for investigators.
It is unclear what Durov's relationship, if any, is to the "unidentified person." Unless formally charged, Durov can only be held until Wednesday.
This is not the first time that Telegram has been linked to illegal activity. Telegram is a globally popular platform that offers both broadcast channels (which let users send text and media to large groups) and user-to-user chat. It also offers "secret chat" conversations that are end-to-end encrypted, meaning messages are readable only by the conversation participants; no one else, not even Telegram, can see their content.
That feature, along with other privacy features like self-deleting messages, makes the app extremely useful for political dissidents and for journalists working under repressive regimes or trying to protect sources. But over the years, the app has also become a place where extremists can radicalize users and organize terrorist attacks.
That has led to government pressure on Telegram to be more cooperative in sharing data with authorities. Even so, Telegram has largely managed to avoid dramatic legal encounters, until now.
Durov’s arrest is bringing new scrutiny to the app and reigniting the hotly debated issues of free speech and challenges to content moderation on social media.
Telegram and its content moderation problem
Durov and his brother Nikolai founded Telegram as an app focused on user privacy in the wake of Russia's "Snow Revolution" of 2011 and 2012, when blatant electoral fraud fueled months of protests, culminating in a harsh and escalating government crackdown. Before that, Durov had clashed with Russian authorities who wanted to suppress speech on VKontakte, the Facebook-like social network he founded.
In the years since its inception, Telegram has allegedly enabled some truly heinous crimes. Perhaps most notoriously, it was used to coordinate ISIS attacks in Paris and Berlin. Telegram cracked down on ISIS activity on the app after those attacks, but its content moderation policies have continued to face scrutiny.
As Vox points out, those policies are laxer than those of other social media platforms, and The Washington Post reported that Telegram has hosted a variety of incriminating content, including child pornography. Keeping this kind of material off a platform is a difficult, but not impossible, task, Alessandro Accorsi, a researcher at the International Crisis Group, told Vox.
"The effectiveness of content moderation depends largely on the platform and the resources allocated to security," Accorsi said. "Social media companies are generally reactive. They want to limit the financial resources devoted to moderation, as well as potential legal, political, and ethical headaches. So what usually happens is that they will focus their efforts on the few groups or issues for which inaction carries legal or reputational costs."
For example, when ISIS used the service to coordinate terrorist attacks, Telegram focused on stopping ISIS from using its products.
For communications that are not end-to-end encrypted, technology companies use a combination of algorithmic programs and human investigators to sort through content. The end-to-end encryption used in Telegram's "secret chats," however, makes this kind of moderation impossible.
Also complicating matters is the varied nature of internet laws around the world. In the United States, platforms are generally shielded from legal liability for what users post. But that is not universally so; many countries have much stricter legal frameworks around intermediary liability. France's SREN law is particularly strict and can impose fines on platforms for illegal content.
"It's a really hard thing to do, especially in a comparative context, because what's hateful or extreme or radical speech in some places like the United States is going to be different from Myanmar or Bangladesh or other countries," David Muchlinsky, professor of international affairs at Georgia Tech, told Vox. That makes content moderation "a clumsy tool at best."
Telegram has, in response to recent outside pressure, employed some content moderation, Accorsi told Vox. It has banned channels associated with a handful of organizations (most recently Hamas and far-right groups in the UK), but thousands of problematic groups remain.
France's investigation suggests that authorities believe Telegram is still not doing enough to prevent bad actors from using the platform to commit crimes.