Is Telegram Still Safe To Use, And Are Your Messages Really Private?

Telegram and Pavel Durov

Poetra.RH/Shutterstock

Pavel Durov, founder and CEO of communication platform Telegram, was recently arrested in France. The charges included non-compliance with law enforcement in an investigation into illegal activity on the app, including illicit drug trafficking. Durov was later granted bail, and he has finally spoken about the episode. The Telegram chief highlighted how the company has accepted outright bans rather than compromise on encryption, and took a similar approach when it was pressured to censor content.


Durov didn't go into specific details about the balance struck between privacy and security, but the platform recently made a quiet, yet alarming, change to how it handles moderation. Telegram has updated the wording on its FAQ page about reporting content for moderation, though the company argues that only the language has changed and the underlying systems remain the same. The page now says: "All Telegram apps have 'Report' buttons that let you flag illegal content for our moderators — in just a few taps."

This change leads to an obvious question: Can Telegram moderators now take a peek into your chats? The company said in a statement shared with The Verge that nothing has changed in the source code and that the app has always offered the option to report chats, even private conversations. "The FAQ change only made it clearer how to report content on Telegram," the spokesperson said.


Why moderation is a thorny topic

people using online communication

Oscar Wong/Getty Images

Your perception of moderation will differ depending on how you view it. On one hand, it lets authorities curtail the spread of illegal activity. On the other, as Durov notes while explaining the need to comply with local laws, authoritarian regimes can abuse moderation tools to clamp down on dissent, activism, and other legitimate needs for private communication. Either way, moderation comes with a compromise on what one would call "private communication." Signal, for example, has no content moderation. Signal founder Moxie Marlinspike has highlighted how the company's all-in approach to privacy and lack of moderation have even divided employees.


Telegram is the most popular messenger in urban Ukraine. After a decade of misleading marketing and press, most ppl there believe it’s an «encrypted app»

The reality is the opposite-TG is by default a cloud database w/ a plaintext copy of every msg everyone has ever sent/recvd. https://t.co/6eRGIyXyje

— Moxie Marlinspike (@moxie) February 25, 2022

The argument here is that Signal is fully committed to end-to-end encryption. That means the messages you send and receive are not accessible to anyone else, not even Signal. They are encrypted the moment they leave your phone and are only decrypted when they reach the recipient's phone.
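To make that concrete, here is a minimal sketch of the end-to-end idea using the PyNaCl library in Python. It is not Signal's actual protocol (which adds forward secrecy via the double ratchet, among other things), and the names `alice_private` and `bob_private` are purely illustrative; the point is simply that a relaying server never holds a decryption key.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# Anything in the middle (including the service's servers) only ever
# handles `ciphertext`, which looks like random bytes.

# Only Bob, holding his private key, can decrypt the message.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```

In this model, the server is just a courier for opaque blobs, which is why even a subpoena served on the operator can't yield readable message content.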

WhatsApp also relies on end-to-end encryption, though it has a reporting system: when a message is flagged as spam or risky, the last five messages in that thread are forwarded to its team of human moderators. The Meta-owned company also shares a limited amount of data with law enforcement agencies when needed. Yet, despite its controversial 2021 policy change around data sharing, the company can't read or decipher your conversations. Telegram's situation is trickier.


Not all Telegram chats use end-to-end encryption

Telegram CEO Pavel Durov

Manuel Blondeau – Corbis/Getty Images

By default, Telegram chats, whether one-on-one or group, are not end-to-end encrypted. You only get end-to-end encryption in the Secret Chat mode. When you start a secret chat, you don't get access to all the fancy features that come with regular, non-end-to-end-encrypted chats. Secret chats are also locked to the device they were started on, which means you can't see those messages on any other connected hardware, mobile or desktop. Telegram tries to sweeten the deal with extras like self-destruct timers and mandatory two-way message deletion.


For regular chats, Telegram relies on an encryption protocol it developed in-house, called MTProto. The company refers to these chats as cloud chats, which use client-server/server-client encryption. In a nutshell, your messages are encrypted when they leave your phone and travel to the server in that protected form. Technically, though, the server can decrypt them, unlike end-to-end encrypted Secret Chats, where only the two phones involved in a conversation hold the decryption keys.
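Telegram hasn't published a simplified reference for this, so here is a rough sketch of what the cloud-chat trust model boils down to, using Python's `cryptography` library rather than MTProto itself: the wire is protected, but the operator holds (or can reassemble) the key guarding the stored messages.

```python
from cryptography.fernet import Fernet

# In a cloud-chat model, the service operator holds the storage key.
server_key = Fernet.generate_key()
server = Fernet(server_key)

# A message arrives over an encrypted transport and is stored encrypted at rest...
stored = server.encrypt(b"my 'private' cloud chat message")

# ...but because the operator controls the key, it can also read it back.
print(server.decrypt(stored))  # b"my 'private' cloud chat message"
```

The encryption here is real, but the trust question shifts from "can anyone read this?" to "do I trust whoever runs the servers?"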

Telegram deems this approach necessary to make conversations available across all platforms, including the web. The company says it stores the data packets in a distributed fashion across data centers in different parts of the world. "The relevant decryption keys are split into parts and are never kept in the same place as the data they protect. As a result, several court orders from different jurisdictions are required to force us to give up any data," says the company.
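Telegram doesn't detail exactly how that key splitting works, but the general idea can be illustrated with simple XOR secret sharing, where a key is divided into pieces that are individually useless. The sketch below is only an illustration of the concept, not Telegram's implementation.

```python
import secrets

def split_key(key: bytes, shares: int = 3) -> list[bytes]:
    """Split a key into XOR shares; all shares are needed to rebuild it."""
    parts = [secrets.token_bytes(len(key)) for _ in range(shares - 1)]
    last = key
    for part in parts:
        last = bytes(a ^ b for a, b in zip(last, part))
    return parts + [last]

def combine(shares: list[bytes]) -> bytes:
    """XOR all shares together to recover the original key."""
    key = bytes(len(shares[0]))
    for share in shares:
        key = bytes(a ^ b for a, b in zip(key, share))
    return key

original = secrets.token_bytes(32)
pieces = split_key(original)        # store each piece in a different jurisdiction
assert combine(pieces) == original  # any single piece alone reveals nothing
```

The catch, critics note, is that the operator still controls every piece; the protection is legal and logistical rather than mathematical, unlike end-to-end encryption.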


The compromise with Telegram

Telegram and Pavel Durov

Thrive Studios/Shutterstock

Telegram's claims of privacy appear to instill a certain sense of safety, but security experts have repeatedly flagged the platform as a risk. For starters, there is no way to secure group chats with end-to-end encryption on Telegram. Moreover, if an app is promising safe communication, why not enable end-to-end encryption for all chats by default? Also, for Secret Chats to work, the person on the other end has to be online.


But it's the data security part that is crucial. "Telegram stores all your contacts, groups, media, and every message you've ever sent or received in plaintext on their servers. The app on your phone is just a 'view' onto their servers, where the data actually lives. Almost everything you see in the app, Telegram also sees," notes Signal founder Marlinspike.

The Grugq, an independent security researcher and a well-known figure in the zero-day trading community among both hackers and enterprises, has a much harsher take on Telegram. "The safest way to use Telegram would be not to," reads one of their well-known posts on the Underground Tradecraft blog. It certainly doesn't help that Telegram, which has close to a billion users, only has around 30 full-time employees.


"'Thirty engineers' means that there is no one to fight legal requests, there is no infrastructure for dealing with abuse and content moderation issues," Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, told TechCrunch. "If I was a threat actor, I would definitely consider this to be encouraging news."

Trust shouldn’t be optional

Telegram on a phone.

Alex Photo Stock/Shutterstock

Dr. Matthew Green, a cryptography expert and professor at Johns Hopkins, recently highlighted the functional and fundamental flaws in Telegram's encryption claims. "They basically made up a protocol," Green said in an interview with the Daily Dot. The Committee to Protect Journalists (CPJ) has also flagged "critical" flaws with trusting Telegram for secure communications, advising journalists to use WhatsApp or Signal instead. With privacy, you shouldn't be asked to trust one aspect of a platform and ignore another. Users shouldn't be burdened with the trust question at all; it should come as a default, and everything else, such as ads, features, and monetization, should be a secondary priority.


Telegram can toot its own horn about allowing massive broadcast channels and groups; in fact, the app is a social media universe of its own. All that activity means heaps of data as well as metadata. That much data, built atop an encryption policy that is not state of the art, is bad news, especially when a team of just over two dozen people is tasked with maintaining a platform of about 950 million users.

With the likes of Signal, you at least know the security infrastructure is trustworthy by default; you don't have to make a choice. On Telegram, you don't get that assurance without making functional sacrifices. It's safe, technically, but not without caveats. Signal and WhatsApp don't burden users with those caveats, at least not when it comes to the privacy of their personal communication.

