Pavel Durov’s arrest suggests that the law enforcement dragnet is being widened from private financial transactions to private speech.
The arrest of the Telegram CEO Pavel Durov in France this week is extremely significant. It confirms that we are deep into the second crypto war, where governments are systematically seeking to prosecute developers of digital encryption tools because encryption frustrates state surveillance and control. While the first crypto war in the 1990s was led by the United States, this one is led jointly by the European Union — now its own regulatory superpower.
Durov, a former Russian, now French citizen, was arrested in Paris on Saturday, and has now been indicted. You can read the French accusations here. They include complicity in drug possession and sale, fraud, child pornography and money laundering. These are extremely serious crimes — but note that the charge is complicity, not participation. The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users a “cryptology tool” unauthorised by French regulators.
Well, except Telegram isn’t a good tool for privacy.
There is no E2EE by default. End-to-end encryption is only available for 1:1 “secret chats” and is disabled by default. Telegram doesn’t disclose how its encryption is handled server-side, so there is no way to verify its (in)effectiveness. Telegram is also able to block channels from its end, so there is no privacy from the operator either.
Well, except Telegram isn’t a good tool for privacy.
That’s not the point. The hunting down of tools and their creators (and of our right to privacy) is the issue here. At least, imho.
It has nothing to do with privacy. Telegram is an old-school social network in that it doesn’t even require that you register to view content pages. It’s also a social network taken to the extreme of free speech absolutism, in that it doesn’t mind people talking openly about every kind of crime and using its tools to make the related services easier to obtain. All that with no encryption at all.
Free speech is good. Government regulated speech is bad.
free speech can be good. free speech can also be bad. overall, it’s more good than bad. however, society seems to agree that free speech has limits - you can’t defame someone, for example
free speech absolutism is fucking dumb; just like most other absolutist stances
this also isn’t even about free speech - this is about someone having access to information requested by investigators to solve crimes, and then refusing to give that information
This is pure nonsense.
Western governments hate Telegram because until now Telegram didn’t cooperate with Western intelligence services the way American social media companies do. Everything on Meta or Google gets fed into the NSA, but Telegram has been uncooperative.
This will likely change after Durov’s arrest, but it was nice while it lasted.
we don’t disagree about that: governments don’t like that telegram doesn’t cooperate; that’s not in dispute
where the disagreement comes is the part after. telegram (and indeed meta, google, etc) have that data at their disposal. when served with a legal notice to provide information to authorities or shut down illegal behaviour on their platforms, they comply - sometimes that’s a bad thing if the government is overreaching, but sometimes it’s also a good thing (in the case of CSAM and other serious crimes)
there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic
there are plenty of clear cut examples of where telegram should shut down channels - CSAM etc… that’s what this arrest was about; the rest is academic
Was it? The French authorities did not provide any convincing evidence, just accusations.
This will likely change after Durov’s arrest, but it was nice while it lasted.
Why use a tool that relies on the goodwill of the operator to secure your privacy? It’s foolish in the first place.
The operator of that tool tomorrow may not be the operator of today, and the operator of today can be compromised by blackmail, legally compelled (see OP), physically compelled, etc., to break that trust.
ANYONE who understood how telegram works and also felt it was a tool for privacy doesn’t really understand privacy in the digital age.
Quoting @possiblylinux127@lemmy.zip :
Other encrypted platforms: we have no data so we can’t turn over data
Telegram: we collect it all. No you can’t know who is posting child abuse content
And frankly, if they have knowledge of who is sharing CSAM, it’s entirely ethical for them to be compelled to share it.
But what about when it’s who is questioning their sexuality or gender? Or who is organizing a protest in a country that puts down protests and dissent violently? Or… Or… Or… There are so many examples where privacy IS important AND ethical, but in zero of those does it make sense to rely on the goodwill of the operator to safeguard that privacy.
ANYONE who understood how telegram works and also felt it was a tool for privacy doesn’t really understand privacy in the digital age.
Telegram is the most realistic option for breaking Meta’s monopoly. You might like Signal very much, but nobody uses it and the user experience is horrible.
That apparently applies to child abuse and CSAM
Questionable interpretation. Privacy doesn’t mean mathematically proven privacy. A changing booth in a store provides privacy but it’s only private because the store owner agreed to not monitor it (and in many cases is required by law not to monitor it).
Effectively what you and the original commenter are saying (collectively) is that mathematically proven privacy is the only privacy that matters for the Internet. Operators that do not mathematically provide privacy should just do whatever government officials ask them to do.
We only have the French government’s word to go off of right now. Maybe Telegram’s refusals are totally unreasonable but maybe they’re not.
A smarter route probably would’ve been to fight through the court system in France on a case by case level rather than ignore prosecutors (assuming the French narrative is the whole story). Still, I think this is all murkier than you’d like to think.
It’s a street, not a changing booth. Also, I’m familiar with every charge against Durov and I personally have seen the illegal content I talked about. If it’s so easily accessible to the public and persists for years, it has nothing to do with privacy and there is no moderation - though his words also underscore the latter.
Who said it’s a street? What makes it a street?
personally have seen the illegal content I talked about.
Did you seek it out? I and nobody I know personally, have ever encountered anything like what was described on that platform and I’ve been on it for years.
Was it the same “channel” or “group chat” that persisted for years?
What gives them the right or responsibility to moderate a group chat or channel more than say Signal or Threema? Just because their technical back end lets them?
I mean, by that argument Signal could do client-side scanning on everything (that’s enforcement at the platform level that fits their technical limitations; see the sketch below). Is that where we’re at? “If you can figure out how to violate privacy in the name of looking for illegal content, you should.”
Nothing Telegram offers is equivalent to the algorithmic feeds that require moderation on YouTube, Twitter, Instagram, or Facebook; everything has to be sought out.
Make no mistake, I’m not defending the content. The people who used the platform to share that content should be arrested. However, I’m not sure I agree with the moral dichotomy we’ve gotten ourselves into where e.g., the messenger is legally responsible for refusing service to people doing illegal activity.
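For concreteness, “client-side scanning” as mentioned in the comment above generally means the sending client checks outgoing content against a blocklist before any end-to-end encryption happens. A minimal, purely illustrative Python sketch follows; it uses crude exact-hash matching (real proposals typically use perceptual hashes), and the hash and function name below are placeholders, not from any real system:

```python
# Illustrative sketch only: the crudest form of client-side scanning, where the
# client hashes content and checks it against a known-bad list *before* E2EE,
# so the operator never sees plaintext but can still block known material.
import hashlib

# Placeholder value, not an entry from any real blocklist.
BLOCKLIST = {"d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26"}

def scan_before_send(payload: bytes) -> bool:
    """Return True if the payload should be blocked instead of encrypted and sent."""
    return hashlib.sha256(payload).hexdigest() in BLOCKLIST

print(scan_before_send(b"an ordinary message"))  # False: no match, the message goes out
```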
I won’t go into the specific channels so as not to promote them or what they do, but we can talk about one known example, which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data. They used two highly popular bots called H****a and the E** ** G**, which let anyone get everything the government and other social networks know about any citizen of Russia for about $1 to $5. They use the Telegram API and have been there for years. How do you moderate that? You don’t. You take it down as the illegal, privacy-violating, and doxing-enabling content that it is.
Edit: “Censored” the names of the bots, as I still don’t want to make them even easier to find.
which is how Bellingcat got to the FSB officers responsible for the poisoning of Navalny via their mobile phone call logs and airline ticket data
Was that a bad thing? I’ve never heard the name Bellingcat before, but it sounds like this would’ve been partially responsible for the reporting about the Navalny poisoning?
They used two highly popular bots called H****a and the E** ** G**, which let anyone get everything the government and other social networks know about any citizen of Russia for about $1 to $5.
Ultimately, that sounds like an issue the Russian government needs to fix. Telegram bots are also trivial to launch and duplicate so … actually detecting and shutting that down without it being a massive expensive money pit is difficult.
It’s easy to say “oh they’re hosting it, they should just take it down.”
https://www.washingtonpost.com/politics/2018/10/16/postal-service-preferred-shipper-drug-dealers/
Should the US federal government hold themselves liable for delivering illegal drugs via their own postal service? I mean there’s serious nuance in what’s reasonable liability for a carrier … and personally holding the CEO criminally liable is a pretty extreme instance of that.
Just because their technical back end lets them?
Yes. They can VERY CLEARLY SEE that the platform is being misused. Signal can’t. Signal is genuinely clueless as to what you do on their platform. If you’re going to promote your service as “privacy respecting” and not mean it, you better count on any world government getting on your ass for not taking down CSAM material. The difference between being ignorant and being irresponsible is ignoring the issue after you’ve been made aware of it.
Signal can very clearly see all the messages you send if they just add a bit of code.
Telegram is in the news often for public groups with lots of crime
“The news” is too vague a source to dispute.
I am going to quote myself here:
The issue I see with Telegram is that they retain a certain control over the content on their platform, as they have blocked channels in the past. That’s unlike for example Signal, which only acts as a carrier for the encrypted data.
If they have control over what people are able to share via their platform, the relevant laws should apply, imho.
I am going to quote myself here:
Allow me to quote myself too, then:
That’s not the point.
I do not disagree with your remarks (I do not use Telegram), I simply consider it’s not the point or that it should not be.
Obviously, laws should be enforced. What those laws are and how they are used to erode some stuff that were considered fundamental rights not so long ago is the sole issue, once again, im(v)ho ;)
It IS the point. If Telegram was designed and set up as a pure carrier of encrypted information, no one could/should fault them for how the service is used.
However, this is not the case, and they are able to monitor and control the content that is shared. This means they have a moral and legal responsibility to make sure the service is used in accordance with the law.
The point is that if you’re going to keep blackmail material, you have to share it with the government.
The easy answer is to stop keeping blackmail material.
Signal fans being edgy cool kids
Signal has its own issues. At least it has proper encryption
Yay, let’s all hate on the one crypto messenger that is independently verifiably secure.
Well, except Telegram isn’t a good tool for privacy.
If Telegram wasn’t good for privacy, Western governments would not be trying to shut it down.
E2EE is nice, but it doesn’t matter if the government can just seize or hack your phone. Much better to use non-Western social media and messaging apps.
This is a very bad faith argument. It relies on assuming that “western government bad” without any stated basis as to WHY you’re claiming this. Would you say Twitter is good for privacy too, then? It fits the same argument. The western governments are currently trying to shut it down; the EU has threatened to shut off access completely. Is Twitter good for privacy because the western governments are trying to shut it down? No. Twitter is absolutely awful for privacy. On the same card, so is Telegram. Telegram cannot be publicly audited. Their backend is closed source. You don’t know what they’re doing with your data. For all you know, they took your phone number and sold it to a bail bondsman for when they see you talking about doing crimes on their platform. They could’ve sold any data you gave them to anyone, and you wouldn’t be able to prove it because there’s no way for you to personally audit them. You know what you can audit? Signal, XMPP, Matrix, fuck, you could even audit OpenPGP over email. The argument you put forth is completely bad faith and is full of holes.
Did you miss the entire Snowden revelations? Western governments are hostile to online privacy and freedom.
If Telegram wasn’t good for privacy, Western governments would not be trying to shut it down.
They are not trying to shut down Telegram, they are trying to control it.
E2EE is nice, but doesn’t matter if the government can just sieze or hack your phone. Much better to use non-Western social media and messaging apps.
What kind of argument is this supposed to be? Governments can seize your phone anywhere … oh wait … lemmy.ml … yeah, I see…
They like to poke fun at the “west”, but Russia, China and others are all somehow worse. At least in most countries it is controversial to attack journalists and encryption.
The fuck does that even mean?
Oh youre on a DIFFERENT HOMESERVER than ME?
Literal brainlet behavior. “I am the chad and you are the little virgin soyjack and little virgin soyjack is on lemmy.ml”
In case you are serious: Lemmy.ml is known for being a tankie instance. So a nonsensical anti-west statement makes a lot more sense considering the instance the user chose.
If it were a good tool for privacy, Russia would be trying to shut it down the same way they did with Signal.
Russia tried for years to ban Telegram. They stopped after Telegram managed to keep itself alive by proxies.
they did ban it, and everyone still used it (Telegram was good at evading the bans back then, but eventually Roskomnadzor became decent at banning it), and then they unbanned it, whatever that means
Telegram’s “privacy” is fully based on people trusting them not to share their data - to which Telegram has full access - with anyone. Well, apart from the optional E2EE “secret chat” option with non-standard encryption methods that can only be used for one-on-one conversations. If it were an actual privacy app, like Signal, they could’ve cooperated with authorities without giving away chat contents and nobody would’ve been arrested. I’m a Telegram user myself and from a usability standpoint I really like it, but let’s be realistic here: for data safety I would pick another option.
Matrix has the same issue. Most publicly accessible channels are unencrypted. It’s all because of E2EE performance issues in big channels. It comes with a cost that isn’t needed for most people.
The Matrix spec is E2EE by default. Just because popular rooms turn it off does not mean Matrix is not encrypted. Frankly, if a room is public, why does it need E2EE? A fed could join a 1k+ room all the same, encryption or not, and just download the messages.
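For reference, per-room encryption in the Matrix spec is a concrete switch: a room becomes end-to-end encrypted once an m.room.encryption state event is set, and that is exactly the switch large public rooms often leave off. A minimal sketch of the event’s shape (field names and the Megolm algorithm identifier are from the Matrix spec; the variable name is just illustrative):

```python
# Sketch of the Matrix state event that turns on E2EE for a room (per the spec).
# Large public rooms frequently omit it, which is the situation discussed above.
room_encryption_state_event = {
    "type": "m.room.encryption",
    "state_key": "",
    "content": {
        # Megolm: the group-messaging ratchet Matrix standardises for room E2EE.
        "algorithm": "m.megolm.v1.aes-sha2",
    },
}
print(room_encryption_state_event["content"]["algorithm"])
```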
The same argument is valid for telegram
No, it isn’t. Telegram is not E2EE even though they claim to be a private messenger.
The crime is not responding to authorities when obviously illegal content such as CSAM is posted. Don’t let the right try to spin this as a free speech thing. It’s not.
Other encrypted platforms: we have no data so we can’t turn over data
Telegram: we collect it all. No you can’t know who is posting child abuse content
Wait, Telegram has collected it, knows about it, and ultimately condones it? Or is it more a case of willful ignorance and resistance to forced compliance?
It’s definitely not willful ignorance, given that they collect the data.
It’s clearly wrong. Matrix does have non-encrypted channels, and honestly most publicly accessible channels are non-encrypted. Do you consider Matrix to be in the same “bucket” as Telegram? In Matrix you can create encrypted channels, but they perform very badly with huge numbers of people, like 1000+.
Matrix isn’t a corpo shill singularity. It’s a large number of homeservers talking to each other. This is like saying the concept of email is the same as Telegram. You could target AOL, or Yahoo, ProtonMail, Google and on and on and on, but just saying “Matrix” isn’t correct for this argument.
Most Matrix users use the central matrix.org homeserver, so it’s a valid argument.
Doesn’t change the fact that Matrix is not a single company.
This.
We still don’t have a legal definition of “hate speech”. Yes, it’s “defined”, but it is what it is: you can’t find any international legal definition, and it’s left to the interpretation of judges. Don’t you consider that worrying?
As for crime, as far as I know, child abuse and sex content is taken down. Drugs are not - there are many countries with very lax drug policies.
I didn’t comment on hate speech. I commented on CSAM, which the sources I’ve read and listened to (podcasts) say Telegram pretty much never answered when contacted.
Well, I didn’t see child pornography on Telegram, but I did see sex channels being removed. Compare that to Instagram: I didn’t see this happening there. Minor soft pornography is flourishing on Instagram. CSAM or terrorism is always a case brought up to take some unpopular things down
CSAM or terrorism is always a case brought up to take some unpopular things down
I’ll concede this point.
I thought Telegram’s encryption was more or less non-existent? Am I missing something?
Removed by mod
It isn’t secure in the least. They have just been ignoring police worldwide.
that’s correct - the issue here is that he has full access to the information that investigators are requesting and is simply refusing to comply with requests
this isn’t shit like a conversation you had with a friend about weed - this is CSAM and drug trafficking
It would be easy to dismiss the headline’s claim because Telegram’s design makes it arguably not a privacy tool in the first place.
However, it is possible that this arrest was chosen in part for that reason, with the knowledge that privacy and cryptography advocates wouldn’t be so upset by the targeting of a tool that is already weak in those areas. This could be an early step in a plan to gradually normalize outlawing cryptographic tools, piece by piece. (Legislators and spy agencies have demonstrated that they want to do this, after all.) With such an approach, the people affected might not resist much until it’s too late, like boiling the proverbial frog.
Watching from the sidelines, it’s impossible to see the underlying motivations or where this is going. I just hope this doesn’t become case law for eventual use in criminalizing solid cryptography.
You’re thinking too far ahead. As someone who knows two people that worked for the Swiss government closely:
Don’t worry about it. The whole deep state idea is absolutely ridiculous.
There is no big plan to weaken encryption or anything. There was probably a single prosecutor working on a case involving Telegram who saw his chance and took it.
Seriously, you should be a lot more worried about Google or Meta than about western democracies.
Unless you live in russia/china/iran/yourFavouriteDictatorship, in which case forget whatever I just said. But if you live there, what’s happening in France isn’t a problem for you anymore, since your government does it anyway lol
But yeah, I’m getting a bit tired of the deep state conspiracies. He broke the law, that’s why he gets arrested, not because of some deep state conspiracy
What are you on about?
When legislation aiming to restrict people’s rights fails to pass, it is very common for legislators/governments to try again shortly thereafter, and then again, and again, until some version of it eventually does pass. With each revision, some wording might be replaced, or weak assurances added, or the most obvious targets changed to placate the loudest critics. It might be broken up in to several parts, to be proposed separately over time. But the overall goal remains the same. This practice is (part of) why vigilance and voting are so important in democracies.
There’s nothing “deep state” about it. It’s plainly visible, on the record, and easily verifiable.
As someone who knows two people that worked for the Swiss government closely
This is an appeal to authority (please look it up) and a laughably weak one at that.
There is no big plan to weaken encryption or anything.
You obviously have not been keeping up with events surrounding this topic over the past 30 years.
There is no big plan to weaken encryption or anything.
This may not be a symptom of such a plan, but there very much is such a plan.
Exporting PGP and similar “strong encryption” in the 90s was treated as exporting munitions by the DoD.
it was not until almost two decades later that the US began to move some of the most common encryption technologies off the Munitions List. Without these changes, it would have been virtually impossible to secure commercial transactions online, stifling the then-nascent internet economy.
More recently you can take your pick.
Governments DO NOT like people having encryption that isn’t backdoored. CSAM is literally the “but won’t someone think of the children” justification they use, and while the goals may be admirable in this case, the potential harm of succeeding in their quest to ban consumer-accessible strong encryption seems pretty obvious to me.
As a bonus - anyone remember Truecrypt?
The world is turning bad. Telegram is not really a private app, but it has one advantage: it tells all the govs that try to get data on its users to fuck off! Soon govs will forbid encryption so they can quietly watch our digital lives. He’s not complicit in these crimes, he’s just offering a tool that makes communication more secure and private, but sadly some bad actors use it as a way to do bad things…
Why do they have the data in the first place?
Your communications on Telegram are not end-to-end encrypted by default. You can have E2E-encrypted 1-on-1 conversations, but group chats are wide open for them to do anything with.
They had a hilarious argument where they claimed that the key to unlock your chats is stored on a different server than your chats, and therefore they cannot access them. A company that argues like that (“trust us”) isn’t trustworthy.
Signal has been audited over and over again by internationally respected cryptographers. They cannot decrypt your chats by design. No need for “trust us bro”.
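To make the “trust us” problem concrete, here is a minimal, purely illustrative sketch. It is not Telegram’s actual architecture or protocol (Fernet from the third-party cryptography package is just a stand-in cipher, and the server dictionaries are invented for the example); it only shows why “the key lives on a different server” is no privacy guarantee when the same operator controls both servers:

```python
# Purely illustrative, NOT Telegram's real design: it only shows why "the key
# lives on a different server" means little when one operator runs both servers.
from cryptography.fernet import Fernet  # third-party: pip install cryptography

# Both "servers" belong to the same operator.
key_server = {"chat_42": Fernet.generate_key()}
chat_server = {"chat_42": Fernet(key_server["chat_42"]).encrypt(b"user message")}

# The operator (or anyone who can compel the operator) simply joins the two halves:
plaintext = Fernet(key_server["chat_42"]).decrypt(chat_server["chat_42"])
print(plaintext)  # b'user message' -- encrypted at rest, yet readable on demand

# Contrast with true E2EE (e.g. Signal's design): decryption keys exist only on
# the users' devices, so there is nothing server-side for the operator to recombine.
```

The point is not the specific library; it is that whoever holds the keys can read the data, no matter how many servers it is spread across.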
Yeah, this is true and I don’t recommend Telegram in any case, but it’s sad that a guy who tries to protect our privacy a bit gets arrested.
All the governments?
I remember them responding to a couple of antipiracy lawsuits in… India, I think? They also make an exception for ISIS-related channels. But mostly all, yes.
More likely they will just dissolve as an organization. They are hated by all at this point
Chris Berg is a professor of economics at the RMIT Blockchain Innovation Hub.
Thanks, here is more information about Crikey:
Crikey is an independent Australian source for news, investigations, analysis and opinion focusing on politics, media, economics, health, international affairs, the climate, business, society and culture. We are guided by a deceptively simple, old idea: tell the truth and shame the devil.
Chris Berg is a professor of economics at the RMIT Blockchain Innovation Hub.
Worthless opinion piece is worthless.
deleted by creator
Honestly this could go two ways. I really hope people move to more secure platforms, but it is possible they’ll find something equally problematic.
deleted by creator
Telegram is not a privacy tool.
I mean, if he’s convicted over a privacy tool while it’s not actually a privacy tool, we have a bit of ambiguity.
Arguably, advertising something that is not a privacy tool as one is fraud. Maybe even phishing, since TG the company has all of its users’ chat history in plaintext.
And this
The meaning of that word “complicity” seems to be revealed by the last three charges: Telegram has been providing users a “cryptology tool” unauthorised by French regulators.
in non-libertarian language means something similar, that is, that something not confirmed to be a privacy tool is being provided as a privacy tool.
I am a libertarian, but in this case they are consistent, if I’m reading this correctly. They are not abusing power, they are doing exactly what they are claiming to be doing.
Also, maybe I’m just tired of Telegram. It’s engaging, and I have AuDHD, which means lots of energy spent, and I can’t drop it completely because of work, and also some small communities are only available as TG channels. It would be wonderful if they moved at least to WhatsApp, but it is what it is.
Still, the ability to easily create a blog (which is what a TG channel really is for its users) that’s reachable without bullshit is a niche in huge demand. LJ filled it at some point, Facebook did at another, TG does now.
Something like this is desperately needed. I’d say the solution should be complementary to Signal - that is, DMs and small groups should not be its thing. Neither should the privacy of huge chats and channels - they’d be public anyway. However, anonymity with means to counter spam should be, and so should the privacy of metadata about user activity.
Where are my piracy groups going to go now?
I’ll miss their 2gb upload cap
I believe they’ve always been on torrenting websites and archive.org for older media
Torrenting websites… you mean forums?..ah fuck…
Yeah I did, I’ve been up a bit too long. I would think anyone on Telegram doing piracy would move to Signal, XMPP or Matrix at this point though. (That said, there are still thousands of redditors just doing it out in the open, so maybe not.)
Unfortunately no. Most are still on Telegram. There are a few Matrix groups I found, but the number of users is a drop in the bucket compared to Telegram groups.
In all fairness Telegram has unencrypted user data and messages but didn’t turn it over to the authorities. They also allow known criminal activity to thrive.
They also allow known criminal activity to thrive.
Most scammers I have seen are operating out of Facebook or Instagram.
What is “most scammers?”
That’s not a useful metric. What is a “scammer”? Also, it is probably better to look at scammers per capita.
It is very important to mention that you mean end-to-end encryption. The data is stored encrypted when using cloud chat. Nothing (besides the phone number, as far as I know) is stored in plain text on Telegram’s servers.
I am not defending Telegram. I am just stating facts.
Negative votes incoming in 3… 2… 1…
It is very important to mention that you mean end-to-end encryption. The data is stored encrypted when using cloud chat.
In response, it is very important to mention that point-to-point encryption and encryption at rest are next to meaningless with respect to the chat participants’ privacy. They might be relevant to the case against Durov, but they don’t protect against leaks or compromised servers. Please don’t rely on them for your safety.
It isn’t E2EE
Governments want to make it illegal to have privacy. Durov’s arrest was one of the many steps they are taking in that direction.
That might be true but in this case Telegram was hosting lots of CSAM and other illegal activity in public group chats.
Imagine you are the victim of sex abuse. Your nude images are on a public group chat and yet Telegram does nothing. There is no technical reason they couldn’t remove the images. They just don’t feel like it. What’s worse is that there are a lot of images of children.
https://en.m.wikipedia.org/wiki/Occam's_razor
educate yourself.