[VIDEO] According to our information, the founder and CEO of the secure messaging app Telegram was arrested this Saturday evening at Le Bourget airport. Pavel Durov, a 39-year-old French-Russian national, was accompanied by his bodyguard and a woman. - TF1/LCI EXCLUSIVE: the founder and CEO of the Telegram messaging app arrested in France (Police, justice and crime) - TF1 INFO
Yep. The issue is that they put out a tool that does some good things, but is also heavily adopted by criminals who piggyback on it.
Should we let child abuse just proliferate with these tools, because there’s so much need for privacy? How do you weed out the bad without kneecapping the good? There’s no good answer here. The good parts of the tech working enable the bad parts, too.
There has to be a certain level of knowledge and acceptance of the bad parts to continue developing it. It’s a catch-22, so law enforcement has to pick between sacrificing the privacy or allowing a tool to exist that proliferates child abuse material and other ills.
There are valid arguments for the importance of privacy, and valid arguments for making sure these crimes don’t have a safe haven. Action to either end will hurt some people and enrage others.
The standard I recall being established back in the nineties as to whether strong encryption was even legal in the US was “substantial non-infringing use” or similar. It’s been a while.
The problem with key escrow or anything similar is that any prescribed circumvention is also available to the “bad guys”.
I think Telegram’s stance would be that they can’t moderate because of strong end-to-end encryption. Back in the day the parallel would have been made to the phone system or mail.
Of course this is all happening in France, so I have no idea what the combination of French and EU laws will have on this, but I would still broadly expect that if a parallel can be made to mail or phone, Telegram would be in the clear. The phone company and mail service have no expectation of content moderation.
The huge difference between mail or phone and Telegram is that both mail and phone cooperate with law enforcement, with useful records being made available upon subpoena. Telegram, by design, will not.
If you think drawing that parallel is useful to Telegram, they would then also be required to maintain the same standards of security as the mail, with package inspections, drug dogs, entire teams of government officials investigating illegal activities etc.
The criminals use it precisely because it is not a parallel to other available channels, as it circumvents those safeguards.
I mean yes, but that’s like saying Bitcoin is used by criminals to buy drugs and weapons. The problem is that’s not its only use.
Wait till you hear about the idiots who unironically make that argument for banning Bitcoin too
Bitcoin is a bad example, since it’s not designed as a private currency. Monero/XMR is actually usable.
I guess we’ll see.
Removed by mod
That’s like saying Voat isn’t only used by incel trolls who got banned from reddit