Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeart Podcasts, and how the tech are you? So, on Saturday, August twenty fourth, twenty twenty four, French law enforcement officials arrested a man named
(00:29):
Pavel Durov shortly after his private plane landed at an
airport outside of Paris. So why did they do that? Well,
Durov is the CEO of the company Telegram, a communications
and file sharing app. Pavel and his brother Nikolai founded
Telegram partly as a way to allow users to communicate
(00:51):
with one another without fear of censorship. But this freedom
also means that some people use Telegram to either commit
or facilitate illegal activities, and that's what led to Pavel
Durov's arrest. Now I should probably put this in right here,
even though it is a spoiler. Durov has since been released from police custody, but will still have to appear
(01:13):
in court, which, as I'm recording this, has not yet happened.
By the time you hear this, maybe it has. Now, I thought I would give a history of Telegram and some information on the charges that Pavel faces, as well
as more information about what Telegram itself does. But to
do that, we first have to talk about a
predecessor to Telegram. It is a social media platform called
(01:38):
VKontakte. That's V-K-o-n-t-a-k-t-e, or VK for short, because this launched in Russia. So in two thousand and six, two years after Facebook, or rather the Facebook, launched, Pavel Durov created VK with Nikolai. So it featured many of
(02:00):
the same functions that you would find on platforms like
Facebook or MySpace back in the day. Users could create profiles,
they could connect with friends, and as I said, Pavel
lived in Russia at the time. He has citizenship in
many different countries at this point, including France, and the
Russian government has let's say, a little history with wanting
(02:23):
to control the flow of information. That's quite hard to
do on a platform made up of user generated content.
The information isn't coming from some central source. It's not
like a media company. It's being generated by the people
who are actually on the platform. In twenty eleven, some Russian citizens were taking to VK to protest recent parliamentary elections.
(02:48):
The Russian government didn't want anyone upsetting the borscht cart, so to speak, and so Durov received the polite request that perhaps he censor the posts that were protesting the elections, you know, or else. Durov declined to acquiesce to their request,
and so the Russian government decided to pull out all
(03:11):
the stops. Now those stops included using essentially propaganda to
heap aspersions upon both VK and Durov himself, but perhaps
the more effective tactic was they sent an armed police
presence to Durov's house to intimidate him into complying with
(03:31):
Russian demands. On top of that, the government allegedly demanded
that Durov hand over user information linked to accounts identified as
having posted anti government messages. Durov decided that he had
had enough of that kind of stuff, so ultimately he
would cash out and sell his interest in VK. This,
(03:52):
from what I can tell, happened around twenty fourteen when
he officially sold off his interest in the company, but
he was already thinking ahead to the next thing and
getting out of Dodge or Moscow, as it turns out,
much earlier than that, So he and his brother would
found Telegram back in twenty thirteen, so he still had
(04:16):
interests in VK at this point, but would sell them
off the following year. Now, another thing that happened in
twenty thirteen that I'm guessing influenced the Brothers was the
tale of Edward Snowden. So Snowden was a contractor for
the NSA here in America. Now, in the course of
Snowden's work, he learned about massive surveillance programs, including ones
(04:39):
that involved not just the United States, but its allies
collaborating with one another and sharing surveillance information between them.
And the scope and depth of these surveillance programs really
concerned Snowden, so he purposefully leaked information ultimately to the
public about them, and that got various governments very upset
(05:02):
at Snowden. To put it lightly, Snowden ultimately sought asylum
in Russia, where he was granted asylum, and in twenty
twenty two he received citizenship in Russia. While the US
government viewed Snowden's actions as a violation of several laws,
including the Espionage Act of nineteen seventeen, lots of other
people saw what he did as a heroic act of defiance,
(05:25):
uncovering questionable and disturbing practices across the world, some of
those practices subsequently being found unconstitutional and illegal here in
the United States, and he also helped drive home the
concern that your communications with the people in your life
might not be as private as you would like to think.
One other important change that had happened between the launch
(05:49):
of VK and the launch of Telegram was the rise
of the smartphone. Back in two thousand and six, consumer
smartphones weren't actually a thing yet. The only folks who toted smartphones around were like business executives and bleeding edge
technology enthusiasts. Everyone else was sporting a cell phone, you know.
(06:10):
Maybe it was a flip phone, maybe it was a
candy bar style phone. Maybe you were super cool and you had a Sidekick. I really, really wanted a Sidekick. Anyway, the point is, accessing the Web in general and social
media in particular typically meant you were on a desktop
or a laptop computer. But by twenty thirteen we were
(06:31):
well into a sea change in which more people were
relying on smartphones in order to access web content. Now,
it wouldn't really be until like early twenty fourteen that we started to see internet usage on mobile eclipsing desktop usage, here in the United States that is. It was different for different parts of the world. Here in
the US around twenty fourteen, that's when that started to happen,
(06:53):
where mobile phones were becoming the primary way that people
were interacting with content on the web, which meant web
designers were hectically trying to figure out how to optimize
pages for users on mobile devices, or to create mobile
specific apps to try and drive traffic to those. I've seen plenty. Heck, I recorded ads for the HowStuffWorks
(07:16):
mobile app way back in the day, because optimizing
a page so that it looks good no matter what
device you're on was tricky. Like it's easier today because
best practices have been created to handle that sort of thing.
But back in like the twenty tens era, that was
still a developing discipline, and so there were a lot
(07:39):
of outlets out there that created apps hoping that they
could drive people to use those apps and manage to
monetize that traffic. I never saw download figures on how
well the HowStuffWorks app did. If I had to guess, I would say it was fairly low. I thought it was a decent app, but not
(07:59):
equivalent to the web page experience. Anyway, the mobile landscape
was clearly becoming much more important, and so the Durovs
decided they would focus on a mobile centered application for Telegram,
at least initially. They set up their headquarters in the
United Arab Emirates, specifically in Dubai. So Durov chose this
(08:23):
because he said it was quote the best place for
a neutral platform like ours to be in if we
want to make sure we can defend our users' privacy
and freedom of speech end quote. So in other words,
Durov was confident that the UAE government wouldn't get so
handsy with Telegram as Russia had with VK. The company
(08:45):
would launch apps for iOS in August twenty thirteen and
Android in October of that same year. That was after
Telegram had held a contest for Android developers to essentially
port the code over to the Android environment. Nikolai developed
the initial mobile protocol for Telegram that included the encryption protocol.
(09:07):
Now I should add that not all messages are encrypted
on Telegram, and that's actually a key component to the
issues that Durov currently faces. So in fact, even a
one on one conversation inside Telegram is not encrypted by default.
You actually have to manually turn on encryption if you
(09:30):
want your communication to be encrypted. For those one on one conversations, you can have end to end encryption that protects
the message from prying eyes. And I've talked about encryption
many times before, but basically, from a very high level,
here's how it works. You have two users, and they
have a means of encrypting and decrypting messages so that
(09:51):
the information that's actually sent between the two appears to
be gibberish. An outside party snooping in on the conversation
would just see a string of apparently meaningless characters with
no real message inside of it, but the individual users
would get the decrypted raw text messages or files or
(10:12):
whatever it might be. End to end encryption has lots of legitimate,
important uses that can help people maintain secure and private
methods of communication, and law enforcement agencies often really don't
like it because it makes their jobs harder. I guess
it depends on how you look at their jobs, because
some people would say, oh, sure, law enforcement's job of
(10:34):
surveilling all innocent citizens and trying to look for crimes,
and others would say, oh, law enforcement's job of being
able to detect and then protect people from criminal activity.
I guess it really depends on your point of view.
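If you want to see that idea in something more concrete than my hand-waving, here is a minimal Python sketch of the general concept. To be clear, this is my own toy illustration using a single shared key and the third-party cryptography library; it is not Telegram's actual MTProto scheme, and the message text is made up.

```python
# Toy illustration of the end-to-end idea described above: both parties hold a
# shared secret key, and anything traveling between them is just ciphertext.
# This is NOT Telegram's MTProto; it uses the third-party "cryptography"
# package (pip install cryptography) purely as a stand-in.
from cryptography.fernet import Fernet, InvalidToken

# In a real end-to-end system the two devices negotiate this key themselves,
# so the platform in the middle never sees it.
shared_key = Fernet.generate_key()

sender = Fernet(shared_key)
ciphertext = sender.encrypt(b"meet at the usual spot at noon")

# This is all an eavesdropper, or the platform's own servers, would see in transit.
print(ciphertext)

# The receiver, holding the same key, recovers the original message.
receiver = Fernet(shared_key)
print(receiver.decrypt(ciphertext))

# Anyone without the right key gets an error, not the message.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Without the key, it's just gibberish.")
```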
But detecting illegal activities is a lot harder if all
the communications between the practitioners are scrambled. And there's been
(10:55):
a lot of pressure in different parts of the world
on different platforms to open up backdoor access to encrypted communications,
essentially to give law enforcement a universal key to decrypt
communications so that they can see if there's anything hinky
going on. Now, in many cases this really isn't possible
(11:16):
unless you grant access to the end points themselves, like
you somehow get access to the actual end devices where
the decryption is taking place, because good end to end
encryption means that even the platform that's offering it is
unable to break that encryption. Moreover, the design was such
that these encrypted chats would only exist on those end
(11:37):
devices because Telegram didn't store the encrypted communications in the cloud,
So even if someone were to compromise Telegram systems, they
wouldn't be able to access the encrypted communications stored there
because there weren't any stored there. Anyway, this protocol that
was primarily about encryption, was called MTProto, and Nikolai
(12:00):
authored the first version of it, and you can visit
the Telegram website to read up on how it works. Although,
to be fair, the version that Telegram uses now is
MTProto two point zero, and so it's different, right?
It's evolved considerably since the launch in twenty thirteen, and
it gets a bit complicated to talk about the details
(12:23):
in a podcast that doesn't have visual aids. Besides that,
even if this were a video podcast, I am no
expert in encryption, and so chances are I would end
up communicating something poorly or just outright incorrectly if I
were to really tackle it. The important bit is that
MTProto was one of the early building blocks Nikolai made for Telegram, and they also chose to make
(12:45):
this an open protocol, meaning the entire community could review
and examine how the protocol worked. This was meant to
eliminate trust issues. Like, the idea is to remove trust from being a concern; there's no need to just trust Telegram.
You can look into all of this and make sure that things are run the way the app claims,
(13:06):
so users could see exactly how the protocol was handling encryption and verify that communications were secure. Right, like, no copies
of that communication would be going to Telegram itself, et cetera.
All right, We've got a lot more to talk about
when it comes to Telegram and its features. Before we
get to that, let's take a quick break to thank
our sponsors. So we're back. We talked a bit about
(13:37):
encryption with Telegram. Another feature that the app would offer
early on was a self destruct option for messages. That is,
once a message had been received (and later this was improved so that it was once it had been read), it would self delete after a given amount of time. So
(13:57):
early on the self destruct was literally like within a
certain amount of time since the message was received. Problem is,
not everyone reads messages just when they get them, right.
You could be in a situation where you can't. Maybe
you're on a flight or something and you haven't had
your phone connected to Wi Fi and you're not connected
(14:17):
to the internet, so you could receive the message through Telegram,
but you haven't had a chance to read it yet,
and then meanwhile the timer is ticking down for when
it's going to self delete. Later on, they did change this so that it was after you had opened the message that the timer would start. So this was
kind of like the original concept behind Snapchat, in which
(14:41):
a user could send an image and after a given
amount of time, that image would go poof in the
receiver's app, except, of course, in the case of Snapchat,
copies of those images could still exist on Snapchat's cloud servers. That was a whole thing. Telegram was saying, Okay, well, that's not going to be the case with us. Like,
the messages are only going to exist on the end devices,
(15:02):
not in the cloud. So once they delete, that's it. We don't have a copy of it; they're gone.
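For a concrete picture of that read-triggered countdown, here is a tiny Python sketch. It's purely my own toy example of the mechanism, not Telegram's actual code, and the class and field names are invented for illustration.

```python
# Toy sketch of a "self-destruct after read" message, just to illustrate the
# mechanism: the countdown starts when the message is opened, not when it
# arrives. This is not Telegram's implementation; names here are made up.
import threading

class DisappearingMessage:
    def __init__(self, text: str, ttl_seconds: float):
        self.text = text
        self.ttl_seconds = ttl_seconds
        self._timer = None

    def read(self):
        """Return the text and start the deletion countdown on first read."""
        if self._timer is None and self.text is not None:
            self._timer = threading.Timer(self.ttl_seconds, self._destroy)
            self._timer.start()
        return self.text

    def _destroy(self):
        # In the later versions described above, both the sender's and the
        # receiver's copies would be wiped at this point, not just one side.
        self.text = None

msg = DisappearingMessage("this will vanish", ttl_seconds=5)
print(msg.read())  # timer starts here, at read time, not at delivery time
```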
Also, originally, the self delete feature only worked for the receiver's side,
so the person who sent the message would still be
able to see the message after it had been deleted
off the second party's device. Eventually, Telegram would change that
(15:26):
as well, where the message would get deleted off all
the devices, both of the devices, because this only works
on end to end, or user to user, communications,
so one on one situations. But yeah, originally it was
just the person who received it. Their message would get deleted. Clearly,
if you want to send let's say, a really sensitive
message and you don't want other people to snoop in
(15:49):
and see this. Having it self delete not just on the device of the person who received the message, but also on your device, would be really important. I mean, what if someone got hold of your phone and then thumbed through it to try and see what kind of messaging you'd been up to? If
you are in a country that has a really authoritarian
government that is just looking for a reason to throw
(16:12):
you in the hoosegow. Yeah, you want to have a
messaging app that's going to clear your history so that
you don't have to worry about getting hauled away for
violating some authoritarian rule. Now Telegram would allow for more
than just one on one communications, though these other methods
would not enjoy the benefit of end to end encryption,
(16:36):
which again becomes part of the problem for Durov. Now,
so you could have a session in which one user
was posting to many other users. This would be kind
of a broadcast approach, one person broadcasting to many I
like to think of this kind of similar to ways
that Twitter, now X, works, or Threads or something like that,
(16:57):
where you can post a general message and it goes
out to anyone who can follow you, or sometimes
if you're posting to the public, anybody at all. You
could have a session in which you know, you have
a group chat, potentially a really massive group chat. Telegram
is said to be able to accommodate up to two
hundred thousand users in a single chat session. I have
(17:22):
no idea how you would be able to parse such
a thing or even just keep up with the chat. Now, I've been in YouTube chat rooms where there were, you know,
around one thousand people watching something, and obviously only a
slice of that population is even bothering chatting while they're watching,
and even in that case, keeping up with what's going
(17:44):
on is really challenging to do, so I don't know
how you do it in a Telegram chat room. To
be fair, I've also never used Telegram personally, so I
don't have any real experience with this app on a
user level. Telegram also allows for file transfers between users, which,
along with the encryption and messages that self delete after
a given amount of time, make up a large part
(18:06):
of the reasons that authorities have been concerned about this app,
because anytime you have ways for people to share information,
there's a concern that people are going to do that
in order to further nefarious goals. By December twenty thirteen,
the Telegram community had plugged one quote unquote significant vulnerability
(18:27):
in MTProto as part of the first Telegram crypto contest.
That's according to Telegram's own timeline of how things evolved
in the app. The person who discovered the vulnerability received
a one hundred thousand dollars bug bounty for doing so,
so there was an incentive to help improve the protocols
(18:49):
and to look for things like vulnerabilities, and overall this
would benefit the entire community, So it was an effective
way to improve the product. You make your own community QA testers, in a way. The following month, Telegram offered
the feature of file transfers, with an initial file size
limit of one and a half gigabytes. That's a fairly
(19:12):
hefty file size, and Telegram did not restrict the types
of documents that could be transferred across its service, so
whether it was a JPEG or a doc file or
a PDF or whatever it might be, you could send
it as long as it wasn't larger
than one and a half gigabytes. Around that same time,
developers created a client for PCs, so this expanded Telegram
(19:36):
beyond the smartphone environment. This would eventually evolve into the
Telegram desktop app. In February twenty fourteen, developers created a
web based app for the service, and others made a
version of the app for Windows Phone. Do you remember
Windows Phone? I mean that was still a thing back
in twenty fourteen. In fact, it would remain a thing
(19:58):
for five more years. Microsoft officially ended support for Windows
Phone in twenty nineteen. Telegram was evolving rapidly, so the
next feature added to the app in March twenty fourteen
was support for voice messages. So now you've got file transfers,
you've got one on one communication, you have one to
(20:18):
many in the broadcast channels, you've got chat, you've got
voice messaging. At this point, the app also changed how it handled secret messages. This is when we got the change for self destruct. So again, earlier self destruct
only affected the receiving device, but now, starting in March
twenty fourteen, the messages would disappear from both the sender's
(20:39):
device and the receiver's devices, so you no longer had
a trail of these messages. To list all the feature
upgrades month by month would become really tedious. And as I said, if you go to Telegram's timeline of the evolution of the app, you can actually read all about this. Weirdly,
(21:02):
it's listed in reverse chronological order. I guess that
makes sense if you just want to know what the
most recent additions are to the app, But if you're
looking at it from a historical perspective, it's very weird
to start from the most recent and work your way backward,
because it just means that as you go on, the
app gets fewer features over time because you're going back
(21:23):
in time. I actually read it in reverse order. Anyway, there's no reason to go through all of them. It
would just get very tedious. You did see a lot
more features get added over time. The app just would
get more and more robust every year, gaining support for everything from multiple file uploads where you could do several
(21:45):
files at a time, particularly photos, like you could choose multiple photos as opposed to doing them one at a time, to playing animated GIFs, you know, the standard stuff that you encounter in various
social platforms today. Later updates incorporated the ability to play
media files directly within the Telegram app. You know, previously
you would have to download the file and then play
(22:06):
it in some other media player app. Now that capability was built directly into Telegram itself. It also opened up
support for bots. These could be incorporated into chat rooms
to enhance the experience in various ways, though frequently not really for moderation (content moderation, I mean), and that is another issue that we'll talk about in
(22:28):
this episode.
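Since I just mentioned bots: for the curious, here is a bare-bones sketch of what a Telegram bot can look like, using the public Bot API over plain HTTPS. This is only an illustrative echo bot under the assumption that you've gotten a token from BotFather; it's not a moderation bot and not anything Telegram itself ships, and the token shown is a placeholder.

```python
# Minimal sketch of a Telegram bot using the public Bot API over HTTPS,
# assuming the third-party "requests" package and a bot token from BotFather.
# It simply echoes back whatever text it receives.
import time
import requests

TOKEN = "123456:ABC-your-bot-token-here"  # placeholder, not a real token
API = f"https://api.telegram.org/bot{TOKEN}"

def run_echo_bot():
    offset = None
    while True:
        # Long-poll Telegram for new messages sent to the bot.
        resp = requests.get(f"{API}/getUpdates",
                            params={"timeout": 30, "offset": offset}).json()
        for update in resp.get("result", []):
            offset = update["update_id"] + 1
            message = update.get("message")
            if message and "text" in message:
                # Reply in the same chat with the text we received.
                requests.post(f"{API}/sendMessage",
                              json={"chat_id": message["chat"]["id"],
                                    "text": f"You said: {message['text']}"})
        time.sleep(1)

if __name__ == "__main__":
    run_echo_bot()
```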
So the list of features has grown now to a point where it would take an entire episode
to cover all the things that Telegram facilitates today. It
encompasses everything from allowing payments through the app for physical
and digital goods, to an in app currency called Stars
that users can use to reward other accounts for stuff.
(22:48):
It's kind of like the Bits you would find on Twitch,
that kind of stuff. Okay, So that's all fine. Telegram
has evolved since it first hit the scene in mid
twenty thirteen. You would expect that, right? It has many of the same features that you find in other platforms, while also being largely free from traditional advertising and, according to
(23:09):
the Durovs, from data mining. Right? Like, if you're on WhatsApp,
which is owned by Meta, you know that Meta is
scraping a lot of data from you. Because that's what
they do with all their platforms. That's their bread and butter,
right like whether it's on Facebook or Instagram or WhatsApp,
that ends up being really valuable information when Facebook is
(23:31):
looking for ways to sell targeted advertising with you in mind.
So the original concept behind Telegram was that it wouldn't
be profit oriented at all. It was not organized as
a not for profit organization, however (that's important to remember), but the concept was that profit would not be a driving consideration for the service. The Durovs were determined
(23:55):
to create a useful communications platform free from external interference,
which included not just governments, but things like corporations. But
Telegram does earn revenue. Primarily, it does this through in app purchases, so rather than settle for the bog standard experience,
a user can shell out money to access premium features
(24:16):
like customization tools and such. This can include stuff like
stickers that can be used in chat rooms or themes
for chat spaces, that kind of thing. Users can even
design their own stickers and offer them up for sale
in a digital marketplace, with Telegram getting a cut of
the action, so they take a little percentage of each
sale done through that method. Telegram also does work with
(24:40):
businesses in ways that allow for other revenue generation methods.
I did say it also is largely free of advertising, but not totally free of advertising. There are ads on Telegram; however, they are limited to appearing within broadcast channels that have at least one thousand subscribers. What's more, these ads can only appear as text messages, and they have a
(25:02):
hard limit of one hundred and sixty characters. On top
of that, businesses can establish their own channels and their
own groups within Telegram, and that in turn is another
revenue stream for Telegram itself. But now let's talk about
the dark side of this application. So one consequence of
(25:22):
creating a platform that aims to be free from censorship
and government involvement is that people who wish to engage
in illegal activities will make use of those services in
order to further their own goals. Complicating matters is that
you're talking about an app that is available around the world,
and what is legal in one nation may not be
(25:43):
legal in another. Plus, Durov himself has citizenship in multiple countries,
including France. Further, Telegram's general policy in response to governments
demanding data is to tell those governments to pound sand. Really, it doesn't matter what the situation is, Telegram will say
(26:04):
it's not our policy to share that user information with you,
and we're not going to do it. Durov has said
that Telegram's commitment to privacy is more important than things
like our fear of how bad people could use Telegram.
He says privacy is more important than what someone might
use Telegram to do. So, when US lawmakers asked Telegram
(26:24):
to hand over information connected to people who were involved
or suspected of being involved in the insurrection on January sixth,
twenty twenty one, Telegram denied that request. They said, no,
that's against our policy. Now there are concerns that, you know,
things like terrorist cells are making use of Telegram in
order to communicate with each other. There are also cases
(26:47):
in which people are using Telegram to distribute everything from
pirated content to really serious issues like child pornography. And
that encryption, as I said earlier, really only works for
one on one communication, So for the case of things
like chat rooms or broadcast channels, as well as just
the default settings for user to user communications, there is
(27:11):
no encryption, which means any illegal activity that's happening across
those channels is potentially viewable. In fact, the only type of communication that actually uses encryption does so only if it's manually switched on, which means that Telegram isn't really an encrypted communication tool. It's just a communication app that happens to
(27:35):
have one end to end encryption feature that's only on
if you turn it on, and only for certain use cases.
That ends up becoming a massive problem for Durov, or
potentially a massive problem. I mean, it could turn out
that no charges are filed and nothing happens. But the
reason why he was detained by police in the first place,
(27:58):
you could argue, is because Telegram does not encrypt everything.
I'll explain more, but first let's take a quick break
to thank our sponsors. Okay, So the reason that it's
(28:21):
important that Telegram does not offer encryption across all methods
of communication on the app is that it means Telegram
potentially could view the stuff that happens on its own network.
It could be aware of the things that are transpiring
on Telegram, and lots of countries have rules in place
(28:43):
that state a platform is obligated to moderate the content
that happens on the platform itself. It's not responsible for
generating that content necessarily, but it is responsible for moderating it.
Here in the United States, we have rules that protect
platforms from being held accountable for the stuff that users
(29:03):
post to them. The infamous Section two thirty is about this.
So the thought behind all of this is that you
can't really blame a platform for something that someone, a user, does on the platform itself. The responsible party is
the person who did the illegal activity, not the platform
where that illegal activity happened. However, this protection only extends
(29:28):
so far. If a platform is unwilling or unable to
moderate content, to remove illegal content, to act when needed
to curtail illegal activity on the platform itself, then it
can see that protection get stripped away. So your protection
only lasts as long as you are accountable. So, as
(29:50):
an example, if someone were to upload a movie to
YouTube and they don't have the right to do this
right like they've taken a pirated copy of a film.
Let's say it's Big Trouble in Little China, arguably the
best movie ever made, and they've put Big Trouble in
Little China up on YouTube and they don't own the
rights to Big Trouble in Little China, and YouTube is
(30:11):
made aware of this (they're alerted, hey, someone has uploaded copyrighted material and they don't have the right to do it), then YouTube is obligated to take that video down, or take whatever action the copyright holder deems appropriate, or else YouTube
risks losing that protection we talked about. YouTube itself would
(30:32):
not be held responsible for the initial upload as long
as it did act accordingly once alerted to the infraction. Now,
Telegram largely doesn't police content on its platform. However, there
are exceptions. One big one is in matters that deal
with child abuse. The company relies on users to report
(30:52):
instances of content relating to child abuse and then takes action. So,
according to an article by Jordan Pearson of The Verge, Telegram claims to do this around one thousand times per day. Yikes. It is horrifying to think that child abuse material is that rampant to begin with. And of
course that just marks the instances where someone actually reported it,
(31:16):
so that's pretty horrifying. Complicating matters is that Telegram has been accused of only putting on a show when it comes to content moderation: that, rather than outright removing offending channels and material, Telegram simply just makes them hidden, so
they're not removed, they're just hidden from average users, which
means people who know where to go could still go
(31:38):
there and still engage in this activity, and that seems
to be a pretty big problem. It's kind of the
look-the-other-way approach, which, you can easily argue, is essentially facilitating and being complicit in illegal activity that's happening on those channels, and these policies are what put Durov on thin ice with authorities in France.
(32:00):
Telegram could technically take a much firmer stance with content moderation.
There's nothing stopping the company from doing so. The communications
are not encrypted in things like chat channels and broadcast
channels and most user to user communications unless they've manually
turned that setting on. So the fact that Telegram doesn't
(32:23):
appear to take this kind of action opens the possibility
for authorities to charge Durov and the company overall with
facilitating illegal activity. The act of not acting becomes the issue.
The French authorities have argued that Durov is complicit in
crimes that range from money laundering to the distribution of
abusive materials and everything in between. This is what Durov
(32:47):
is going to have to face when brought before a judge,
where he might possibly be indicted. By the time you
hear this episode, that decision may already have been made,
but as I record this, it is yet to happen.
Though he again was released from police custody. Now that's
not necessarily an indication of where things are going to go,
because the authorities had until today, which is Wednesday, August
(33:08):
twenty eighth, twenty twenty four, when I'm recording this, to
officially charge Durov or to let him go. They
couldn't hold him longer, not legally anyway. So if he
gets charged, that's up to the judge. But yeah, the
case is a really complicated one. So on the one hand,
I do believe there is a real need for systems
that allow for secure and private communication. There are people
(33:32):
all around the world whose lives could be in danger
if they do not have access to those kinds of tools,
and there are plenty of examples, including here in the
United States, where if your communications were open to surveillance,
then you could really suffer as a result of that,
even if you were not guilty of any crimes. I mean,
(33:54):
there were cases in the NSA of people who were allegedly spying on communications that had no illegal activity connected to them, but happened to belong to, say, an ex. Right, like an NSA contractor or agent was using
the tools of the agency to spy on people they
(34:15):
knew personally, or to look at things like let's say
someone is sending a nude photo of themselves to their
loved one or whatever. Being able to intercept that and
look at it. I mean, that happened a lot. And
you know, again that's not connected with illegal activity necessarily,
so there's no justification for intercepting that information and then
(34:38):
you know, saving it or looking at it or whatever it
may be. So there is a real need for ways
to communicate securely and privately. However, we're not really talking
about the secure component of Telegram in this case. We're talking about a platform on which Durov could conceivably be made aware of illegal activity going on across his platform,
(35:02):
which obligates him to take action. Failure to do so
indicates an element of complicity in those crimes. Now, if
everything were encrypted, then Durov would really be free and clear,
because true encryption would mean he would have no way
of knowing what is actually transpiring across the platform. The
people might make use of the platform to conduct illegal activities,
(35:25):
but that's beside the point, because people commit crimes all
the time on things like the road. Right, you don't
shut the road down. It's not the road's fault that
anyone did that, that's just where it happened. So if
everything were encrypted and there was no way to know
what anyone was doing on the platform, Durov would probably
have a really strong defense. But the fact that there
(35:48):
are all these methods that are not encrypted (in fact, only one method is encrypted), that's what really gives him potentially a huge problem, because you can make the argument, hey,
there's nothing stopping you from being aware of this illegal activity,
and the fact that you're not doing enough to curtail
that means you are complicit in that, and we're going
(36:09):
to hold you accountable. So that's kind of where he
finds himself today. The ferocity of authorities in this matter also raises concerns about security and privacy. Some experts that Pearson quotes in his article on The Verge say that really the matter is more about how much Durov knew about the illegal activity, not so much about
the private, secure communication aspect of Telegram. But still, we
(36:34):
do live in a post-NSA-PRISM world. We live
in a world where we are aware of the various
attempts to monitor communications, whether those communications are criminal or otherwise,
and we know that people have exploited those programs to
various degrees to the harm of innocent citizens. So seeing
(36:54):
authorities go after the CEO of a company that provides
that kind of communication, even though that's just one small
part of what Telegram does, it does raise concerns. It
makes you worry about surveillance states and this almost pathological
need to have access to all information just in the
(37:15):
case that something in that giant mass of data represents
illegal activity. It's kind of the concern about being presumed
guilty until proven innocent. That's kind of the opposite of,
at least how we like to think the American system goes.
Once in a while, it actually is true that people
(37:35):
are presumed innocent until proven guilty. That's nice when that happens,
but yeah, something like this, it gives the lie to
that right. The implication is that you're presumed to have
been guilty of something. It's just that something may not
yet be discoverable. Pretty dark stuff. But yeah, my personal
(37:58):
opinion doesn't really matter in this case. I'm curious
what other people's opinions are. But I feel I have
a complex reaction to this. I don't like the concept
of a platform allowing illegal content, particularly illegal content that
disproportionately hurts children, to continue to be able to do
(38:20):
that without repercussions. I find that to be really disturbing.
I appreciate the need for a place where free speech
can freely happen, but even free speech, at least here
in the United States, has its limitations. Free speech is
not meant to be absolutely free of consequence. Just means
the government can't dictate what you can and cannot say,
(38:44):
but there can be consequences to what you do say.
It's a fine line and it's complicated. Anyway, I hope that you learned something in this episode, that you learned more about what Telegram is and where it came from. Some of you out there may be Telegram users. I know that a lot of folks who use Telegram are in
(39:05):
other countries, in places like Iran and India, and these
are places where governments can be quite authoritarian in their
desire to control the flow of information. Also, I do
find it somewhat ironic that when it was announced that
Durov was arrested in France, some countries, notably Russia, expressed
(39:28):
condemnation for that, saying this is a strike against free speech,
which is rich coming from Russia. I mean, that's the
same country that Durov fled from after Russian authorities essentially
tried to seize control of VK, and Durov left Russia
to found Telegram in the UAE largely because of that.
(39:49):
And here you have Russia saying shame on you, France,
for arresting this guy who's a Russian citizen as well,
he still maintains Russian citizenship. And meanwhile, it's the same
country that caused Durov to flee in the first place.
So yeah, everything's politics. I guess that's what that boils
down to. That's a cheerful thought, you know what. I'm
just gonna leave that there, and I'm gonna go off
(40:11):
and I'm gonna have a snack, maybe I think a cupcake.
Gonna have a little cupcake to kind of soothe my
feelings on this matter. I hope all of you out
there are doing well, and I'll talk to you again
really soon. Tech Stuff is an iHeartRadio production. For more
(40:35):
podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or
wherever you listen to your favorite shows.