
August 6, 2025 43 mins

Tea called itself a women’s safety app. Then, 4chan found it. Selfies, IDs, even private messages were left exposed after two massive data breaches. Tens of thousands of women were affected, but hundreds of thousands still want to sign up. We break down what went wrong, and what it says about the tech we trust to keep us safe. 

Dexter talks to journalists and 404 Media co-founders Emanuel Maiberg and Sam Cole about what happened, and what this could mean for women (and men!) whose information was exposed. 

Read + Watch: 

Emanuel’s coverage on the Tea data breach: 

Women Dating Safety App 'Tea' Breached, Users' IDs Posted to 4chan

A Second Tea Breach Reveals Users’ DMs About Abortions and Cheating

Tea User Files Class Action After Women’s Safety App Exposes Data

Sam’s coverage on ‘Are We Dating the Same Guy?’ Facebook groups: 

Man Who Sued 'Are We Dating the Same Guy?' Groups Files Class Action Lawsuit

Got something you’re curious about? Hit us up killswitch@kaleidoscope.nyc, or @killswitchpod on IG, or @dexdigi on IG or Bluesky.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:13):
My phone was ringing and I could see that it was going to, like, the Google Voice number, and it was this guy in Canada whose day job is, like, IT-adjacent, and he sounded pretty frazzled and in a hurry.

Speaker 2 (00:31):
Emanuel Maiberg is a journalist and a co-founder at 404 Media. Two weeks ago, he got a phone call. Someone really wanted to tell him about this viral app, but this wasn't a pitch.

Speaker 1 (00:45):
And he was like, oh my god, thank god you picked up. There's something going down on 4chan. There's like a big security breach. I think this is a big deal. You really need to look at this. And he was talking pretty fast, like I couldn't fully understand what he was saying, but he sent me an email with some links to 4chan, and as soon as I got there, I could tell that, A, it was

(01:08):
pretty bad, as he was saying, and B, it was already too late.

Speaker 2 (01:24):
From Kaleidoscope and iHeart Podcasts, this is kill switch.

Speaker 3 (01:29):
I'm Dexter Thomas, goodbye.

Speaker 2 (02:10):
A little while back, an app called Tea, like tea, the tea that you drink, started going viral, and pretty soon it was in the headlines. As the week went on, the headlines kept coming, but for all the wrong reasons. What is the Tea app and how does it work?

Speaker 1 (02:28):
The Tea app is calling itself a women's dating safety app. I think a good way to think about it is, if you've heard of this Facebook phenomenon of Facebook groups called Are We Dating the Same Guy?, where it's like women are getting together on Facebook in order to find

(02:50):
out if they're dating the same guy and if there are any other red flags.

Speaker 2 (02:53):
The reason that the Tea app went viral at first was because of what it was promising to provide: quote, dating safety tools that protect women. The idea was that women could ask other women if they'd had any bad or even dangerous experiences with a particular man, and that all this communication would happen on a quote secure, anonymous platform.

Speaker 1 (03:15):
When you sign up for Tea, the app wants to
verify that you're a woman, and the way that they
elected to do that is it asks women to post selfies.
The app uses this platform that Google owns where you
set up and deploy your mobile app, and they misconfigured
how that was used, and anyone could access a bunch

(03:37):
of data, including all those selfies, or like a large number of them, thousands. I think it was like seventy thousand or something. Selfies were available via this open Google Cloud compute instance.

Speaker 2 (03:53):
So, a quick simplified explanation of what's happening here. Tea was using an app development platform that lets you store data in a couple of different ways. You can set it to private and require authentication to access the data, or you can set it to public and not require any authentication at all. And this is what happened. Tea's

(04:13):
buckets were public, meaning that essentially anyone who knew the URL could access them. That spread on 4chan, and this is how those users discovered the vulnerability.

Speaker 1 (04:25):
People were already downloading those images and sharing them and making them available elsewhere, and we immediately contacted the app. It took them a few hours to shut down this access, but by then the information was all out there and reposted elsewhere on the internet. And 4chan, I think

(04:46):
I don't need to explain, is notoriously racist and sexist.
There was like this gleeful rummaging through the data and
sharing images of these women and saying how attractive or
unattractive they were, and trying to get as much data on them as possible, in an attempt to humiliate them

(05:09):
and humiliate women in general who use this app in
order to identify and avoid men they thought were dangerous,
and this was like a, haha,

Speaker 4 (05:25):
you thought that you would

Speaker 1 (05:27):
use this to do something against men. Now it blew up in your face and we're going to make your life hell, for even trying to use an app like

Speaker 2 (05:35):
this. The data was more than just out there. It was being used.

Speaker 1 (05:41):
There was a version of the app a long time
ago where rather than ask for a selfie, it asked
for users' IDs. So for some women, it wasn't just like
their picture, which you might be able to use to
find them in various ways, but some of them were like,
here's a picture of my ID, and here's my address,

(06:01):
and here's my full legal name and eye color and
all of that. And people were like, you know, we
can find them. It's just like a bunch of vile
language about women and trying to talk about it in
a way that might scare someone.

Speaker 2 (06:14):
How many people's data was exposed?

Speaker 1 (06:17):
So at this point, I would say conservatively seventy two
thousand users.

Speaker 2 (06:24):
The Tea app has said that seventy two thousand images were exposed; thirteen thousand of those were selfies and IDs.
This is identifying information that's accessible to anyone who knows
where to look. Essentially, the back end of this app
was left open.

Speaker 1 (06:40):
That's kind of the initial thing that people found. It
obviously requires technical expertise in order to find that out
and then use it, but not like a computer science
degree level of expertise. In fact, when this first hit

(07:00):
4chan, they weren't only talking about the data and
linking to it, they also offered detailed instructions on how
to access the data yourself.

Speaker 2 (07:11):
This obviously is very bad, but it gets worse because
a few days later they found another breach which takes
it to another level.

Speaker 1 (07:23):
A security researcher gets in touch with us and says, hey, I could still access a lot of Tea app data after Tea said they fixed the issue, and that is
true in the sense that they fixed the issue we
reported on, but he found a completely separate issue that
was even worse. And what happened there is you need

(07:45):
an API key in order to access all the data
on the app on the back end, and what he
realized is that every user gets a key in order
to let them interface with the app as a user.
But that key also allows them to like query the
server for like other things they should not be able

(08:05):
to get, and he was able to access a bunch
of direct messages, right, you should not be able to
access the back end and see everyone else's messages with like your personal user API key. Yeah, that is just a huge, huge error. So it's like, you join the app,
you talk to other users, you discover that somebody has

(08:30):
information about a man you're dating. You can take that
conversation into a direct message conversation where understandably you would
think that conversation is private. Right, and he could access
all of that. He managed to get his hands on
one point one million direct messages and those messages are
as recent as the day of the initial breach.

Speaker 2 (08:55):
Okay. So, an API key is a pretty common way that websites or apps can use to identify and authenticate a user. Usually, the way this works is that your API key is associated with your individual account, and the only things it can unlock are the things that are connected, again, to your account. But in the case of the Tea app, each user's

(09:16):
API key could give them access to other people's messages. Also,
how does this breach happen? How does this stuff get leaked?

Speaker 1 (09:25):
So there's two parts of it. One is just the
way that they used this Google platform in order to
deploy their app. You should not be able to get
into it without what's called a token in order to
view the data. But just like the way they set

(09:45):
it up, the token was visible to everyone, so you
can kind of like ping the app and then see
the token, and then use the token to look through
all the information.
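What Emanuel is describing is a classic authorization failure, often called an insecure direct object reference: the server checks that a request carries a valid credential (authentication) but never checks that the credential's owner is allowed to see the specific record being requested (authorization). Here's a minimal sketch of that class of bug in Python; the endpoint, header name, and data are hypothetical, and this is not Tea's actual code, just the shape of the flaw.

```python
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Toy data, for illustration only.
USERS_BY_KEY = {"key-alice": "alice", "key-bob": "bob"}
DMS = {42: {"participants": {"alice", "carol"},
            "messages": ["hey, do you know this guy?"]}}

@app.get("/dms/<int:dm_id>")
def get_dm(dm_id):
    # Authentication: is this *some* valid user?
    user = USERS_BY_KEY.get(request.headers.get("X-Api-Key", ""))
    if user is None:
        abort(401)
    dm = DMS.get(dm_id)
    if dm is None:
        abort(404)
    # Authorization: is this user actually in the conversation?
    # The breach described above behaves as if this check were missing,
    # so any valid per-user key could read ANY DM just by iterating IDs.
    if user not in dm["participants"]:
        abort(403)
    return jsonify(dm["messages"])
```

With that final check removed, Bob's perfectly ordinary key would return Alice and Carol's messages, which is essentially the behavior the researcher described.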

Speaker 2 (09:57):
At this point, you might have noticed something: neither me nor Emanuel have really used the word hack. We'll probably use it casually later, but there's a reason that we don't really say it often in this situation, and that's because when most people think of a hack, they're thinking like Neo in The Matrix. But if you've
been listening to what we've been saying, you're probably thinking,

(10:20):
wait a second, none of this really sounds all that hard,
and you'd be right. A lot of this did not
require any sophisticated knowledge at all. When the first breach
was posted, people were writing easy to follow tutorials on
4chan so that anyone could download the data for themselves.
To borrow a phrase that a previous guest used in

(10:41):
a past episode, Tea was cartoonishly hackable.

Speaker 1 (10:46):
I was talking to my wife about this as I
was reporting the story, and she was like, who hacked this?
Who is the hacker that broke into this company? And
that is a fair question, and I think you can say, nominally,
maybe it was someone on 4chan who was looking around because they saw the app in the news and they were like, I don't know, fuck these women, and poked around and found the vulnerability and then shared it
(11:07):
on 4chan, and he is the hacker. But here I think
the metaphor is more like, you put all your money
in the bank, and the bank left the door open
and the vault door open and left no one in
charge during the night, and somebody walked in and took
all the money.

Speaker 2 (11:23):
I mean, to use your metaphor about the bank thing. This is like, the first issue that we discover is that the bank left no guards, they left the door
they left it unlocked, and somebody walked in and took
some money.

Speaker 4 (11:36):
That is bad.

Speaker 2 (11:37):
This is like somebody else walks in and finds out, hey, there's a key lying on the ground. I can get
into this one person's vault. But wait a second. This
also opens almost the entire rest of the bank. I
can get into everybody's personal information.

Speaker 1 (11:51):
Also, yeah, I would tweak that metaphor even a little to make it a little worse, where it's just like, I heard in the news this bank just got robbed, let me just check if they still left the door open. And he was like, yep, actually there's another, different door that's open, and I walked in and got more stuff.

Speaker 2 (12:10):
The metaphors keep getting worse, man. Like, I don't even want to play this metaphor game. Basically, suffice to say, hopefully anybody listening or watching understands this is really bad. Being able to get people's photos, that's not great. That's not good at all, right? Being able to get people's IDs, that's obviously terrible. Being able to get direct messages

(12:31):
opens up an entirely different, it's just an entirely different
kind of vulnerability here. So now we know what happened
and it's not good. But what could happen next is
even worse. All right, let me paint the picture here.

(13:05):
Imagine that you went outside for a walk and you're
passing in front of an apartment building and you look
down on the street and you see this key that's
labeled Apartment one two three, and you pick it up
and decide to give it a try. So you go
up to apartment one two three, put the key in
the hole and it opens. And then you get curious
and you decide to try that same key on apartment

(13:27):
one two four, and that door also opens, So does
the door to apartment one two five, and so does
every apartment in every building in the city of Dallas, Texas.
That's roughly the population of women whose data could have
been exposed here. And it's not just what was leaked,
it's how those photos, how those addresses, how those private

(13:51):
messages could be used against these women.

Speaker 1 (13:53):
We've reported on many breaches and hacks over the years, and I would say this one is one of the worst ones that I've seen. And there's a couple of
things that I think really compound the issues. The first
one I think is you make this app for women.
All your users are women, and it's like for their

(14:18):
I don't know, interests, and it's somehow I wouldn't say
that it's like hostile to men, but it's like it's
not a space for men, right, and some men like
these men on four chance find that, I don't know.

Speaker 4 (14:30):
offensive. The mere existence of this.

Speaker 1 (14:32):
One thing that people have done, now that all these
thousands of selfies are floating out there. If people recall,
like the foundation myth of Facebook is that Mark Zuckerberg
created this website called facemash, and what it did is
it took all the profile pictures from the Harvard Handbook

(14:54):
and put two pictures of two women next to one another,
and you decided who was hotter, and it created like
the ranking of all the female students at Harvard. So
somebody basically did something like that for the selfies in the app, and it collects all this data, all these votes, and then it presents like the top
fifty women who were voted most attractive and like the

(15:18):
bottom fifty, which is just obviously it's just such a mean,
terrible thing to do and a violation of privacy, and
it's like that alone, I think, is like really awful
and mean spirited.

Speaker 2 (15:28):
One 4chan user took the selfies and IDs that they downloaded and created a website that they were calling spilled tea, which used that data to rank the users
based on attractiveness. Someone else created a map on Google
Maps that was supposedly showing the locations of the women
who were affected by the breach. And I have to
stress again, once you have the data, things like this

(15:50):
are not hard to do.

Speaker 1 (15:52):
We just had in the UK this Online Safety Act pass that now requires Internet platforms to verify that users are of a certain age to view certain mature content. And the way that a lot of platforms are doing this, for example, let's say Reddit, is it asks the user to upload a picture of their ID

(16:15):
or a selfie in order to verify that they're an adult.

Speaker 4 (16:19):
But like part of the deal,

Speaker 1 (16:20):
in order to make users feel safe, is Reddit doesn't see the image, and this third-party service that does the verification, they promise to only keep the image for seven days, but then they just hold on to your verification and they don't have the image anymore. And anytime you're like handling sensitive information, that is a good way to do it. And it doesn't seem like Tea did that,

(16:44):
So that is like bad security practices. That is like
a pretty bad way to do things.
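The retention pattern Emanuel describes can be sketched in a few lines. Everything here is hypothetical (the vendor client and function names aren't any real verification API); it just illustrates keeping the verdict and a timestamp long-term, while the image itself is only held briefly by the verifier and never by the platform.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# What the platform keeps long-term: a verdict, not a photo.
@dataclass
class VerificationRecord:
    user_id: str
    is_verified: bool
    verified_at: datetime
    vendor_deletes_image_by: datetime  # end of the short retention window

def verify_user(user_id: str, id_image: bytes, vendor) -> VerificationRecord:
    """Send the image to a third-party verifier and store only the result."""
    verdict = vendor.check_id(id_image)  # hypothetical vendor API call
    now = datetime.utcnow()
    # The image is never written to the platform's own storage; data that
    # isn't retained can't be exposed in a breach like Tea's.
    return VerificationRecord(
        user_id=user_id,
        is_verified=bool(verdict),
        verified_at=now,
        vendor_deletes_image_by=now + timedelta(days=7),
    )
```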

Speaker 2 (16:51):
Not only is it a bad way to do things,
it's precisely the opposite of what the Tea app said it was doing. Their privacy policy said photos used for
verification purposes were quote securely processed and stored only temporarily
and will be deleted immediately following the completion of the
verification process end quote. Clearly they were not doing that.

Speaker 1 (17:16):
The other aspect, which is much more serious, is not
only do you create this app for women, you then
invite them to talk to each other over direct messages
about the most sensitive things that can happen to a
person's life. And you're talking about, like, husbands cheating. There

(17:38):
were messages in the data that we got about women
talking about their abortions. There were people talking about like
criminal things that they are accusing other men of doing.
And all that information is now out there. And not
only is it out there, the nature of the conversation,
paired with the type of data already out there, makes

(18:01):
it really easy to identify specific people. And we obviously
didn't share this information in any story. But imagine if
you're like one of these scorned men. Imagine if you're
just one of these 4chan dudes that is angry
about the existence of this app.

Speaker 4 (18:16):
It's just like, really horrible. What has the company said
since all this came out? They apologized.

Speaker 1 (18:21):
They said initially they closed the security issue with the
first hack. After the second hack, they said they fixed
that issue as well, and they also turned off direct
messages in the app, so it's like you can no
longer DM in the app. And then they said they
hired a cybersecurity firm to come and help them. I

(18:43):
don't know who that is. They said they contacted law enforcement.
I don't know which agency that is. They haven't said exactly.

Speaker 2 (18:50):
You just said something, that Tea said that they closed the security issue. How do you close the issue if, I mean, the data is just out now, right? The cat's out of the bag.

Speaker 1 (19:03):
Yeah. And this is what I meant. Like, from the very first moment that I saw the 4chan thread,
I was like, it's too late. It's like they can
stop the access, but the information has been extracted from
the app and shared elsewhere, so it's just out there forever,
and the images are out there forever.

Speaker 2 (19:25):
And this is where the lawsuit comes in. One woman
is suing Tea, and her lawyers say that she's hoping
that other women will join it as a class action lawsuit.
Just reading through the document, you can see one incredibly
important aspect of why this data breach could be such
a risk, so I'm gonna read from it. Quote: many

(19:46):
users are domestic violence survivors who have intentionally moved to
new locations and taken steps to conceal their private information,
including their new addresses and that of their minor children,
in order to protect themselves from domestic violence perpetrators.

Speaker 4 (20:03):
End quote.

Speaker 2 (20:04):
As you can imagine, those ID cards that are out
there now made that all irrelevant. The lawsuit then goes on to say that Tea, quote, calculated to increase its
own profits at the expense of plaintiff and class members
by utilizing cheaper, ineffective security measures end quote.

Speaker 1 (20:24):
It's like, violation of reasonable expectation of privacy. There was like an expectation of privacy, and this person talked about like very private things, and that trust and

Speaker 4 (20:36):
expectation were violated.

Speaker 2 (20:37):
And this is probably where you'd expect me to tell
you that the app is shut down, everyone's suing, party's over, it's all done. But that's not what's happening. The Tea app is still operational. They just announced that
they've signed up over a million new users again after
all these breaches, and some women are complaining that they

(20:59):
still can't get access to the app. That's after the break. All right, I want to be really clear here. There
are some people who do not believe that an app
like the Tea app should exist at all, and there's

(21:21):
a range of reasons for this. Maybe you think that
there's a risk that someone might get on the app
and say something that's either just frivolous or even a
straight up lie. Fair enough, but stick with me here
because I want to propose to you that it kind
of doesn't matter what you personally think about the morality
of what this app is offering for two reasons. The

(21:43):
first is that I think there's an issue that is
actually a lot deeper than the supposed morality or non
morality of what's happening here, but we'll get to that
in a bit before that. The other reason is that
this is not the first time that women have used
technology in this way. Women have always had to find
ways to warn other women about dangerous men that exist

(22:06):
around them. Sometimes that meant talking to other women directly,
but when technology became more available, of course they used
that too. And that's something I wanted to talk to
Sam Cole about. She's another journalist and a co-founder at 404 Media, and she's done a lot of reporting on the concept that it looks like Tea is based on: these Facebook groups that had titles like Are

(22:29):
We Dating the Same Guy?

Speaker 5 (22:31):
A couple of years ago, these Facebook groups went pretty
viral where women were posting a picture of a guy
and saying, I'm going to go on a Tinder date with him later, does he have any red flags? And red flags are like the code in the Facebook group for something is amiss here, watch out for some kind of bad behavior. And then people can comment on

(22:52):
the posts and on the photos and say, yeah, this
guy pressured me into sex or ghosted me, or is married.
There's a huge spectrum of what a red flag is
in these communities. It was a much more localized and
I don't know, I guess more community based situation than

(23:18):
I would say.

Speaker 2 (23:19):
Tea is. The concept, or we could say the promise, of Tea I think stems from a good thing. As the website says, quote, dating safety tools that protect women, and unfortunately there is a real need for that.

Speaker 5 (23:35):
We have like information about how this works and like
how many people are affected by that type of violence
and abuse on dates. I think it's like almost half
like it's close to the forty five percent or something
of college women experienced some kind of like verbal, physical, digital,
even like stalking type of abuse as a part of dating.

Speaker 4 (23:57):
Right, which is so high.

Speaker 5 (23:59):
It's a really risky thing to go on what is
essentially like a blind date, like someone who's a stranger
who you only know from pictures and text. It's just
the extremes that we have to go to to stay
safe because we're scared of something of the worst case
scenario happening, because we've heard it happened to other people

(24:20):
all the time.

Speaker 2 (24:21):
To bring in some of the complaints that I've seen online,
anybody could get on and post anything. They could lie
and say anything that they want to say.

Speaker 4 (24:29):
How do the communities deal with that?

Speaker 5 (24:31):
It's something that the communities are very motivated to keep
in check themselves because they want to, you know, make sure there isn't actually like just straight up false allegations happening. In the Facebook communities, there are rules that you have to agree to before you enter, and one of the rules is no libel, defamation, or copyright infringement. You have to

(24:52):
be prepared to show proof of anything you say about somebody, and they warn that you're posting at your own risk.
One of the rules says, if you don't have any
proof of allegations you make against someone posted in this group,
you're in danger of being sued for libel and/or defamation.
It's like, don't be mean spirited, don't be judgmental, don't
make comments based on somebody's looks or age or occupation

(25:13):
or anything like that. Don't make fun of people. It's
not just a shit-talking group. The purpose of it, or at least the stated purpose, is to protect women
from going on dates with guys who are like known
bad actors.

Speaker 2 (25:28):
If you really think about it, this all makes a
lot of sense. These groups are focused on information. If
someone's posting bad information, fake allegations, or I don't know,
if someone's complaining that some guy likes to chew with
his mouth open or something like that, that makes it
really inefficient. If you just need to know if the
person you're going to see tonight is dangerous or not.

(25:49):
These groups are not for messing around. And that's where Tea starts to differ from these community groups, because Tea is promising that it's fun. And I'm speaking in the present tense here, because the Tea app did not shut down. It's still operational. They're still signing people up.

Speaker 5 (26:07):
It's so crazy.

Speaker 2 (26:08):
They are still posting on their Instagram, the audacity. I'm looking at this now. The caption of this: Welcome to the ultimate girls-only group chat in app form. Whether you're here to make new friends, share stories, or find support, Tea is where women across the US connect, share advice, and get each other. Tea is the girl's girl app

(26:30):
where the girls get to yap. Welcome to the Tea party.

Speaker 5 (26:33):
Dude, shoot me. I don't know what to say. It's
just fucking crazy. It's fucking crazy. It's crazy that they
are trying to recover and damage control out of this
as fast as possible when, like, your app has just been blown to smithereens by hackers and you're exposing the data of all of your users and you're like,

(26:54):
we're fine, everything is fine. Come on back, come on
in if you're waiting to get in. And people are, right? Like, they're still trying to get in.

Speaker 2 (27:05):
And this is where I have to say that they
have addressed this in some form, right? Ladies of Tea, we have an update regarding the cyber incident that took
place last week and wanted to share it with you
as soon as possible. Tea was born from the mission
to empower, support and amplify the voices of women navigating
the modern dating world. Our mission remains the same. While

(27:28):
we acknowledge this serious cyber incident, we also acknowledge that
Tea is needed now more than ever.

Speaker 5 (27:34):
It's just wild. It's wild to, like, PR-speak your way out of such a disaster.

Speaker 2 (27:40):
One of the comments I was seeing was, y'all are
focused on the wrong things. We do not care about
the leak, work on approving us in the queue. There's
people still complaining, hey, okay, stop talking about this
hack stuff. Why have you not approved my account yet?

Speaker 5 (27:55):
It's just wild. We're so like desperate for convenient solutions
at this point. Just get me what I want immediately
right now. I don't care if there's a huge privacy
disaster as a result of negligence from this app.

Speaker 2 (28:15):
By the way, this is also kind of important: the word tea. It has a deeper background in queer and Black circles, but now it's just kind of used as mainstream slang for gossip. Like if someone says, what's the tea on our new coworker, that just means, hey, if you've heard any gossip, please tell me. So there was
an element of this app that was less about safety

(28:37):
and more about promising fun, basically like any other social
media app. It was built off of FOMO. Even if
you don't have any real concerns about men around you,
you don't want to get left out, which is really
counter to what these original community groups were all about.

Speaker 1 (28:54):
If you pull up a lot of the images on the website and the promotional materials of the app, it's like a woman whispering in another woman's ear. All the promotional videos on social media are like other women inviting you to use the app and stuff like that, and it's, it's, I'm not saying this is why the app failed so terribly, but it's a guy.

Speaker 4 (29:15):
It's like, the CEO is the guy.

Speaker 1 (29:17):
It's a guy who started this app, and he put a bunch of women as the face of it, and failed, failed them terribly, failed them, just absolutely terribly.

Speaker 2 (29:29):
We're talking here about a man named Sean Cook, the
founder of Tea. According to the company's website, he started
the app quote after witnessing his mother's terrifying experience with
online dating, not only being catfished, but unknowingly engaging with
men who had criminal records.

Speaker 1 (29:46):
That fits with everything else. It's like something that started as like a community effort, a not-for-profit thing, and someone saw that and said, okay, this is very popular, and I can probably make some money off this
by putting it on the app store.

Speaker 2 (30:02):
And we do have to keep in mind here that
this is not a grassroots organization. This is a company
that needs to make money. Back in May, the founder
got on a podcast and talked about his prospects for
getting angel investing and making money on this.

Speaker 6 (30:17):
I've wanted us to just build something really powerful. We
have been receiving a lot of business development interest in
partnering and investing in Tea and acquiring Tea, and we
do have some executives and some pretty renowned angel investors
in the dating space.

Speaker 2 (30:35):
The founder was actually answering questions about stuff, you know, back when the Tea app was on the upswing. He was talking about, yeah, we've gotten interest from angel investors who are interested in us. There was monetization in the app.
You could pay for features in the app, and any
investor is not going to invest in your company unless

(30:56):
you've already shown that you can make money, and the Tea app does. Through a subscription model, a user can pay fifteen dollars a month and get things like unlimited searches, reverse image searches, background checks, and phone number lookups. By the way, reverse image searches, phone number lookups, a
lot of that stuff can be done with Google or
tools that are available for very cheap or free. Tea

(31:19):
was just packaging these things in an easy to use
place along with a centralized chat with other users. And
that's where Tea's marketing comes in, where they make it
look like the place that all your friends are hanging out.
And in the early stages of an app, when you're
raising money, that's what you need more than anything: users.
And of course there's a profit motive here to play

(31:41):
down what's going on, because really the way they're portraying it is, there was a hack, problem fixed, something bad happened and we fixed it. The cyber incident, quote unquote, is over.

Speaker 5 (31:51):
Now, and this is probably, like, great for them. It just shows that, like, we can fuck it up horrendously and people are still banging on the door, is something that I would bring to an investor. Sure, look how popular this is, like, how much this is filling a need.

Speaker 2 (32:06):
We had two catastrophic leaks and people are still trying
to get in.

Speaker 4 (32:09):
Yeah, that's so bleak.

Speaker 2 (32:11):
I had not thought about that that they could probably
still make money here. Honestly, why wouldn't they.

Speaker 5 (32:18):
We're in the fuck around and find out era of society.

Speaker 2 (32:21):
The fact that there are women who are still clamoring to
get onto this app might be really strange to you,
and you might be thinking, Yo, what is wrong with
these women? But also think about where the users are getting their information about the leak from: the app itself.
Most of us really tend to trust the technology we use,
especially when it's billing itself as a safety app, because

(32:43):
I mean, if you can't trust the safety app, what
can you trust? And reading the Tea app's Instagram posts,
it makes it sound like this whole thing wasn't really
a big deal and whatever happened, it's been taken care
of by now, which again isn't true because all those photos,
the names, the addresses, potentially messages, it's all out there now.

(33:05):
It will never go away. And even for the people
who have read up on all this, we have to
think about the alternatives here.

Speaker 5 (33:12):
I guess people are kind of doing that math
and saying, even though it's clearly not a safe place
to trust with my data or with my information, I'm
weighing that against my need for safety in the dating world.
All of my information has already been exposed to the
Internet in the past by every big company, every big

(33:34):
health tech company, every major corporation has had some kind of big breach at this point. So what's another one? It's
trying to solve a problem that started with or is
facilitated by dating apps themselves. Like, being worried about going
on a date with someone. Yeah, because you're cold calling
people on a dating app, not meeting them through your

(33:55):
community or through in-person means beforehand, you're totally right,
which is a new phenomenon. It's like a new thing.

Speaker 2 (34:02):
Yeah, we've now got an app to solve problems that
another app caused. There's a lot about the situation that
still does not make sense to me, and I'm not alone.
It's so bad that I've legit seen people online saying, hey,
I bet that this was some kind of con trying
to trick women into leaking all their information on their own.
And look, I don't think that's the case, but I

(34:24):
still am wondering, how could this happen? Is this just
some kind of negligence? Is there any indication that Tea
knew that their security practices were not up to standard
before this?

Speaker 1 (34:37):
It's sort of like, did they know how bad of a job they were doing? I have not seen signs of that, but that is in itself an issue. Right? If you're building an app that has one point six million users who are invited to talk about like the most intimate aspects of their life, and you don't even know that you're

(35:01):
doing a bad job, I would say that is the problem. I think the question is more like, who was in charge of making sure that people's information was protected properly? Was that even a job at the company?

Speaker 4 (35:13):
Yeah?

Speaker 1 (35:13):
Is that another hat that.

Speaker 4 (35:15):
the CEO wore? Or did they just pull

Speaker 1 (35:18):
some solution off the shelf? Those are the kind of questions that I'm asking.

Speaker 2 (35:22):
I think those are questions everybody should be asking, because, off the bat, it has never occurred to me to make an app like this. But if I were to make an app where vulnerable women are coming together to try to protect each other, honestly, 4chan is going to be my base case. It is going to be that somebody on 4chan is going to get mad about this,

(35:43):
and I'm going to have that scenario in mind and say, okay, how do we protect against this? Eventually somebody's gonna want to attack this app. I'm making sure that this thing is secure.

Speaker 1 (35:53):
But really, what I find offensive about Tea, and I think why it's such a big story and why it's so shocking, is this was presented to women as a solution for that. Right? It's like, here's an app to do this thing that is very complicated and risky and private. We made

(36:14):
an app for it. Boom, you don't have to worry about it. And it had the opposite effect. Yeah, I guess that's probably like a bigger lesson to take away here. It's like, you need to be very suspicious of any app or technology that is suggesting it has an easy solution for a very complicated problem.

Speaker 5 (36:35):
It's pretty clear that this is a conversation that needed to be had, that unfortunately exploded in this way as the result of a huge privacy violation. I think it's a much deeper societal problem, the fact that this exists and needs to exist. It makes a lot of sense why these exist. I really wish they didn't have to.

Speaker 1 (36:52):
The real slap in the face is how, you know, this app handled your data. They just left it pretty much out in the open.

Speaker 2 (37:01):
And this is where we come back to why I
don't really think it matters whether or not you like
the idea of an app where women talk about men
or not, because look, I've seen it, you probably have too.
There is a subset of the male population that's saying, oh, well, it serves them right. If they're gonna get an app and they're gonna be doxxing men and talking about men, well we're

Speaker 4 (37:20):
gonna get you doxxed too. Serves you right.

Speaker 2 (37:23):
And look, I'm gonna go ahead and leave my opinion
about that standpoint aside, because you probably already know what
that is. But let me just point out that it's
not just women getting doxxed here, because let's keep in
mind women were on this app to talk about men.
So if there are indeed messages floating around, then there

(37:43):
are also potentially details, potentially serious allegations about men that
are also floating around out in the open. Data leaks
do not discriminate. This is bad all around. But again,
this also comes down, I think to a way that
as a society we've just started to trust every app

(38:05):
we use with incredibly sensitive information, and then we shrug our shoulders when someone tells us to be careful. For me, this part, at the very least, is really cut and dry, whatever the SaaS marketing might have been. If the core premise of a company is an app that provides its users safety,

(38:26):
and the founder's messaging is also all about safety, then
I would think that the response to a data breach
would be a lot more serious. Usually it would sound
something more like.

Speaker 1 (38:41):
Hey, there's been a breach. Here are the steps that
we are taking. We have hired this company, they have
done an audit, they have found this information. We are
contacting users and letting them know who is affected. The
people who are affected are going to get this and
that service in order to try and protect their privacy.

Speaker 4 (38:59):
These are pretty standard

Speaker 1 (39:00):
things. That should be the tone. It should not be like, we are trying to grow massively still, girlies, and get you all into this app.

Speaker 2 (39:10):
The thing here that I want to get to is
that this is an app, and I think maybe it's
worth really making that distinction between a community of women
who are working together to protect each other, yeah, and
an app. Because we're recognizing here that there is a

(39:32):
problem of violence against women. This is a real problem.
This exists, that women experience this an order of magnitude
more than men do. Misogyny exists. These are real societal problems,
and I think we've gotten so used to, we can use an app to fix that, like an app to fix racism, an app to fix sexism. That idea is, listen,

(39:52):
it's really seductive.

Speaker 5 (39:53):
I do wish that we could solve these problems with
an app, but like we've just lost a lot of
the skills that would have been required to solve this
in a way that would have been more private and
safe and effective, probably, which would be like knowing your neighbors,
knowing your community, trusting your friends, having a group of people who you trust. It would have been better

(40:15):
as like a group chat, but you can't really monetize against that without turning it into an app or website or like a Facebook group. Ideally, we would have done it through in-person connections, but it's become such a fragmented
aspect of our society that we have to use these

(40:35):
apps to facilitate connection at this point.

Speaker 3 (40:37):
None of this had to happen.

Speaker 2 (40:43):
Tea's inability to implement just basic security methods meant that
they've violated the security and privacy of their users. And
Tea is still posting on their Instagram account. If you look at their account, the Tea Party Girls, they're talking about how great and fun their app is and how they're welcoming hundreds of thousands of new users. They're not really

(41:05):
talking about the breach anymore. It's all about the fun
that you can have if you join everyone else on
their app. And I said, none of this had to happen,
but if you think about it, or at least if
I think about it, it kind of feels like it
was inevitable because we're starting to get frustrated with these
complicated problems that exist in society, and then somebody comes

(41:28):
along and promises that they can fix it with an app,
and we sigh in relief and we tap the install button.
We have appified a solution to misogyny. It was never
going to work. And again, I think our individual opinion
on whether or not the premise behind Tea was good
or not doesn't really matter here because I think two

(41:50):
things have definitely been proven here. First, just in the
way that this breach happened, I think it proves the
kind of just basic anger and hatred that women continue to be up against every day just for existing.
And also I think it's proven again society's complete inability
to take any of this stuff.

Speaker 4 (42:11):
Seriously.

Speaker 2 (42:19):
Thank you so much for checking out another episode of
kill switch. Big shout out to Sam and Emanuel from 404 Media.

Speaker 4 (42:26):
By the way, if

Speaker 2 (42:27):
you're not already subscribed to 404 Media, please check them out. We really appreciate their work as always,
and let us know what you think about our work.
Hopefully you appreciate it too. And you know, if there's anything you'd like us to cover, or you just want to talk, you can hit us at killswitch at kaleidoscope dot nyc, or you can find us on Instagram

(42:47):
at killswitchpod, and you can hit me directly at dexdigi, that's d-e-x-d-i-g-i, on Instagram or on Bluesky if that's your thing,
you know, leave us a review wherever you happen to
leave your podcast reviews. It helps other people find the show,
which in turn helps us keep doing our thing. This

(43:08):
thing is hosted by me, Dexter Thomas. It's produced by Shina Ozaki, Darluk Potts, and Kate Osborne. The theme song is by me and Kyle Murdoch, and Kyle also mixes the show. From Kaleidoscope, our executive producers are Oz Woloshyn, Mangesh Hattikudur, and Kate Osborne. From iHeart, our executive producers

(43:29):
are Katrina Norvell and Nikki Ettore. Catch y'all on the next one.
