
March 28, 2025 37 mins

What even is a crypto mixer? This week in the News Roundup, Oz and Karah dig into potential Slack-enabled corporate espionage, the recall of a Kim K-beloved product and the group chat that broke the internet. On TechSupport, The Washington Post’s technology columnist Geoffrey Fowler discusses 23andMe’s financial woes and what it means for the genetic data of the roughly 15 million people who bought DNA testing kits from the company.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Tech Stuff, a production of iHeart Podcasts and Kaleidoscope.
I'm Oz Woloshyn, and today Karah Preiss and I will
bring you the headlines this week, including a bipartisan bill
that could change the future of the Internet. Then, in
our Tech Support segment, we'll talk to Geoffrey Fowler of
The Washington Post about the twenty three and Me bankruptcy
and what that might mean for your genetic data. All

(00:23):
of that on the Week in Tech. It's Friday, March twenty eighth. Well,
I'm very, very excited to finally be able to say it.
Welcome back, Cara Price.

Speaker 2 (00:41):
Hello, it's good to be back, Oz Woloshyn.

Speaker 1 (00:45):
You're armed with a document dossier.

Speaker 2 (00:49):
I am, I have papers. I created a dossier of
all the tech stories that I want to bring up.
And then, of course you've already reported what happens with me.
But no, some of them, some of them are true
to my own weirdness and obsession. One of them is
Kim Kardashian's photoshoot for Perfect magazine, which came out a

(01:09):
couple of weeks ago.

Speaker 1 (01:11):
Well, tell me about that. I remember, Kim Kardashian, you
and I were entertained last November by her sort of
first thirst trap photos with the Tesla robot. But is this
a development?

Speaker 2 (01:20):
So if you're Kim, you're gonna pose, and if you're Kim,
you're gonna pose with the hottest stuff. Right, what's the
hottest stuff? The Cybertruck and a Tesla robot. And
those poses are not exactly of the Doge variety. They
are sultry. But no, you know, there's actually a more
serious story here. The Associated Press reports that Tesla has

(01:44):
recalled nearly all Cybertrucks, so it's more than forty
six thousand in fact, and the National Highway Traffic Safety
Administration warned that panels on either side of the truck's
windshield are in danger of detaching and creating a road
hazard. It's not funny, but it is kind of funny
when you see it.

Speaker 1 (02:03):
It's funny because it's ironic: it's supposed to look like
you're living in the future, but it's falling apart.

Speaker 2 (02:09):
It's RoboCop but with like an old Hyundai vibe. And
so the issue is actually the adhesive. It's vulnerable to
environmental embrittlement, according to the NHTSA.

Speaker 1 (02:20):
I thought that was your phrase.

Speaker 2 (02:24):
So drivers can get the panel replaced by Tesla for free, actually,
which, I mean, can you imagine going to do that?

Speaker 1 (02:32):
Fool's errand. Yeah, I mean, Tesla shares are down forty
two percent for this year, and I can't imagine this
story will help. On the subject of corporate drama, I
was drawn to this story this week of tech-enabled
business espionage. Here's the headline from TechCrunch: Rippling sues Deel,
Deel denies all legal wrongdoing, and Slack is the main witness.

Speaker 2 (02:54):
Can you explain to me how Slack is a witness? Actually,
Slack is a witness to a lot of our bacchanalia
for this show, so Slack as a witness actually makes
sense to me.

Speaker 1 (03:05):
So yes, I mean, this is kind of a story
about who's watching you on your work software or computer.
But Rippling and Deel are HR startups that offer payroll
and other HR resources, and they're rivals. This is the
Pepsi and Coke of HR software.

Speaker 2 (03:21):
Well, that's the sexiest sentence.

Speaker 1 (03:22):
So Rippling recently announced a lawsuit against Deel. In a
fifty-page complaint, it alleges racketeering, unfair competition, misappropriation of
trade secrets and more. And here's the real intrigue: the
lawsuit centers around an employee that Rippling alleges acts as
a spy for Deel.

Speaker 2 (03:41):
The worst thing you could be at Coke is a Pepsi
spy, and vice versa. But then again, I'm like, who
cares about trade secrets at Rippling and Deel? But I
guess it's the context.

Speaker 1 (03:54):
Twelve and thirteen billion dollar HR companies. They're almost exactly
the same size, they have a very similar offering, and
so, yeah, I think the stakes are high, and
you know, there's been a kind of public relations war
going on. In parallel, Rippling created a game on their
website called the Snake Game. You of course remember Snake.

Speaker 2 (04:11):
I mean, I just downloaded Block Puzzle again, which
is the closest I can get to Nokia.

Speaker 1 (04:15):
Well, it's not the closest you can get, because you
can now play Snake on Rippling's website. Here are the instructions:
Deel often claims to be a one-stop solution for
all your global payroll needs, but their customers pay the
price for gaps beneath the surface. Play this game to
find the difference between Deel's claims and the reality of
their product.

Speaker 2 (04:32):
This is the pettiest. This is pettier than some of
the stuff that my friends did in college.

Speaker 1 (04:38):
But I'm sure you want to play the game, right, Yeah.

Speaker 2 (04:39):
So it won't be interesting to the listener to listen
to us play. But, just like, imagine playing Nokia Snake,
where you gobble up all of the alleged Deel falsehoods
that are at play. But the lawsuit was filed by Rippling.

Speaker 1 (04:54):
That's right. And the reason it fascinates me, as I mentioned,
is because it's kind of about privacy, especially privacy as
an employee, right, Like, if you think you're alone on
your work computer, you're wrong, man.

Speaker 2 (05:07):
Well, especially if you're a spy.

Speaker 1 (05:09):
Absolutely. According to Rippling's own lawyers, the company keeps a
detailed log of what people do on Slack, like when
an employee accesses Slack channels, or conducts searches on Slack,
or opens a document on Slack. All those things are logged.
And so the lawsuit contends that, according to this logged activity,
the Rippling employee, who is allegedly a spy for Deel,
started looking at content associated with the word Deel at

(05:31):
a much higher rate beginning in November, including perusing sales-related
Slack channels that weren't necessary for his job in payroll operations.
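
What Rippling describes amounts to spike detection over audit logs: count a user's keyword searches per time window and compare against a baseline. Here is a minimal sketch of that idea in Python, using a made-up list of log records rather than Slack's actual audit-log API, so the record format is an assumption for illustration only.

```python
from collections import Counter
from datetime import datetime

# Hypothetical audit-log records: (user, action, query, timestamp).
# A real enterprise audit log would be structured JSON events,
# but the counting logic is the same.
logs = [
    ("employee_1", "search", "deel pricing", datetime(2024, 10, 2)),
    ("employee_1", "search", "deel sales channel", datetime(2024, 11, 4)),
    ("employee_1", "search", "deel customer list", datetime(2024, 11, 12)),
    ("employee_1", "search", "payroll run", datetime(2024, 11, 15)),
]

def monthly_keyword_counts(logs, user, keyword):
    """Count how often `user` searched for `keyword`, bucketed by month."""
    counts = Counter()
    for u, action, query, ts in logs:
        if u == user and action == "search" and keyword in query.lower():
            counts[(ts.year, ts.month)] += 1
    return counts

counts = monthly_keyword_counts(logs, "employee_1", "deel")
BASELINE = 1  # searches per month considered normal for this role
for month in sorted(counts):
    flag = "  <- spike" if counts[month] > BASELINE else ""
    print(month, counts[month], flag)
```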

Speaker 2 (05:40):
But when they say looking, what are we looking at?

Speaker 1 (05:45):
Well, I'll give you a clear example
of this. So this morning, when you Slacked me and
said, how can I find the login details to
all of our subscriptions, what I did was I searched
login in Slack, and that's how I found it.
So you used me as a human search function, but
there actually is a search function.

Speaker 2 (06:02):
They're like, Oz has been trying to log in.

Speaker 1 (06:04):
Quite a bit recently. So that's exactly right. So this
dude was typing Deel into the Rippling corporate Slack
and seeing what he could find, allegedly.

Speaker 2 (06:13):
Fascinating. And so Slack was basically, I mean, Slack wasn't
doing anything. But, like, when people went looking for what
this guy was doing, Slack was like, here's what this
guy's doing.

Speaker 1 (06:23):
People didn't just go looking. They actually created a honeytrap.
Let me explain. So Rippling created a Slack channel and started a rumor
that it had a bunch of embarrassing information about Deel
in it, and of course, allegedly, this employee headed straight
for that Slack channel.

Speaker 2 (06:41):
And the spy employee searched for this channel, and that
was picked up by the Slack logs?

Speaker 1 (06:46):
That's according to the lawsuit anyway. It also claims that
when the alleged spy was asked by court order to
hand over his phone, he escaped to the bathroom, locked the
door behind him, and possibly even attempted to flush the
phone down the toilet.

Speaker 2 (07:00):
As someone who's had anxiety attacks, at first I'm like, well, honey,
that phone ain't going down the toilet. And then I'm like,
you know what, I would flush a phone down the toilet,
especially if it was a Nokia phone. That thing goes
right down, not my iPhone.

Speaker 3 (07:10):
I know.

Speaker 1 (07:11):
You have to imagine fishing your wet phone out of
the toilet after failing to flush it is not fun.

Speaker 2 (07:15):
The way I have fished phones out of a toilet,
I'd be a millionaire if you paid me.

Speaker 1 (07:21):
Next, I have a story that doesn't exactly debunk the
claim that crypto is the Wild West. The story is
about something called a crypto mixer. Can you guess what that is, Karah?
that is, Cara.

Speaker 2 (07:31):
Like something between the world's saddest job fair and
the saddest prom.

Speaker 1 (07:37):
That's what I thought too, but then I asked Google Gemini. Ah,
and here's how Gemini explains it: A crypto mixer, also known
as a tumbler, aggregates cryptocurrencies from multiple users, mixes them,
and redistributes them randomly, making it harder to trace the
origin and ownership of the funds. So, of course, there's
a public ledger for cryptocurrency, the blockchain, but after the

(07:58):
mixing process, the coins are redistributed back to the
original depositors in a random manner.

Speaker 2 (08:03):
It's like if everybody bought a bottle of vodka and
you made a huge batch of martinis, and then you were like,
can I have my drink? You get your drink, and
you're like, I don't know if that was from my
bottle or somebody else's vodka bottle.
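
To make the batch-martini analogy concrete, here is a toy sketch in Python of the pooling idea: fixed-size deposits go into one pot, and payouts come back out in shuffled order to fresh addresses, so deposits and withdrawals can't be matched up. This illustrates the concept only, not how Tornado Cash actually works on-chain (it enforces the same idea with fixed denominations and zero-knowledge proofs).

```python
import random

DENOMINATION = 1.0  # everyone deposits the same amount, like one bottle of vodka

deposit_addresses = ["alice_addr", "bob_addr", "carol_addr"]
withdrawal_addresses = ["fresh_1", "fresh_2", "fresh_3"]  # new, unlinked addresses

# All deposits are commingled into a single pool (the batch martini).
pool = DENOMINATION * len(deposit_addresses)

# Payouts happen in random order, so an observer of the public ledger
# sees three identical deposits in and three identical withdrawals out,
# with no way to pair any deposit with any withdrawal.
random.shuffle(withdrawal_addresses)
for address in withdrawal_addresses:
    pool -= DENOMINATION
    print(f"pay {DENOMINATION} to {address}")
```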

Speaker 1 (08:14):
I think that is very, very well put, and
I'm sober. The metaphor the founders of this particular mixer chose, however,
was the tornado, hence the name Tornado Cash.

Speaker 2 (08:24):
That sounds like a celebrity baby name.

Speaker 1 (08:27):
Well, wait till we get to the alleged perpetrator. In
twenty twenty two, the Justice Department alleged hackers, including the
Lazarus Group, which is allegedly run by the North Korean government,
laundered billions of dollars in stolen assets using Tornado Cash.
So the US took action and sanctioned the company.

Speaker 2 (08:44):
So nobody in the US could use Tornado Cash anymore.

Speaker 1 (08:47):
Yeah, that was the aim. That was one of the
consequences of the sanctions. Another was criminal charges against two
founders of Tornado Cash. The Wall Street Journal recently spoke
to one of the founders, whose name is get this,
Roman Storm.

Speaker 2 (09:01):
I can't These are Kardashian children's names.

Speaker 1 (09:05):
That's right. So Roman Storm, the founder of Tornado Cash.
But, you know, all jokes aside, he was actually arrested
at gunpoint after federal agents stormed his home in twenty
twenty three to arrest him for his involvement in Tornado Cash.
He's now out on bail and says he's not guilty
of charges of money laundering and sanctions violations. In this

(09:25):
week's Wall Street Journal article, he maintains that the software
he created is neutral and has both good and bad
use cases.

Speaker 2 (09:33):
But it's my understanding that the whole point of the
blockchain is to create a ledger that traces back the
original transaction. So what could possibly be a good use
of crypto mixing?

Speaker 1 (09:49):
Well, Storm said, it's financial privacy. So an example he used actually
was that you could donate crypto to assist Ukraine in
their war effort without identifying yourself, for example, to Russian authorities.

Speaker 2 (10:00):
So that actually makes a lot of sense.

Speaker 1 (10:02):
There is some good news for Storm, though:
the US actually lifted sanctions against Tornado Cash, and while
Storm is still awaiting trial, his allies seem to be
encouraged by the Trump administration's desire to make the US
the crypto capital of the world.

Speaker 2 (10:15):
Unbelievable. I love that story, Oz. The next story that
I want to talk to you about is one that
I have not shut up about since probably we started
doing Sleepwalkers.

Speaker 1 (10:25):
Seven years ago, seven years ago.

Speaker 2 (10:27):
And you know what, a lot of people in the
US government have not shut up about it over the
last seven years either. But this is the first time
in my research that I've felt like, okay, there's actually
going to be some movement on this, because bipartisan members
of the US government are getting involved.
So Section two thirty sounds extremely boring, and I bring

(10:48):
it up at parties and people look at me like, shut up.
But if I want to kind of plagiarize the way
that Section two thirty has been described by those who
care about it, it is the twenty six words that
created the Internet, the twenty six words being the following:
No provider or user of an interactive computer service shall
be treated as the publisher or speaker of any information

(11:12):
provided by another information content provider. What does that mean
to you?

Speaker 1 (11:18):
Doesn't mean that much to me on the face of it,
but I know that those twenty six words have been
some of the most consequential in the history of technology legislation,
and arguably our lifetimes. They represent this addition to the
Communications Decency Act that was passed in the mid nineties,
and those twenty six words that created the Internet basically

(11:40):
meant that Internet platforms were not legally liable for third-party
content, like content created by users, that lives on
their platforms. When the law was struck, it was basically
designed to allow these Internet platforms to grow and to
host content without fear of liability.

Speaker 2 (11:58):
They did that successfully.

Speaker 1 (11:59):
Boy did that work.

Speaker 2 (12:00):
And it was also related to free speech concerns, which
is still something that, you know, people are talking about today,
which is this idea of, like, how much is too
moderated an Internet?

Speaker 3 (12:10):
Right?

Speaker 2 (12:11):
And so, ultimately, as you were saying, Section two thirty
allowed companies to put less effort into regulating their content
because they're not treated like publishers.

Speaker 1 (12:20):
Publishers obviously are constantly subject to lawsuits for saying stuff
which is defamatory or for publishing content which is extreme,
whereas websites and social media platforms are immune.

Speaker 2 (12:32):
Right, And in the nineties, nobody could have anticipated how
this law would fuel the explosion of what we now
know as social media. The story here is that thirty
years later, there is now some bipartisan support for modifying
Section two thirty, and two senators, one being Lindsey Graham,
who's a Republican from South Carolina, and the other Dick Durbin,

(12:53):
a Democrat from Illinois, plan on introducing a bill to
set an expiration date for Section two thirty, and that
date is January one, twenty twenty seven. Mark your calendar,
less than two years from now, less than two years
from now. But the Information reported that the lawmakers don't
actually want to repeal Section two thirty outright. It's really

(13:14):
more of an invitation to the tech companies to negotiate,
which is something that they have been unwilling to do
without an ultimatum thus far.

Speaker 1 (13:22):
I think one of the interesting things about this story
is that the first place it was reported was in
The Information, which is like the go-to source for the
whole tech industry.

Speaker 2 (13:30):
And I think what that signals is that big tech
is actually watching very closely. And one of the things
that I've noticed is just this increased anger towards tech
companies in recent years from both sides of the aisle.
You know, there's the growing concern for the safety of
kids online and frustrations about failed attempts to regulate big

(13:51):
tech again on both sides of the aisle, which may
bode well for this yet to be introduced Bill.

Speaker 1 (13:57):
It's definitely a story we'll be following up on over
the coming months. But right now, let's take a quick break.
We'll be back with a few more headlines, and then
we'll hear from Geoffrey Fowler of The Washington Post about
what twenty three and Me's bankruptcy filing could mean for your genetic data.
Stay with us. Welcome back. We have a few more

(14:28):
quick headlines for you this week before we dive into
our conversation with Geoffrey Fowler about twenty three and Me.
Casey Newton wrote this week in his newsletter about two
new studies, one from the MIT Media Lab and one
actually from OpenAI itself, which suggest that heavy chatbot
use is correlated with loneliness and reduced socialization. Now, of course,
as Casey Newton and the studies both point out, correlation

(14:51):
is not causation. There's always a possibility that loneliness is
what propels someone to seek an emotional bond with a bot.
But the studies do point at a potent danger, which
is that engaging with compelling chatbots might pull people away
from connections with other humans, which may in turn drive
more loneliness and more dependence on a computer companion.

Speaker 2 (15:11):
This is the digital chicken and egg. Am I lonely,
seeking a chatbot? Or is the chatbot so hot that I
can't stay away?

Speaker 1 (15:18):
That's it.

Speaker 2 (15:19):
There's another story that Semafor reported on: an off-Broadway
theater in New York is offering live AI-powered
translations for the play Perfect Crime. So theatergoers can scan
a QR code and choose from sixty languages. And actors
wear microphones that feed their voices directly into the translation system,
so no side conversations or audience noises get accidentally picked

(15:42):
up by the AI powered translation services. This is an
incredible use of AI, which is like, you could be
a tourist from anywhere and see this show and still
understand it.

Speaker 1 (15:52):
Finally, obviously we can't pass the headlines by without a
nod to this week.

Speaker 2 (15:58):
The dopiest Pentagon praxis.

Speaker 1 (16:00):
Hashtag Signal Gate. Just a few weeks ago we had
Meredith Whittaker, who runs Signal, the encrypted messaging app,
on Tech Stuff, and Signal is absolutely the flavor of the
week because Jeffrey Goldberg, the editor of The Atlantic, was
accidentally included in a Signal group with JD Vance and
Pete Hegseth to discuss highly sensitive battle plans for Yemen,
which are now absolutely all over the Internet. The lesson

(16:22):
here is that it turns out encryption doesn't work if
you invite journalists into your group chat. The story broke
the internet. Per Axios, hashtag Signal Gate is the most
interacted-with story of the year to date. The Atlantic story,
which is titled The Trump Administration Accidentally Texted Me Its
War Plans, is approaching five hundred thousand interactions on social media,

(16:44):
which is more than the second and third place stories
of the year combined. For your reference, those two are
Meet the World War II veteran who recently celebrated his
hundredth birthday, from the venerable news source 11Alive, and
Elon Musk's net worth dropped twenty nine billion dollars in one
day as Tesla stock tanks, from Business Insider. So, lots

(17:06):
of emojis in government communication, many more than
one would suspect. Fire, flames. So for our next segment, we're
going to explore what happens when a company that you've
given your personal data to hits a crisis. Now, companies

(17:29):
file for bankruptcy all the time, from major retailers and
restaurant chains to tech startups and bitcoin exchanges, but they
usually don't own the DNA sequences of millions of people.
Such is the case, Karah, with twenty three and Me. Yeah.

Speaker 2 (17:44):
They have my oh boy, do they have my data?

Speaker 1 (17:48):
So you were one of the people in the
twenty tens, when this was the hottest game in town,
who couldn't resist.

Speaker 2 (17:52):
I did it in twenty twenty one because it was
a thing that somebody bought
me as a gift. Oh, okay. And I kind of
sat on it for a little bit.

Speaker 1 (18:06):
You're buying the dip as a user, not as an investor.

Speaker 2 (18:08):
One thousand percent, and I, I don't know, I just
looked at it. I'm obsessed with my family history and
to get a little bit more serious, you know, it's
like I have a lot of dead relatives and I
don't have the ability to ask them, you know, who
was our extended family. And so I decided to take
matters into my own hands. And you know, lo and

(18:30):
behold found out that I'm incredibly Jewish, which doesn't surprise me,
and that's it, but that was still exciting. I struggle
with something that I think is really at the core
of this bankruptcy filing and how it affects people, which
is I love to give my data away. I don't care.

(18:52):
I want it now. I want Wi-Fi at LaGuardia,
I want Wi-Fi in the middle of the street,
and I'm going to give you my password, I'm
going to give you my email address. I also report
on a technology podcast, where I know that that's a
terrible thing to do, and such was the case with
twenty three and Me, where everyone was like, Karah, don't
do that, just don't do it. Talk to your family

(19:12):
members and don't do it.

Speaker 1 (19:14):
Now, can I ask you? One of the things that
twenty three and Me was trying to do, to kind
of make its business model more robust than just providing
a one-off test that you literally never needed to repeat,
was to kind of build in these, like, social aspects
to the platform. Did you connect with any distant, previously
uncontacted relatives?

Speaker 2 (19:33):
Yes, and we figured out we were related. You know,
it was not a profound thing. And look,
I know that for a lot of people, what happened
was, like, people found out they had, like, a brother
or another family. I think it had a profound effect
on a lot of people. It did not have a
profound effect on me.

Speaker 1 (19:52):
So my sister is pretty interested in genealogy, family trees,
etcetera, Karah. And she, to her enormous credit, said to my father,
would you mind if I do twenty three and Me?
And he said, I would really prefer it if you didn't,
because I don't want to know if there's a serial
killer in the family. We both raised our eyebrows at this,

(20:16):
and of course the question arose, was there a suspect
in mind?

Speaker 2 (20:20):
You're like, is it you?

Speaker 1 (20:24):
So she didn't do it? You know how paranoid I am.
Of course I didn't do it either. But fifteen million
people did. And many of those people who know about
the bankruptcy, I'm sure, are concerned about what's going to
happen to their personal data. Here to help us understand
is Geoffrey Fowler, who writes a column for The Washington
Post all about the user experience of technology, the good

(20:45):
and the bad. And what I particularly like about his
writing is that he really centers the user in his column.
It's tech journalism, but it's practical tech journalism. What does
this mean for me? How can I make things better?
What should I know? And so he wrote a column
this week that caught both of our eyes.

Speaker 2 (21:00):
Geoffrey, welcome to Tech Stuff.

Speaker 1 (21:02):
Hello. Hello. So, you wrote a story this week, and
the headline was Delete your DNA from twenty three and
Me right now, which was quite a captivating headline. Could
you just start off by giving us a little bit
of the background on twenty three and Me, and then
why you wrote this story?

Speaker 4 (21:17):
I wrote it because fifteen million people around the world spat into
a little vial and sent their DNA to a company
in Silicon Valley with the promise of unlocking all sorts
of things about our ethnicity, our background, our family tree,
and our health. But it turns out that the company
that we trusted with that information, that really really precious information,

(21:41):
was a terrible business, and it declared bankruptcy late Sunday night.
And if there's one thing I know, it's that you
do not want a company that is bankrupt to be
responsible for protecting your very precious information. And that's the
situation we find ourselves in. So the moment I heard
about it, I was like, we need to press publish
on this story right now. And it seems like a

(22:03):
lot of people agreed, because so many people have been
trying to delete their data from twenty three and me
that the site has been down for large periods of
this week.

Speaker 2 (22:12):
And yet you, like me, could not resist getting a
twenty three and Me account. What were your reasons for
getting an account originally?

Speaker 4 (22:19):
You know, I think it sort of tracks the arc
of a lot of our relationships with technology and with
data over the last twenty years. Look, this was the
two thousands for me. There was so much possibility out there, right,
and you know, in two thousand whatever, when I did

(22:39):
my twenty three and me test, what I could imagine
was what the company was promising, which was just a
fascinating idea, right that together, if they were able to
gather enough DNA data, they could use the power of
technology to unlock all kinds of secrets about the human body.
They could already even in those early days, tell you
curious things about yourself. Like, I remember one piece of information they

(23:00):
gave me is that I have a gene that makes
me likely to have wet earwax.

Speaker 1 (23:06):
Wow, that is... I feel so seen. I was always wondering. It's a thing. It does.
It's a thing. It does.

Speaker 2 (23:10):
It's very validating.

Speaker 4 (23:12):
Yeah. And so while, like, the wet earwax
is not going to change the world, they maybe could
learn things about cancer or all kinds of mysteries about
the human body. That was and remains a
pretty exciting idea.

Speaker 2 (23:25):
All of these things are true about twenty three and
Me. And as a user, I also experienced, you know,
some interesting information and connection with people that I didn't
even know were distant cousins. But as you said earlier,
this is no longer a viable company. Can you talk
a little bit about why it went bankrupt and how

(23:45):
it started to run into trouble.

Speaker 4 (23:47):
So, I'll preface this by saying, like, I've not been
tracking the financial fortunes of twenty three and Me over
the years, but I know at the high level they've
tried a lot of different ideas to make the business work.
Because it turns out that when you ask people to
spend roughly one hundred bucks to send you their DNA,
they only want to do that once, right? So twenty
three and Me had this problem, which was once people

(24:09):
did the test, you couldn't get any more money
out of them. So they had to figure out other
ways to keep going as a business, to make some
money out of the data that they had. And I
think the big idea that they had was that it
was going to be useful for developing drugs, and they
made a big partnership with GSK, and GSK took a
stake in the company, and that went on for several years,
but it wasn't producing results at a scale that was

(24:31):
helping them and I guess the costs associated with it
were just gigantic. They've then tried other things. They tried,
I think most recently selling GLP once as kind of
a home kind of health kind of service like Pivot
to Health, and it just didn't work, leading to around
twenty twenty one that they went public and they were
valued at six billion dollars roughly, and then as a

(24:53):
Sunday night when I checked, it was around fifty million dollars.
They have no idea what it would be now that
it's in bankruptcy court. Probably less, probably less. And along
the way they had one other thing that we should
definitely talk about, which was they had a big hacking attack.

Speaker 1 (25:09):
Yeah. I wanted to ask you about that, because I think,
you know, you sort of pointed out that there was
a time in the two thousands where we just couldn't
wait to give our data to all comers and see
what kind of surprising results we got back. That changed,
which presumably became a headwind for twenty three and Me.
But there were also some specific issues they had that may
have put customers off.

Speaker 4 (25:30):
Yeah, and I think the hacking attack is a big one.
My memory of it is that they had a problem
with users that were reusing passwords or using passwords that
had been otherwise compromised on the internet. This is a
common thing. To all our listeners: if you do not
already have a password manager, and use it to get
distinct passwords on every single site and app you use,

(25:50):
do that now, because that is a big risk. But
turns out a lot of twenty three and Me users fit
in that bucket, and folks were able to get into
a whole bunch of accounts and then even offer to
sell some of what they learned online.
I think it was mostly information about, like, family trees.
I don't think they got people's, like, DNA samples, but
still it was enough to really, really spook people. And,

(26:12):
you know, because at the core of it, I think
people were, even back then, pretty nervous about
the idea of spitting into a vial, and any breach
of that trust by twenty three and Me was
just killer.
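
Fowler's password advice is worth pausing on, because credential reuse is something you can actually check. One way is Have I Been Pwned's k-anonymity range API: you send only the first five characters of the password's SHA-1 hash, so the password itself never leaves your machine. A minimal sketch in Python:

```python
import hashlib
import urllib.request

def breach_count(password: str) -> int:
    """Return how many times a password appears in known breach corpuses,
    via the Have I Been Pwned range API. K-anonymity: only the first five
    hex characters of the SHA-1 hash are sent over the network."""
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode()
    # The response is one "SUFFIX:COUNT" pair per line for every leaked
    # hash sharing our prefix; we look for our own suffix among them.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

# A reused, common password scores enormously high; a password-manager
# generated one should return 0.
print(breach_count("password123"))
```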

Speaker 1 (26:30):
After the break, we're going to get Karah to finally
delete her account. Stay with us. So, Geoffrey, what happens
if you don't delete your data from twenty three and Me?
Like, what would happen then?

Speaker 4 (26:51):
If you don't delete your data from twenty three and
Me, it is now essentially up for sale. So the
company has said that it is in bankruptcy court, which
means it either has to find a buyer
for the whole company or sell it off for parts.
And the most valuable asset they have is the DNA
data of presumably still millions of people. Who is going

(27:12):
to buy that? We don't know. There's lots of speculation,
maybe insurance

Speaker 3 (27:16):
Companies. Insurance companies? Well, we have some laws in
the US that protect people from

Speaker 4 (27:27):
Having their data, their genetic data, used to keep
them from getting things like healthcare coverage. But it's not
an airtight law and it doesn't apply to other kinds
of insurance. And again, it can also be bought by
somebody who we have no idea what they're going to
do with it, and they may not try to do
anything with it for like a long time. Because that's

(27:47):
the thing about data. I think that is the key
takeaway lesson for everybody from this that you know, you
can think you know what's going to happen to it
at one time, and then in the future. It has
a totally different use, and that applies to our genetic data,
but all kinds of things about our lives.

Speaker 1 (28:01):
Can anybody delete their twenty three and Me data? Because
I think I read in the piece there's a California law
that makes this particularly easy to do, or at least
possible to do. But what about other people elsewhere
in the US, or elsewhere in the world?

Speaker 4 (28:13):
I'm glad you asked this question. So we've been pretty
sort of gloomy so far in this conversation. But there's
a glimmer of good news here, so let's talk about that.

Speaker 1 (28:21):
I love it.

Speaker 4 (28:23):
Starting back in twenty eighteen, California passed a law that said, you,
dear consumer, user, citizen, have some privacy rights, and among
those, you have the right to delete data that
companies collect about you. And then a couple years later,
actually during COVID, California passed another law that everybody was

(28:43):
so distracted by COVID that we forgot it even existed, which
is a genetic information privacy law, which goes even deeper
and says you not only have the right to delete
your data from a company, you also have
the special right to delete genetic data. You have
the right to tell them to destroy your sample, and
you have the right to tell them, withdraw me from
any research that's going on. So hooray California for this law.

(29:06):
Other states, seeing that the federal government in the US
was doing absolutely nothing to protect our data privacy rights,
copied it, and I am now happy to say there are
about twenty states that have versions of this law that
require giving people the right to delete their data. I
have a little map that I keep, that I think gets updated,
which is like my little sign of hope in the

(29:26):
dark, dark days that regulation can work and we can
have some help. So the reality is the majority of
Americans are now covered by these state laws, and so companies,
including twenty three and Me, basically treat all Americans the
same way now, which is to say, yes, we will
delete your data. Now, a lot of people have been
writing to me saying, like, well, how do we know
they're really deleting it? It boils down to, like, if

(29:48):
they don't, they could get into big trouble.

Speaker 1 (29:50):
But who's they? I mean, is this the receivers, the creditors?
I mean, in a bankrupt company, who do you go
after in the event they don't take this duty seriously
through the bankruptcy process?

Speaker 4 (30:01):
It's true that like they have very little left to lose,
you might argue, but there is a spotlight on them
because they're going through this process, right, and so there's
going to be a judge involved. And to be fair
to twenty three and meters, they have already said, look,
we are going to handle your data the same way
we always have throughout this process, and we're going to

(30:21):
try to look for a buyer that will uphold the
same use. But the truth of it is, because America
has no real laws that cover this kind of data,
that all that they would really have to do is
update the new buyer would have to update a privacy
policy and give you notice of that and kind of
then again give you that chance to delete it if
you don't want to be a part of that. And

(30:41):
the truth is most Americans are not paying enough attention.
Even if you know, even if I do another Washington
Post headline, but this time in all capital letters, you
know a lot of people wouldnt wouldn't pay attention to that.

Speaker 1 (30:53):
So you haven't started a genetic bank run? We don't
know how many people are actually trying to delete their data.
Certainly enough to crash the website, but is it a majority,
as far as we know?

Speaker 4 (31:02):
I asked twenty three and Me yesterday, and they wouldn't say.

Speaker 2 (31:05):
You've convinced an undisclosed number of people to delete twenty
three and Me. I'm gonna be honest, Geoffrey. I read
your article. I didn't do anything, and it's because I'm lazy.
I mean, there's no good reason. Like, you
wrote an incredibly compelling article. I went on Instagram instead,

(31:30):
Like there's no good reason.

Speaker 1 (31:31):
But how do you actually do it?

Speaker 4 (31:32):
Okay, you want to do it together? Right now?

Speaker 2 (31:35):
Yeah? Yeah, hold on one second. Can we get your phone?

Speaker 1 (31:38):
Yeah?

Speaker 2 (31:38):
Okay, so I'm opening it, of course, I'm, like, opening
it with my face. So tell me how to delete.
I'm now in the app.

Speaker 4 (31:45):
It says, hey, Karah. So go up to, I think,
the upper right corner.

Speaker 2 (31:50):
Okay, So I'm there. God, I'm gonna miss my cousins.
Keep on.

Speaker 4 (31:53):
You can look for settings. Once you tap on your
little profile.

Speaker 2 (31:58):
It's giving me a chance to update my information.

Speaker 4 (32:01):
Okay. Once you're in settings, keep scrolling all the way
down, and towards the very end, you'll see an area
called twenty

Speaker 2 (32:06):
three and Me data. Got it.

Speaker 1 (32:08):
Got it?

Speaker 3 (32:08):
Yep?

Speaker 4 (32:09):
Okay, then click view.

Speaker 2 (32:11):
Well, there's an access your data or delete your data.

Speaker 4 (32:14):
Oh, you want to delete?

Speaker 1 (32:18):
We were like, shook. We were already at the fork
in the road there, Geoffrey. Thank you.

Speaker 2 (32:22):
I wasn't sure.

Speaker 4 (32:23):
It will give you a chance to download some of
the data, including everything from their report about your health
and your family.

Speaker 2 (32:32):
And report summary. Here we go.

Speaker 4 (32:35):
And then after you get that stuff, okay, you're going
to scroll down, and you're going to click delete. Now, a
couple of things to mention about this while you're clicking
on things. When you ask to delete, they also do
two more things. One, they also delete your specimen, if
you'd left them the physical vial that you sent them
in the mail. And two, they withdraw you from any

(32:56):
health studies that you might have opted into.

Speaker 2 (32:58):
I'm sorry, I just want to share one thing, Geoffrey,
and this has nothing to do with our podcast. You
and I, Geoffrey, are both likely to have wet earwax.

Speaker 4 (33:06):
Yeah, let me tell you, if you ever live in
a really moist place, that earwax is going to
sneak up on you.

Speaker 1 (33:14):
I'm deleting.

Speaker 2 (33:14):
Here we go, permanently delete. Oh gosh, here we go.
I'm like, in this house, we delete twenty three and
Me. And then it says profile data, so delete data.
So here's what it says: we have received your
request to delete your data and have sent an email
to the email address linked to your twenty three and
Me account. Please locate this email and follow the steps to

(33:35):
confirm your request.

Speaker 4 (33:36):
That's right. So you go into your email app, and
there's an email in there, and then you press do it.

Speaker 2 (33:41):
Do it. Permanently delete all records. Permanent.

Speaker 4 (33:45):
Do remember, should you decide that you really still want
to know about such things, you could swab again. The
one thing about your DNA is it's not changing. And
that's, in some cases, I guess, a good thing, and
sometimes a bad thing.

Speaker 1 (34:01):
Yeah, it just decays, Geoffrey. I have two questions for you.
The first is, how often, as the technology columnist for
the Washington Post, do you have to talk people through
how to do banal technology tasks?

Speaker 4 (34:13):
We actually have a special thing on our website where
it's like, this is the box for us to give
people instructions for things. So in fact, if you pull
up my column on this and you scroll down a
little bit, I've got one of those boxes that I
was literally reading off of while we were going through
this process together. And that's fine. Look, this stuff is
hard. These companies can act in evil ways and

(34:34):
it's not your fault. So the whole premise of what
I try to do as a tech columnist of The
Washington Post is be on the side of the user
and, like, help you fight back when you can, right?
And, like, I think that's super important.

Speaker 1 (34:49):
And the second is, what do you want the takeaways
from the twenty three and Me story to be?

Speaker 4 (34:54):
One, perhaps we shouldn't have all sent our DNA to
a Silicon Valley corporation that has Silicon Valley corporation values and
ways of doing business. And the bigger one is that,
like we've been talking about here, that it's really hard
to know in any given moment what your data could

(35:16):
be useful for at some point in the future. And
so the only really reasonable thing to do to protect
yourself is to allow as little of it as possible
to be collected, which sounds like an insane person thing
to say in our modern economy when we're literally being
watched in every single potential dimension. I once did a

(35:37):
piece for the Post where I hacked into my iPhone
to watch what it did while I was sleeping at night.
Oh wow. And I saw it sending data out to,
like, hundreds of data brokers and all these
sorts of things. It was terrifying. I wasn't
even using the phone, but it was, you know,
communicating all this personal information about me. So how
do we deal with that fact? This runs a little

(35:57):
bit in tension with, like, I love technology. I want to
use a cool phone, and I have seventy five
connected gadgets in my home that make all sorts of
cool things happen. I think the answer is you just
have to be vigilant, and you need the help of
regulations, you know, regulations that have our interests

(36:18):
at heart, to sort of put boundaries around what these
companies can do.

Speaker 3 (36:20):
And that's why I'm

Speaker 4 (36:22):
Here to sing the praises of the California Privacy Protection
Law, and hopeful that we can maybe get some more.

Speaker 2 (36:30):
Thank you. Thank you so much, Geoffrey, and thank you for
helping me do something that I should have done many
days ago. You bet.

Speaker 1 (36:41):
That's it for this week for Tech Stuff. I'm Oz
Woloshyn, and I'm Karah Preiss.

Speaker 2 (36:45):
This episode was produced by Eliza Dennis and Victoria Dominguez.
It was executive produced by me, Oz Woloshyn, and Kate
Osborne for Kaleidoscope, and Katrina Norvell for iHeart Podcasts. The
engineer is Bihied Fraser, and Kyle Murdoch mixed this episode,
and he also wrote our

Speaker 1 (36:59):
theme. Join us next Wednesday for a very special edition
of Tech Stuff: The Story, when we'll share an in-depth
conversation with Zak Brown, the CEO of McLaren Racing,
from the McLaren Technology Center. So, F1 fans, tune
in. And please rate, review, and reach out to us
at tech stuff podcast at gmail dot com. We really
want to hear from you.
