April 8, 2025 • 38 mins
After the announcement of 23andMe's filing for bankruptcy, many customers are worried about their personal data and genetic information being sold to other companies. While the company insists customers' info is protected, experts are telling 23andMe customers to download and delete their data. What are some of the risks you take sending your DNA to private companies? Do you feel comfortable doing so, and why? Jason Kelley, Activism Director at the Electronic Frontier Foundation, joins us tonight to discuss how you can protect your data from being stolen or used for nefarious purposes.

Listen to WBZ NewsRadio on the NEW iHeart Radio app and be sure to set WBZ NewsRadio as your #1 preset!

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
It's Nightside with Dan Ray on WBZ, Boston's News Radio.

Speaker 2 (00:07):
Dan Watkins, thanks very much, as we head into our
tenth hour here on this Tuesday night. And wasn't it
cold out there today? Imagine. Must not have been a
fun night at Fenway Park. It wasn't a fun night
at Fenway Park, because the Red Sox lost for the
second night in a row to the Toronto Blue Jays.
But the weather, they must not have had too
much in beer sales, I would guess. Hot chocolate

(00:31):
was the hot commodity tonight at the old Ballpark, that's
for sure. So the Red Sox are just six and six,
back to five hundred, and hey, the season is still young.
They haven't even hit the ten-game mark of the season. Okay,
many of you, I think know about a couple of

(00:51):
those websites there. One is called ancestry dot com and
the other is twenty three and Me. And those
websites used to heavily advertise on television. I don't
know that they still do. Well, one is in bankruptcy.
But they would tell you a lot, or they would
tell you that they had the ability to tell you

(01:12):
a lot about your ancestry, about your heritage, about whether
or not you were, you know, all from one part
of the world, or if, you know, your heritage
came from many different countries. And they did that with
generally a DNA sample that you would send voluntarily, and

(01:35):
it was fun. A lot of people learned a lot
about their background. I never did it. I never had
the curiosity because I feel I know where my people
came from, you know, the old sod. And I didn't
think that, although I guess I was a little bit
intrigued by it, because, look, there could be, for all of
us, if you go back five, six, eight, ten generations,

(01:58):
maybe there's somebody from a different part of the world
in your background, in your family's background. Well, twenty three
and Me has now filed for bankruptcy. Ancestry dot Com,
I think, is still in business. But a guy that
knows a lot more about all of this stuff is
Jason Kelley. Jason is what's called an activism director at

(02:21):
the Electronic Frontier Foundation. So first of all, Jason, tell
us what the Electronic Frontier Foundation is, and then what
is an activism director. I'm sure that when you were
ten years old and they asked you in school, Jason
what do you want to be when you grow up?

Speaker 3 (02:37):
I want to be an activism director.

Speaker 2 (02:40):
Missus Smith, I'm sure that was not what you just said.

Speaker 4 (02:44):
How are you?

Speaker 5 (02:46):
You're right about that? Thanks for having me on, Dan.

Speaker 6 (02:49):
Yeah, I wanted to be a writer when I was
a kid, and you know, I do a little writing now.
As an activism director, I somewhat get to fulfill that.
The Electronic Frontier Foundation is an impact litigation law
firm that's been around for over thirty years, not with
me the whole time. I've only been there about a decade,

(03:11):
but we started in the nineties, and you can think
of us kind of like the ACLU.

Speaker 5 (03:18):
It's a it's.

Speaker 6 (03:19):
An organization that pushes for laws to be more privacy protective,
more free speech protective. But our focus is primarily around
digital technology around you know, the growth of technology and
laws that impact what you do online. You know what
computers know about you, you know what companies collect, that

(03:41):
sort of thing, and you know, the activism team focuses
on a lot of different stuff, but we think of
ourselves as one of the three prongs in the
fight that EFF is engaged in. So we've
got the legal team, which does impact litigation. So you know,
a law comes out or a company is you know,

(04:03):
taking advantage of someone, we go to court and file
a lot of briefs and things like that you can imagine,
just like the ACLU, and they join us in a
lot of cases as well. And then we've got a
technologist department that's like the smarty Pants that really understand
how computers work, and they help us understand new technology.

(04:24):
You know, we talk with people in the Senate and
they help everyone who needs to know how these kinds
of things work understand them. And then the Activism team
is sort of like the Comms department, you know. We
make sure that people can understand the confusing laws,
the confusing technology, and we overall try to get people

(04:46):
to take action. When there's let's say a law that
Congress is debating or a state is you know, putting forward,
we try to tell people, hey, here's what this law does.
If you care about it, you know, reach out to
your representative and let them know either way, whichever
way it is. And you know, very, very occasionally,

(05:07):
once every few years we may put on a protest
or something like that. A few years ago, I flew,
well, I didn't fly it, but we flew a plane
over Apple's headquarters because they were getting ready to make
some changes to how they encrypt their devices that
we didn't like and a bunch of people were worried
about and we won that fight. So they didn't do that,

(05:28):
and that was that was one of those little stunts
that you know, only an activism director can pull off.

Speaker 5 (05:34):
So that's that's the basis of it.

Speaker 2 (05:36):
Okay, well, real quickly, we're going to talk about twenty
three and Me and the filing for bankruptcy and the
implications thereof. And I know that there's probably a bunch
of these companies out there, but the only ones that
I really saw advertised on TV to an extent was
twenty three and Me and Ancestry dot Com. Am I
missing any others, besides those, as the two

(05:58):
big companies that are in this area?

Speaker 5 (06:01):
Those are the big ones. You're correct about that.

Speaker 6 (06:03):
There are a few that I don't exactly understand, you know,
what their business model is. But, for example, there's GEDmatch,
a company that has, you know, let's say, ten
percent of the customer base of Ancestry dot Com.

Speaker 5 (06:18):
There are a.

Speaker 6 (06:18):
Couple that are you know, that size, and they don't
have as you know, I think that everyone.

Speaker 5 (06:25):
Understands they don't have quite as much information.

Speaker 6 (06:27):
They might be a little cheaper, but yeah, basically there
are those two in the field.

Speaker 2 (06:33):
Okay. So now one of the big, the big companies,
twenty three and Me has filed for bankruptcy, and the
article that I read said that a lot of people
are a little freaked out about this because obviously, when
twenty three and Me files for bankruptcy, there are some
elements of that company that have some value, and they

(06:58):
have to dispose of that material, whether it's, you know,
desks and chairs, or whether it's information that might
be of some value to companies. So it's theoretical
that people could have this information, which they confided to
twenty three and Me, which now is going to

(07:18):
be bartered or, you know, sold to another company,
and that's kind of freaking people out, rightfully.

Speaker 6 (07:24):
So yeah, I think so, you know, I think of
this a little bit like if you you know, gave
your credit card information to a company and that company
got bought by another company, and now you know, you're
not necessarily worried that they're going to run your credit card,

(07:45):
but you never really know. And in this case, instead
of a credit card which you can cancel, which you
can change the number and you know, block and everything,
your DNA is not going to change. And so anybody
who has that has it forever, and they can't necessarily just

(08:06):
you know, do a scan and learn everything in the
world about you. It's it's more complicated than that. But
they do have a lot of information that they can
use in you know, a couple of different ways that
I think people should know about, and people I think
are rightfully concerned that they're not going to know what
happens with that information. I mean, I think that's really

(08:28):
what freaks people out, is is that uncertainty.

Speaker 2 (08:30):
Yeah, I want to talk about process, what people have
done and if they have Anyone who asks questions, by
the way, can join the conversation. But I'm going to
ask you a few questions, so don't assume I'm going
to ask the right questions. Ladies and gentlemen. Many of
you out there know more about this than I do,
but probably none of you know as much about it

(08:51):
as my guest does. So feel free to line up,
get some questions in and I have some questions when
we come back as well. Six one seven, two, five,
four, ten thirty; six one seven, nine, three, one, ten thirty.
Those are the two on-air lines.
We'll get you on Nightside, and I'm not sure
what we're gonna do later on tonight. So if this
interests you, feel free, whatever you'd like to ask. As I say,

(09:14):
I told you many times, there's no such thing as
a dumb question. I learned that in law school because
that's the question you don't have the courage to ask
the professor. And that's the question that comes up on
the midterm or on the final exam. So with that
admonition: six one seven, two, five, four, ten thirty; six one seven,
nine, three, one, ten thirty. We'll be right back on

(09:35):
Nightside with Dan Ray here on Boston's WBZ. And
it's more than Boston's WBZ. We have listeners right now,
with this hour, all over the eastern half of
Canada and the eastern half of the United States. Back
on Nightside right after this.

Speaker 1 (09:51):
Nightside with Dan Ray on WBZ, Boston's News Radio.

Speaker 2 (09:58):
With me is Jason Kelley. He's the activism director
for the Electronic Frontier Foundation. Now, don't get,
you know, concerned about this, but if you've ever been
a patient, if you've ever been a client of twenty three
and Me, I think you've got to listen real carefully.

(10:18):
What advice, Jason, would you give to anyone who was
a client of twenty three and Me and submitted their DNA
sample? And this might be five years ago, ten years ago;
they've been around for a long time. What should
they be concerned about, and what can they do?
At this point the company is in bankruptcy; it's eventually going

(10:41):
to disappear. What can be done? What can be done?

Speaker 5 (10:46):
Well?

Speaker 6 (10:46):
I think the biggest thing people need to know is
that they can ask the company pretty quickly to delete
the data that they've collected. It's not a super long process.
It's not something that's going to take more than you know,
ten twenty minutes. And that really, I think, for a
lot of people is going to give them peace of mind.

(11:07):
It's just going to make them feel a little bit
better about the fact that this company could sell that
data to pretty much anybody, and we don't have a
choice in where it goes. And that process is fairly simple.
You can log into the site, you go to the settings,
you go to your data, and you basically just say
can you delete the data there? There's an option for it.

(11:29):
It might be a little complicated. They have to email
you and you have to email them back. You know,
if you're like me, you might have lost the password,
maybe you haven't logged into it since you did it,
so you know, it's not going to necessarily be the
easiest thing in the world.

Speaker 5 (11:43):
But they do allow.

Speaker 6 (11:44):
You to do that, and that can delete both the
data that they have that's on their, let's say, their
back end, in their system. And then you can
also request that they destroy the genetic data if they've
got that on hand as well, so you can.

Speaker 5 (12:00):
Make those requests.

Speaker 6 (12:01):
And I would recommend it for most people who are
concerned at all about this, because otherwise it's just going
to sort of eat away at.

Speaker 5 (12:10):
You, and I think it should.

Speaker 6 (12:13):
You know, a company buying this data can more or
less know whether you have certain diseases. They could put
together a list of connections with your relatives. And I
don't know exactly what they're going to do with that information,
but I don't like someone having it when I didn't
intend them to, and there's just sort of an unease

(12:36):
that comes with that. And I think most people are
going to feel better if they go online, go to
the account, log in, go to the settings, and just
choose to delete my data, and you can download the data
before you do that. That'll give you a little bit
of information. You won't be able to use the site
anymore after you request that they delete it, but you
will have a little bit of that information. And some

(12:58):
of it you can download is going.

Speaker 5 (12:59):
To be gibberish.

Speaker 6 (13:00):
It's the DNA data results that are going to look
like computer code to most people, including me. But in
the case of another service popping up that can use
that information in the future that you feel comfortable giving
it to, you could reuse it if you need to.
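
For readers wondering what that downloaded file actually looks like, here is a minimal sketch in Python, assuming the raw export is the tab-separated genotype listing these services have historically provided (a few commented header lines, then rsid, chromosome, position, and genotype columns). The filename and exact layout are illustrative assumptions, not a guaranteed format.

```python
# Minimal sketch: peek inside a raw DNA export of the kind described above.
# Assumed layout (illustrative): "#" comment header, then tab-separated
# columns rsid, chromosome, position, genotype.
from collections import Counter

def summarize_raw_export(path: str, preview: int = 5) -> None:
    per_chromosome = Counter()
    shown = 0
    with open(path, encoding="utf-8") as handle:
        for line in handle:
            if line.startswith("#") or not line.strip():
                continue  # skip commented header lines and blanks
            fields = line.rstrip("\n").split("\t")
            if len(fields) < 4:
                continue  # ignore rows that do not match the assumed layout
            rsid, chromosome, position, genotype = fields[:4]
            per_chromosome[chromosome] += 1
            if shown < preview:
                print(f"{rsid}\tchr{chromosome}\tpos {position}\t{genotype}")
                shown += 1
    print("markers per chromosome:", dict(per_chromosome))

# Hypothetical usage (the filename is an assumption):
# summarize_raw_export("genome_raw_data.txt")
```

The point is simply that the export is a long list of individual markers; it looks like gibberish until something downstream interprets it.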

Speaker 2 (13:16):
That is all really great advice. One of the things
that has been controversial, and we can get into this
a little bit more in depth, is that police agencies
on occasion have been able to use this type of
data to sort of almost triangulate someone within your family

(13:43):
who might have been guilty of a crime. And explain
how that works, because, if you can, explain it in a
way that people will understand and appreciate. Give
me an idea how that works.

Speaker 6 (13:57):
So let's say you've got a crime scene with a
coffee cup on it, and that coffee cup has you know,
somebody's DNA because they took a drink. Yep, the police
can swab that. They can run it through basically the
same kinds of systems that twenty three and Me uses,
and, you know, the way it looks when they

(14:17):
get the data out on the other end is that
it's basically a series of letters and numbers. And
if they do everything right, and the software is not
always perfect, but if they do everything right, they can
ask twenty three and Me to run a
DNA sample through their database. They could do that with

(14:39):
any of these companies, and some of them require warrants
and some of them don't.

Speaker 2 (14:42):
And the DNA sample they would be running
through would be the DNA sample that, in this hypothetical,
they lifted from a coffee cup at a crime scene,
under the belief that perhaps it's the DNA of the
person who committed the crime that's on the coffee cup.

(15:04):
And what happens next?

Speaker 6 (15:07):
So they get a match, and that match isn't necessarily
the exact person that was responsible for leaving the DNA behind.
But if you're related to someone who was at that
crime scene and put that DNA on that coffee cup
and you put your information into twenty three and Me,

(15:28):
you will come up as the match. So they will
look at these two comparisons and say, well, this person
isn't the exact person, but they have a basically a
you know, one in four connection to that person, which
means that it's probably their cousin or probably their you know,
sibling or something like that. And if more than one

(15:50):
person has put that data into the database, they, like
you said, triangulate it and they can pretty quickly, you know,
using all the other police skills.

Speaker 5 (15:59):
That they have, and you're out.

Speaker 6 (16:00):
Okay, well who lived there at.

Speaker 5 (16:02):
The time exactly?

Speaker 6 (16:03):
They can get more information to figure out we think
it's this person, not necessarily you, but someone you're related to.
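
As a rough illustration of the arithmetic behind that kind of match: investigators compare how much DNA the database profile shares with the crime-scene sample against the fractions expected for common relationships. The figures and the helper below are approximate, illustrative assumptions sketched in Python, not the software any lab actually uses (real tools work in centimorgans and ranges).

```python
# Rough illustration: map an observed shared-DNA fraction to plausible relationships.
# Expected-sharing values are approximate population-genetics rules of thumb.
EXPECTED_SHARE = {
    "identical twin": 1.0,
    "parent/child": 0.5,
    "full sibling": 0.5,
    "grandparent, aunt/uncle, or half sibling": 0.25,
    "first cousin": 0.125,
    "second cousin": 0.03125,
}

def likely_relationships(observed_share: float, tolerance: float = 0.05) -> list[str]:
    """Return relationship labels whose expected sharing is close to the observed value."""
    return [
        label
        for label, expected in EXPECTED_SHARE.items()
        if abs(expected - observed_share) <= tolerance
    ]

# A profile sharing roughly a quarter of its DNA with the crime-scene sample:
print(likely_relationships(0.25))
```

Combine a hit like that with ordinary investigative work, who lived where and when, and the circle narrows quickly, which is the "triangulation" described above.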

Speaker 2 (16:11):
Well, the police can show up at your
house with or without a warrant and they can say, gee,
we'd like to have a conversation with you. Most people,
most people are going to say, oh, sure, how can
I help you? Now, some are going to say, do
you have a warrant? I don't have to talk to you,
or whatever. But that's only going to raise the suspicion

(16:32):
of the police.

Speaker 6 (16:34):
Maybe you know too much about the cousin and yeah.

Speaker 2 (16:37):
So in other words, so now they're going to say, gee,
do you have any relatives, you know, cousins or uncles
or aunts or whatever, who happen to live in, pick
a city, Phoenix, Arizona. Uh, well, yeah, my cousin Herbie
lives in the Phoenix area. Really, how long has Herbie been there? Oh,

(16:58):
he moved out there in ninety, and we haven't
seen him since. Oh really? Okay. Now they may be saying, hmmm,
this is where this crime occurred. They may not
even tell you that. Okay. They just might at that
point say thank you very much, And they might then
be able to solve that case by spotting Herbie walking

(17:21):
down the street with another cup of coffee, and when
Herbie throws that cup of coffee away, they recover it and
they have a DNA match. Or they may put you
in the possibly very uncomfortable position of saying, hey, we
believe that you know a relative of yours may have
committed this horrific crime and would like to ask you
some questions about your family tree. I mean think about it.

Speaker 6 (17:44):
Yeah, yeah, yeah, anybody could be in this situation,
even if you've never committed a crime. It's very possible.
The data from research has shown that, well, I mentioned
one of the smaller services, the GEDmatch service, has about
one million users. And just because of how similar you

(18:05):
know people are and how related people are, if.

Speaker 4 (18:08):
You go back a little ways.

Speaker 6 (18:11):
If somebody found your DNA on a coffee cup and
researchers had access to GEDmatch's service, they could pretty
quickly find who you were, even out
of one million. Now twenty three and Me has fifteen million,
so everybody probably has a relative in this database. That's

(18:31):
one of the things to worry about, is that even
if you've never used it, you know, you might
want to check around with your relatives and
see who has, because even if you are not in it,
you kind of are, you know. I mean, most
of us probably are at some level based on a relative.
So yeah, it's it affects everybody, even if you know
you've never used it, and even if you've never committed

(18:53):
a crime, because they might still come to your door.
And also they might get it wrong. You know, just
because your DNA is on that coffee cup, or someone
else's is, doesn't necessarily mean that that's the person who
committed the crime.

Speaker 2 (19:05):
My guest is Jason Kelley. Jason's an activism director for
the Electronic Frontier Foundation, and we're talking about, if your
DNA has been voluntarily submitted to twenty three and Me,
that company has filed for bankruptcy. Feel free to
join the conversation. Any questions you like. My name's Dan Ray.
This is Nightside. We'll get some callers coming up on

(19:26):
the other side. Six one seven, two, five, four, ten
thirty; six one seven, nine, three, one, ten thirty. We'll
get to phone calls, more questions for my guest, Jason Kelley.

Speaker 1 (19:36):
Right after this, you're on Nightside with Dan Ray,
on WBZ, Boston's News Radio.

Speaker 2 (19:45):
All right, thank you very much, Dan Watkins. My guest
is Jason Kelley. He's an activism director with the Electronic
Frontier Foundation. The point of this hour tonight is to
make you aware that when you submit information about yourself,
DNA information, to private companies, I'm sure you sign a

(20:08):
lot of documents and I'm sure that you give them
permission to maintain the file, and you don't read the
fine print, and you're interested in finding out, gee, am
I twenty percent German? Am I fifteen percent Kenyan? What
is my background? Do I have Spanish? You know, that's
the way that they get you to do it. And of
course you pay money to have this service provided. And

(20:31):
now with this company in bankruptcy, it's possible. Now Jason
has been kind enough to explain what you might be
able to do to eliminate or to get rid of
some of the information that you have voluntarily provided. However,

(20:52):
you may have some questions, and so what we're going
to do is we're going to go to phone calls.
We've laid it out for you as clearly as we can.
You know, we live in an age where all of
us give out information and all of us get bombarded
with phone calls, get bombarded with spam emails, and more.

(21:18):
And I'm guilty. I'm on Facebook, I'm on Twitter,
I'm on Instagram, because that's my business, to communicate
with people. But I probably would be doing most of
those anyway. But it puts us, we have, you have
basically walked out into the middle of the town square

(21:40):
and stripped yourself naked and said here I am, here's
everything you need to know about me. Jason. How far
off is that analogy?

Speaker 6 (21:51):
I think that's about one hundred percent correct. Unless someone
lives in the woods and doesn't have a phone or
a computer, I think that's pretty much where most of
us are at. And and it's you know, whether it's
DNA or your location because you carry your phone around,
or the websites you visit, you know, all this information

(22:12):
can be collected and and what's worrisome for most people,
I think is it's not just that one company knows
you know, Google knows where your phone was, or Apple
knows where your phone was, and then another company knows
where you know, where you went online, what kind of
Facebook pages you like, and then another company has your
DNA information and might know what kind of diseases you

(22:34):
are predisposed to. What worries people, and should, is that
that information often gets collected by data brokers and put
into a big profile. And so, you know, not
everyone has access to that, but that's how it all
happens, so that they can then do what you said.
They can send you ads, they can you know, make

(22:56):
phone calls things like that to try to get you
to buy things, and in worse cases, you know, depending
on what's in that information, they could even blackmail you.
I mean, that's one of the concerns that absolutely
I would have, you know, if a company bought my
DNA data and it said I had some predisposition for
a disease and I was a politician or something. Yeah,

(23:19):
you know, there's lots of things you could do with
that information to make money that aren't just advertising. Obviously,
there are laws theoretically protecting you from that, but in the
case of a bankruptcy or in the case of a
data breach, which twenty three and Me has had in
the past, you never know what's going to happen. So people,
I think, who are you know, worried about this absolutely

(23:40):
should delete their data.

Speaker 5 (23:41):
They don't need to.

Speaker 6 (23:42):
I don't think they need to run screaming into the
woods quite yet, but they probably should take more precautions.

Speaker 2 (23:48):
All right, let's see what people have to say. Let
me go first off to Scott in Quincy, Massachusetts. Scott, welcome,
you're first this hour on Nightside with my guest,
Jason Kelley. Go right ahead, Scott.

Speaker 7 (23:57):
Well, great, great show, great guest. So I work
at an academic medical center, and I have my DNA
in two different research banks, so I'm not that
squeamish about it. And two questions. One is about

(24:19):
the fate of the database? And the second one is
I'd like to ask Jason about the EFF. So I
think that the database should go to the people in
the FBI who maintain the Singapore database, or the FBI's database,
so that it could be used for crime fighting.

(24:43):
What do you guys think about that?

Speaker 2 (24:44):
By the way, are you moving around, like, in a
tunnel or something, Scott? What's going on in
the background? Are you okay?

Speaker 7 (24:53):
I get the phone up against my head and I'm
walking from the hospital past North Station.

Speaker 5 (24:58):
No problem.

Speaker 2 (25:00):
I just want to make sure you weren't getting mugged or
something like that. That's all, no problem, okay. So you
would like to see this database given over to
the FBI or police authorities. I don't think Jason's
going to buy into that, but let's see what he
has to say. Jason, go right ahead.

Speaker 6 (25:20):
Well, I get where you're coming from, I really do.
You know, they caught the Golden State Killer using data
from these databases, and I have trouble saying that's a
bad thing. But on the other hand, you know, I
have a twenty three and Me account. I didn't necessarily say, yeah,
I'm happy with, you know, the government doing whatever it

(25:43):
wants to do with my DNA. And just to give
you an example, you know, right now there are laws
that protect you from your insurance company denying you coverage
based on your genetic information. But those laws aren't necessarily
going to be on the books forever. And there are
things I think could be dangerous about giving the government

(26:03):
that data. Now, if you want to give it to them,
if you want to send it over to them, you
know you can. I think people can do that if
they think it might help crime fighting. But I think
most people, at the very least should be given the
option to say, yeah, that's fine, you can have that
or not, because the government has a lot of authority,
it has a lot of power, and it's just it's

(26:25):
a little worrisome I think to know that they would
have you know that.

Speaker 5 (26:30):
Much detailed information.

Speaker 6 (26:31):
About you, because you might like who's in charge of
the government one year, you might not like who's in
charge of the next year, and you can never quite
tell what's going to change. So I get where you're
coming from, Scott, I really do. But you know, for me,
I think it's it's just a question of consent. Just
like if you want the government to have other information
about you, you know, usually you have to give it over.

(26:54):
This is something you don't have to give over, and
if you want to, fantastic. But I think people who
you know, submitted their data should be given the option
at least to say yeah, that's fine with me. And
regardless, you know, with a warrant, usually these companies respond
perfectly well to them, so they can use the databases
even though they don't own them right now, that would

(27:16):
include the FBI, So I think it's a it's a
good point, but you know me, personally, I would rather
fewer people have that information and have more control over
it wherever it is.

Speaker 7 (27:30):
Good, good. And the other thing I wanted to ask
about EFF is, and maybe Dan could connect us off
the air through the producer. So I'm a medical equipment repairer,
and I have a lot of trouble with these big
companies refusing to provide me technical information or sell me
parts to repair my hospital's medical equipment, even when patients

(27:55):
need it to be repaired quickly. It's a big problem. Is
that on the EFF radar screen? And could I connect
with you over that topic?

Speaker 6 (28:05):
Yeah, that absolutely is. We call those laws that would
make it easier for people to repair their devices, to
get technical support, to get like detailed device brochures and
guides about repairing, we call those right to repair laws,
and, unfortunately, due to a bunch of different things,

(28:25):
but mostly due to something called the Digital Millennium Copyright
Act that passed a few decades ago, any tool that
has a computer in it, companies have a lot more
power to stop you from repairing it, whether that's a
phone or a tractor or medical equipment. And every few
years there's basically a big meeting where the government and

(28:50):
a bunch of different groups come together, and EFF goes
to those meetings to push for more devices to be
exempted from those laws. So right now you know there's
a number of devices that are exempted from those laws.
I'm not sure exactly how medical devices.

Speaker 5 (29:07):
Fit into that.

Speaker 6 (29:07):
But I can say that this is the first year
ever that all fifty states have right to repair laws
in discussion. So they've been introduced in all fifty states.
They don't all cover everything. Some of them cover tractors only,
some of them cover video game consoles, and it's a

(29:30):
movement that's growing. I highly recommend if you're interested in
that kind of thing. And this is something that affects everyone.
It affects the environment, it affects how costly everything is.
You can go to repair dot org. That's the Repair
Association that helps move this forward through the laws.
They're doing some great work and most likely in Massachusetts

(29:54):
there is either a law already on the books or
one being put forward that would improve right to repair
options for people.

Speaker 2 (30:00):
So Jason, Jason, Jason, Hey to do this? Yeah, I'm
gonna come up on a commercial break here. How about this?
Can I have Scott leave his phone number with my
producer Rob and when we finish at eleven o'clock, would
you be kind enough to maybe give Scott a call
at some point. I'd rather give him your number than
us give him. I'd rather give you his number than

(30:22):
us give him your number. If you know what I'm
saying is.

Speaker 5 (30:24):
That okay, Yeah, no problem, absolutely, Scott.

Speaker 2 (30:26):
You hang on there. Rob will take your number and
Jason will be back in touch with you. We'll take a very
quick break. Want to get to more phone calls: six one seven, two, five, four, ten thirty; six one seven,
nine, three, one, ten thirty. Coming right back on Nightside. Matt and
Chris are coming up. Got a little bit of room
for you if you want to try to sneak in,
feel free. Coming back on night Side.

Speaker 1 (30:44):
It's Nightside with Dan Ray on WBZ, Boston's News Radio.

Speaker 2 (30:50):
We're talking with Jason Kelley. Jason is with the Electronic
Frontier Foundation, and we're talking about privacy, the
privacy issues. But the issue here is that twenty three
and Me has gone bankrupt, and the material that you
might have submitted to them, folks, could be part of
the bankruptcy proceeding. You have an opportunity to delete that

(31:12):
information if you'd like. Let me go... Matt is up next. Matt,
you're next on Nightside with my guest Jason Kelley.

Speaker 5 (31:17):
Go ahead, Matt, Hey, how y'all doing.

Speaker 8 (31:22):
So, less of a question and more of a kind
of a generalized, you know... I really liked how everything
was laid out and how you all explained it. You know, basically,
you take one of these tests and they have your information,
and that's, you know, to me, a deterrent. But I

(31:45):
feel these companies, twenty three and Me, one of the
things they highlight is like colorectal, prostate, and
some of those things that are in the family, which
they probably can get from the DNA, and they're promoting
that as a way to find a link to where,
I guess, your DNA and your family's DNA could be linked together.

Speaker 5 (32:11):
However, I don't think.

Speaker 2 (32:13):
That's true, Matt. I mean, I think that what it
comes down to is that if they have your DNA,
they can run your DNA through computer systems,
and I don't think it's got anything to do
with diseases. There's probably a million people in their system
who've had colorectal cancer, and it's not gonna, you're
not gonna be related to a million people. But if

(32:34):
they find connections with five people and five of them
have a proclivity or a medical history, that might be helpful. Yeah,
I get that, But where are you going with this?
Because I've got one other call which I want
to try to get to.

Speaker 8 (32:48):
Yeah, no, no, no, absolutely. Here's where I'm
going with it. I was just on their website and
this is one of the services they were offering that
they could relate some traits to. So where I'm going
with it is, isn't there some sort of law, I
mean where they cannot just pull that information your DNA

(33:10):
off of there, or is there anything, warrant-wise, you
know, Fourth Amendment, that is protecting that?

Speaker 2 (33:19):
Let's get Jason to respond to that question. Go ahead, Jason.

Speaker 6 (33:23):
That's right, I'm not a lawyer. I mean there are laws; they're
different in every state. The genetic privacy laws that are
across the whole country don't really apply in this scenario
because you have voluntarily given your data over to the company.
So depending on which state you're in, you know, a
warrant has to come because twenty three and ME says

(33:46):
that they will only respond to warrants to let police
sort of go through the database. But a different company
might change that policy, and if they sell the data
to a different company, they could theoretically change that policy
as well, and they're not supposed to do that. The
FTC has said that when you, you know, go
bankrupt or you're acquired, you should hold on to the

(34:08):
same privacy policy.

Speaker 5 (34:10):
But the FTC has to enforce that law.

Speaker 6 (34:13):
So you know, there are different genetic privacy laws on
the books. They cover different things. In the case of
the data that you've given to twenty three and Me,
you do have some protections, but I'll just say there
are not as many as you would probably think, and
in the case of an acquisition, they're fewer than you
would like.

Speaker 2 (34:30):
Let me make sure that we understand, Matt. Have you
submitted your DNA to twenty three?

Speaker 4 (34:34):
In me?

Speaker 5 (34:34):
I would never submit mine.

Speaker 2 (34:37):
Okay, So you're okay, Matt. I want to try to
get one more.

Speaker 5 (34:42):
I appreciate it.

Speaker 2 (34:43):
I thank you. I appreciate it.

Speaker 8 (34:45):
You should be going back, all right.

Speaker 2 (34:47):
Thanks. Let me go to Chris. Chris, down on the Cape,
got you in under the wire?

Speaker 1 (34:50):
Here?

Speaker 2 (34:50):
Go ahead, you're on with Jason Kelley.

Speaker 4 (34:54):
Hi. Good night, I mean good evening.

Speaker 2 (34:56):
That's all right, don't say good night yet, go ahead.

Speaker 4 (34:58):
Chris, not good night yet? Right. I've always advised people
against coughing up their DNA unless they're forced to. In
Massachusetts, if you're a serious felony defendant, you
end up coughing it up. But you should always oppose
that because it's not only your DNA, as I understand

(35:21):
it to be, but perpetually for all of your successors.
So if you have a criminal descendant, they're going to
be stuck with your DNA and they're going to be connected.
I also understood that with ancestry dot Com when you

(35:42):
voluntarily, and this other corporation, the one too, whatever it

Speaker 3 (35:47):
Is, that yeah, that you cough up your DNA to them,
And so many people are excited in family members that
I've advised against doing this.

Speaker 4 (35:59):
Oh that's great. We will be able to put our
whole ancestry together. And isn't that a good thing to know? Well,
it turns out that in that fine print paperwork that
Dan referred to earlier, that gets sold to, like, China,
and they find out about what your particular profile is

(36:21):
very personal, medical and otherwise, and then they can either market
it or sell it.

Speaker 2 (36:28):
What are you, a lawyer?

Speaker 7 (36:32):
Yeah?

Speaker 4 (36:32):
I am.

Speaker 2 (36:32):
I suspect you are. Do I know you? I have
a friend.

Speaker 4 (36:37):
I always oppose, you know what, I always oppose any motions
of the Commonwealth.

Speaker 2 (36:43):
I understand that. So therefore you're a defense lawyer. So
your last name doesn't begin with the letter M, does it?
I'm trying to... I know a couple of lawyers down
there on the Cape.

Speaker 4 (36:53):
No, okay, okay. Since you asked, no problem: it rhymes with rain.

Speaker 2 (36:59):
Okay?

Speaker 4 (37:00):
Uh?

Speaker 8 (37:00):
You know?

Speaker 2 (37:01):
If I know you, tell me. If not, no.

Speaker 4 (37:04):
I was involved in a, in a cough up of
DNA, in that very famous Cape murder case in
the Outer Cape, where the district attorney at
the time, good guy, good DA, Michael, ah, did a roundup
of people that happened to live in Provincetown, Truro,

(37:24):
Wellfleet or Eastham to cough up their DNA at
various places, and the wives and girlfriends saying, well,
why wouldn't you give up your DNA if you're not guilty?

Speaker 2 (37:38):
No, I get it, I get it, Chris. Well, I'm
flat out of time. I wish you'd called earlier. You
would be a really interesting call. Thanks for checking in,
Thanks for listening, but I got to get out of
the way for the eleven o'clock newscast and thank my guest.
Talk again, my friend. Keep in touch. Okay, all right,
thank you, Chris. Jason, thank you very much. Tell us, how
can people get more information on your organization, the Electronic

(38:01):
Frontier Foundation.

Speaker 6 (38:03):
Well, they can go to EFF dot org and I
put up a quick link. If they go to e
FF dot org slash twenty three and Me, we've got
some instructions there for people to delete their data. Just
follow the link there. But yeah, we you know, we're
a nonprofit organization. We're members supported. So if people are

(38:23):
interested in learning more, just go to e FF dot
org and you'll learn all the things that are happening
in the law right now and just in general with
technology and your digital rights.

Speaker 2 (38:32):
And as they say in church, if you like what
they do, be as generous as your means permit. Jason,
I really enjoyed this conversation. I appreciate the time you
spent with us tonight, and perhaps we'll do it again.

Speaker 5 (38:42):
Okay, thanks so much, Dan, great, great talking to you.

Speaker 2 (38:46):
Very welcome. Right back at you. Here comes the eleven
o'clock News. Yeah, we're going to talk about what happened
again today on Wall Street. Trust me on that. Maybe
you're ready to panic. I don't know, I'm still not.
Let's talk about it after the eleven o'clock news.