Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
Kia ora. I'm Chelsea Daniels, and this is The Front Page,
a daily podcast presented by the New Zealand Herald. Sending
your DNA to a website to find out your background
has become a trendy thing to do. But is there
a hidden cost to this? DNA testing company twenty three
(00:28):
and meters is in a financial crisis, raising questions about
what will happen to the data it holds on fifteen
million customers worldwide. New Zealanders are concerned their genetic information
could be on sold and used for other purposes, including
by insurance companies or law enforcement. Today on the Front Page,
(00:49):
privacy lawyer Rick Shera joins us to discuss. Can you tell us how these DNA services work? What does it entail?
Speaker 2 (01:01):
What happens is that usually you spit in a little tube and you send it off to them, and then they run their DNA sequencing over it to extract the DNA sequence from it, and then they store that DNA sequence as data on their systems. I think sometimes they keep the sample itself, sometimes they get rid of it,
(01:21):
depending on whether they want to keep it for future use or whatever they want to do with it. But generally, it's not so much the physical sample that I'm focused on; it's more the DNA sequence, the data that is generated from that and stored by that organization.
What they then do is make it available to other organizations for those organizations to use. They would, in theory,
(01:45):
anonymize all of the DNA so that you have a DNA sequence that isn't necessarily labeled with Chelsea Daniels or Rick Shera or anything else, and they put it together with thousands of others and provide that to a research organization or a drug company or something like that, who might be using it to say, well, we want to assess whether certain sequences
(02:09):
might give rise to certain characteristics of a person, and therefore we can design drugs, or we can predict disease, and so on and so forth. So there's a lot of research that goes on around it. There's also been potential use by law enforcement agencies of some of this material, not necessarily because the organizations give it to law enforcement agencies voluntarily,
(02:30):
but the law enforcement agency might come along and say, look, we understand that you've got DNA for a person; we're interested in them, potentially, as a subject of an investigation, or indeed to match it up against another person's DNA we have, to determine whether in fact that is the person that we're interested in. So there's a number of different uses which are being made at the moment of
(02:52):
this sort of material.
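To make that matching idea concrete, here is a minimal Python sketch of comparing a query DNA profile against stored profiles by counting markers on which the genotypes agree. Everything in it is invented for illustration (the subject IDs, the profiles, the scoring rule); real forensic matching uses far larger marker panels and proper likelihood statistics.

    def match_score(query, candidate):
        # Fraction of markers present in both profiles whose genotypes agree.
        shared = set(query) & set(candidate)
        if not shared:
            return 0.0
        agree = sum(query[marker] == candidate[marker] for marker in shared)
        return agree / len(shared)

    # Hypothetical pseudonymised database: subject IDs, not names.
    database = {
        "subject_12576": {"rs53576": "AG", "rs1815739": "CT", "rs429358": "TT"},
        "subject_98431": {"rs53576": "AA", "rs1815739": "CC", "rs429358": "CT"},
    }

    # The profile an agency wants to check against the database.
    query_profile = {"rs53576": "AG", "rs1815739": "CT", "rs429358": "TT"}

    for subject_id, profile in database.items():
        print(subject_id, match_score(query_profile, profile))
    # subject_12576 scores 1.0 here, so it would be flagged for follow-up.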
Speaker 3 (02:54):
I did that too.
Speaker 4 (02:55):
I sent my spit in to one of those websites, and it turns out I'm sixteen percent Irish and ten percent Hungarian, and the rest is Italian.
Speaker 3 (03:02):
That's me.
Speaker 4 (03:03):
I did it, and I found out, guys, I found
out I'm seventy five percent French.
Speaker 2 (03:08):
I don't even speak French.
Speaker 3 (03:09):
Well, oh, I sent mine in and I found out that I am, let's see here, ninety two percent dumbass for sending a DNA sample to a random website that now has full copyright and unregulated access to my genetic code for the rest of time.
Speaker 1 (03:28):
So we're talking about sites like Ancestry.com, 23andMe, Family Tree, and that kind of thing.
Speaker 2 (03:35):
Right, yeah, that's right. And I mean, you know, there's a huge business worldwide obviously in genealogy, and this is part of that in some senses, because the business model for these organisations, these companies, is to get your information in, and then they monetize it by making it available to others at a cost. You pay a little bit. I can't remember
(03:57):
what people pay to actually do these tests, but it's not very much, and that's not where they make a lot of their money. They make their money out of making it available to other organizations. But from the individual's point of view, in giving it to these organizations, you know, obviously genealogy is really interesting. You know, you might find another family member, or you might find some characteristics of your DNA that you weren't aware of,
(04:17):
so it's very interesting. I can well understand why people do it. That's why there are TV programs about people trying to find their origins, and they always rate.
Speaker 1 (04:25):
Quite highly. Now, it seems like a fun thing to do, hey? Spit in a tube, send it off, and find out a few weeks later you're sixty three percent Scottish. But when you send your DNA in to one of these services, what kind of privacy rights are you rescinding?
Speaker 2 (04:41):
When you do this, you agree to their terms of use and to their privacy policies. By and large, these organizations are based in the United States, and so whenever you agree to give any information to anybody, whether it's using Facebook or going online or whatever, you are signing
(05:03):
up to privacy policies. I guess that's the first gateway that you go through, the privacy policy. Then there are laws which might apply over and above the privacy policy in the particular jurisdiction or country where the organization is based. So in these instances you've got a privacy policy, which we can read. Unfortunately, as a person who drafts these privacy policies, I have a great deal of
(05:24):
sympathy for people who think they want to try and read them, because they're good bedtime reading, but only for lawyers. So you know, reading thirty five pages of a privacy policy to try and discern exactly what the organization is going to do with my personal information is not something that I can recommend to anyone, let alone non-lawyers. So there's the privacy policy, then there might
(05:46):
be the law, in this case in the United States. Now, the United States, as the name suggests but we tend to forget, is made up of a lot of states. The United States doesn't have an overarching federal privacy law which applies across everybody in the United States; it's the individual states that have passed them. Each law is different, so some of them will cover certain aspects, others won't,
(06:07):
depending on what information has been collected, how many people they are collecting it from, and what they do with it. So it is very much a patchwork
of privacy protection. Obviously, as I said, you've got a privacy policy. The American way of doing privacy policies is very much: you give us consent to do X, Y, Z with the information that you're giving to us. And
(06:28):
so these policies do tend to say we can provide your personal information to other people for these purposes: research, et cetera. Because that database of DNA becomes an asset of the organization. They've created the data out of your DNA. You have rights in respect of your DNA itself, but the data, the DNA sequence that's
(06:51):
created, is not yours. It is data, digital information, which is created by the organization, so they own that. And so the privacy policies will generally say, look, in the event of a sale of our business, we can sell our assets, including that personal information you've given. Now, there may be some controls around that, not necessarily the level
(07:13):
of control that you or I might want to have in respect of very personal information like DNA. 23andMe is on the verge of getting delisted now from the Nasdaq.
Speaker 3 (07:22):
Drama has been unfolding this week. It's only set to ramp up.
Speaker 4 (07:26):
23andMe built a multi-billion-dollar brand off of collecting people's spit. The idea was that people could take control of their health by learning about their genetics, and it took off. But just two years after going public, the company went from being valued at six billion dollars to trading below one dollar. Now, 23andMe is rapidly approaching the November deadline to propose an action
(07:47):
plan or be delisted from the Nasdaq.
Speaker 1 (07:50):
What are the concerns around 23andMe going under, and what happens to the rights of the DNA records that they have on file?
Speaker 2 (07:59):
They haven't gone under yet; they're in financial difficulty. Now, there could be, I guess, two results of that, or three results, I suppose. They could come out of financial difficulty, they could get more investment, and suddenly they take off again, so that would be all good. They could decide, look, our shareholders are not putting any more money into us, our share price is going down, as it is at
(08:20):
the moment, and therefore, in order to act in the best interests of our shareholders, we should sell our business to someone else. And that business will include the assets, which will include the DNA data set. So that could be a scenario. Or, worst-case scenario, the company can't do that and it can no longer continue and it goes bankrupt, and a bankruptcy liquidator, as it would be
(08:41):
in New Zealand, is appointed, and they then sell the assets. Of course, at that stage all bets are off. They would sell to whoever they can get the best price from for those assets, which could be anyone. So in any of those scenarios where the assets are sold, we don't have any control over who might be the purchaser of those assets and what they then might use that information for. Whether they are bound by the privacy policy
(09:04):
that is in place is questionable. Most privacy policies will say, look, we can change this policy whenever we like by telling you, and if you don't want to use the service after that, then stop using it. But if you do continue to use it after we've notified you of the change, then you're bound by the new policy. So it could be that there's a change in that area. Some states in the United States might restrict that, but
(09:26):
as I say, it's a bit of a patchwork in the US as to how that would apply. You know, worst-case scenario, this information could end up with a company that wants to do something completely different with it, wants to target us for some sort of promotional material. You know, worst worst-case scenario, it could end up in a jurisdiction that has no privacy protection whatsoever. You know, it is
(09:49):
a great concern, and I've always said it's a great concern to give these organizations DNA material, because you never quite know what's going to happen to it. I think the real difference I see with DNA, compared with perhaps some other personal information that we might give, is this: if you give someone your credit card number or your driver's license or whatever, it can be highly damaging for that material
(10:09):
to get into the wrong hands, whether it's through a cybersecurity breach, a privacy breach, or because it's sold off to someone that you didn't expect it to be sold to. But at the end of the day, you can change your driver's license, you can change your credit card, you can change your passwords and so on. You can't change your DNA.
Speaker 1 (10:37):
So essentially these records could just be sold to the highest bidder. Is there an opportunity, do you think, because you say that they're encrypted, right? So if I sent off my spit to one of these companies, it would be logged somewhere as, presumably, XYZ, female. Is there a concern that if these were sold on to somebody
(10:58):
else, they could decrypt that information?
Speaker 2 (11:03):
If they're sold to someone else, they will be decrypted, because you can't use the information when it's in an encrypted form. They're not going to buy a blob of ones and zeros where they can't even see what they are. They're going to decrypt it. The decryption key would be sold along with the information to whoever buys it, because otherwise you can't interact with the data. That's not to say that they wouldn't hold it in an encrypted form. But
(11:25):
encryption is not binary, yes it's encrypted, no it's not encrypted. Encryption scales along a spectrum, from not encrypted to industrial strength, you know, it's held in the Pentagon and it's encrypted with 256-bit encryption methods. Having said that, technology and computing power is increasing at such
(11:47):
a rate that what we thought of as very secure methods of encryption or of storage ten years ago would now be looked at as open slather. You know, I read the other day that Chinese researchers managed to use quantum computing to potentially decrypt the strongest encryption that is generally used by sites at the moment, 256-bit AES.
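To illustrate the point about keys travelling with the data, here is a minimal sketch using the Fernet recipe from the Python cryptography package; the symmetric scheme and the record format are assumptions for illustration, since we don't know what any particular company actually uses. Whoever acquires both the ciphertext and the key can read everything.

    from cryptography.fernet import Fernet

    # The company generates and holds a symmetric key.
    key = Fernet.generate_key()
    vault = Fernet(key)

    # What sits on disk is opaque ciphertext.
    record = b"subject_12576: rs53576=AG, rs1815739=CT"
    ciphertext = vault.encrypt(record)

    # In a sale, the key is an asset too; a buyer holding it decrypts freely.
    buyer = Fernet(key)
    print(buyer.decrypt(ciphertext))  # b'subject_12576: ...'

The encryption protects against an outsider who steals only the ciphertext; it does nothing against a purchaser who is handed the key along with the data.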
Speaker 1 (12:11):
So instead of "female, XYZ", it goes on to the new owner, and then they could decrypt it and be like, oh, this is Chelsea Daniels's DNA.
Speaker 2 (12:20):
Yeah, I mean, in theory. There's two things that they do to try and protect people's privacy. One is to encrypt it, and as we've talked about, potentially it can be decrypted. Secondly, they anonymize it. So your DNA sequence wouldn't necessarily be held with a label that says Chelsea Daniels. It would just be held as subject 12576. Now,
(12:41):
the other difficulty that we're now facing with anonymization is the ability of very strong computing power, and the very good matching of information, in particular with artificial intelligence, to match disparate pieces of information together and suddenly come up with: ah, well, person X has a red car, they logged onto a computer in this place at this time,
(13:05):
and they bought something with their credit card. Matching up those three pieces of information, person X must be that person who owns that red car. Now, that's a lot more difficult, obviously, with DNA sequences and so on. But there's clearly a move towards using high-powered computers to de-anonymize people so you can be targeted. So information is getting much more granular in terms of
(13:27):
the ability to pick out a particular piece of information and then link it to re-identify the person.
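Here is a minimal sketch of that linkage attack, with invented records: the pseudonymised table carries no names, but joining it against a second dataset on shared quasi-identifiers (suburb and birth year here, both chosen purely for illustration) re-identifies the subject.

    # Pseudonymised records: a subject ID instead of a name.
    pseudonymised = [
        {"subject": "subject_12576", "suburb": "Ponsonby", "birth_year": 1988},
        {"subject": "subject_98431", "suburb": "Newtown", "birth_year": 1971},
    ]

    # A separate dataset that does carry names (say, a leaked customer list).
    named = [
        {"name": "Chelsea Daniels", "suburb": "Ponsonby", "birth_year": 1988},
    ]

    # Join on the quasi-identifiers: a unique match re-identifies the subject.
    for record in pseudonymised:
        for person in named:
            if (record["suburb"], record["birth_year"]) == (person["suburb"], person["birth_year"]):
                print(record["subject"], "is probably", person["name"])
    # prints: subject_12576 is probably Chelsea Daniels

Real linkage attacks work the same way, just with far more records and fuzzier matching.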
Speaker 1 (13:33):
What would someone want with my DNA, with that information? Is it all nefarious, or is it kind of like a pharmaceutical company wanting a great batch of DNA information to then come up with a cure for cancer or something?
Speaker 2 (13:49):
Obviously, having research to produce cures for cancer using massive DNA data sets is a good thing, no question about it, if it's done in controlled circumstances. But as I said, we don't know who might end up being the purchaser of this information, and it may not be someone who is quite as ethical as we would all want. This is one of the issues always with technology:
(14:10):
you don't quite know where it's all going to end up.
Speaker 1 (14:13):
Another thing is, when you're sending your DNA off to one of these companies, you're not only rescinding your rights, but in some cases your family's rights as well. Right? Like, the Green River Killer was probably not too impressed that a cousin of a cousin sent in one of these DNA kits, because that's how he got caught.
Speaker 2 (14:30):
Right. So, you know, other people can be identified. And I mean, that's the same with any law enforcement issue, where you're often faced with the argument, when you say privacy is very important, you often get a pushback from law enforcement people saying, well, if you've got nothing to hide, why are you worried? Well, privacy is a human right. Our right to be let alone is a right that
(14:52):
is important to all of us. We don't walk down the road without any clothes on, and that's because there are some things that we like to keep private. You know, some people are less private than others, but it's all a matter of choice. There's also, with any technology, always the ability for mistakes to be made: false positives, false identifications, and so on and so forth. So you can
(15:14):
imagine these data sets, with massive amounts of data, having artificial intelligence run across them. Now, artificial intelligence is great, it's doing fantastic things, but it makes mistakes. All it's doing is statistically predicting what you expect to hear from it; with an LLM, a large language model, it's responding in
(15:35):
the way that it thinks you want based on the prompt that you put in. And we don't know how the artificial intelligence algorithms might be applied to these sorts of things. So, you know, to take my health insurance example, we don't know whether the algorithm that might be applied across a huge data set would actually deliver the right result. It may be that it's misinterpreted the
(15:58):
gene sequencing and therefore, instead of giving me a clean bill of health, has said, well, no, the algorithm thinks that you might be more likely, because of your DNA, to have heart disease. That may be completely wrong. So there are all sorts of difficulties. That's not a function of the DNA; it's just a function of how it works, and the way in which these sorts of data sets may well be used in
(16:18):
the future.
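The worry about false positives is really a worry about scale, and a back-of-the-envelope calculation shows why. The error rate below is an assumption for illustration, not a measured figure.

    database_size = 15_000_000   # roughly the customer base mentioned earlier
    false_positive_rate = 1e-5   # assume one wrong flag per 100,000 profiles

    expected_false_flags = database_size * false_positive_rate
    print(expected_false_flags)  # 150.0

Even an algorithm that is wrong about only one profile in a hundred thousand, run across fifteen million profiles, would mislabel around 150 real people.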
Speaker 1 (16:19):
And one last question. If someone from your family turned around and said they wanted to find out who your ancestors were, and they were going to use one of these sites, what would you say to them? Don't? Just straight up, no explanation, just no?
Speaker 2 (16:35):
I've always said that. I mean, it's a choice, right? I think the main thing is for people to understand the potential downsides of doing it and then to balance that against, well, is that potential downside worth it for the buzz that I'm going to get out of finding that I have sixty percent Scottish ancestry? I don't
(16:58):
think it is. There are perhaps other ways of doing that than using a DNA test. And as I've said, the thing that always strikes me as different about DNA, and the giving of DNA to other people, is that you cannot change it. But the other issue, of course, is that any databases that there are in the world are all susceptible to privacy breach, cybersecurity attack, and
(17:18):
any organization that says, oh, you don't need to worry about that, we are fully protected and we have the best security in the world, is absolutely telling porkies, because all of the organizations that have been hacked all said that right at the start. And as I say, with computing power, with artificial intelligence, the people who are obviously at the forefront of using that are criminals today.
Speaker 5 (17:40):
Growing questions after a first-of-its-kind data breach targeting genealogy site 23andMe. 23andMe confirmed profile information was taken that includes usernames, passwords, gender, photos, relatives in common, and the percentage of DNA you share with them.
Speaker 2 (17:58):
So we can expect, and we're all experiencing, much more sophisticated attacks on people's systems from hackers and criminals. And what better pot of gold than a huge database of DNA?
Speaker 1 (18:14):
Thanks for joining us, Rick. That's it for this episode of The Front Page. You can read more about today's stories and extensive news coverage at nzherald.co.nz. The Front Page is produced by Ethan Sills. Dan Goodwin is the sound engineer. I'm Chelsea Daniels. Subscribe
(18:38):
to The Front Page on iHeartRadio or wherever you get your podcasts, and tune in tomorrow for another look behind the headlines.