
July 10, 2019 38 mins

While the new age of collecting and sharing genetic data is transforming our understanding of who we are, where we come from, and the health risks we might face, it also raises real concerns about privacy and security. But it’s not just genetic data that has the potential to be used in ways that we didn’t intend; it’s all data. In this episode, we’ll talk candidly about the current state of data privacy—who has access to your data, the things they’re learning about you (and others like you) from it, and why it's critical for you to educate yourself and read terms of service. Please note: the views and opinions shared in this podcast are those of the individual participants and do not necessarily reflect the views and opinions of 23andMe or their affiliates and partners.



Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
This is Baratunde with a very special episode of
Spit, for you. Normally, with this show, I sit down
with a musician who you know and love, and the
scientist who you probably don't know but will definitely love a
little bit more by the end of our time together,
and we take this approach to find the story in

(00:21):
the science of DNA. We have explored race, we have
explored adoption and sperm donation, but this show has a
different format. It's still me and two people, but we
are going back a little farther to the foundation of
what genetic testing even involves, to the data, and to

(00:41):
the big questions around what it means to spit in
a tube and share your data with other people with
a company, what it means for all of us to
live in this very connected world where our information is
a necessary input to get all the conveniences we have
become attached to. We're gonna talk about data. We're gonna
talk about privacy, We're gonna talk about permission and controls

(01:04):
in this hyper connected world we're living in. And we're
gonna do that because I think it's important and the
folks who I make this show with at iHeart
and twenty three and Me gave me the keys. I
could say whatever I want, y'all. In the past few years,
I have started to question how the system is set up.
Do I really need to disclose as much about myself

(01:26):
as I have been in order to live a quality life.
At the exact time that I was approached to host
this show, I was really coming into my own as
a critic of technology. So I couldn't just say yes
to a show sponsored by a genetic testing company. I
had to do my research, and I read the terms
of service myself from front to back, which were very readable.

(01:51):
To my surprise, I had some privacy oriented activist friends
take a look and they said, you know, this is
pretty good. In fact, I demanded to speak with the
highest level people at twenty three and Me before I
would agree to do the show, because I wanted to
understand what they were doing to protect the information about
me that they had, and to use it in a
responsible way that I would always feel like I had

(02:12):
some choice and some agency in that matter. I got
onto the product because I trusted the representation of what
that product was, and I have personal curiosity. My older
sister had recently done a twenty three and Me, and
she was curious about my results because we don't share
a father. So there's some mystery in there that science

(02:33):
might help resolve. But this episode returns to that original
hesitation I had about doing this show, that question what
happens with our information? And this show is not just
about twenty three and me and my information in those
terms of service. It's actually a bigger view. We're gonna
take on what it means to be connected in this
data driven world and what some of the costs are

(02:56):
on a security level, what does it mean to try
to have people responsibly hold our data, and on a
larger system level, with artificial intelligence and algorithmic recommendations on
everything from what we buy to how many years we
might face in the criminal justice system. This stuff really matters.
So I'm happy you're joining me. I have two other

(03:17):
people to join me. Gina Matthews is a computer science
professor whose research focuses on algorithmic decision making, on bias,
on accountability, on essentially what we do with all the
data that we've collected. She was a fellow at the
Data & Society Research Institute. She received a Magic Grant, which
is the best type of grant to receive, the Magic

(03:39):
Grant from the Brown Institute for Media Innovation. Gina, Welcome
to the show. Thanks so much for having me. Thank you.
And we also have Ariel Silverstone. He is
a data privacy and security expert. He spent the past
twenty five years focusing on really complex security processes and
policies for all kinds of organizations, including companies that work

(04:01):
with genetic information. His title is incredible: External Data Protection
Officer and Managing Director at Data Protectors. Thank you so
much for being here. Thank you. I'd like to
start with you, Ariel, because I think there's information
that's about us, that's in systems and I want confidence

(04:22):
to know that it is being managed responsibly and securely.
And then further downstream, there's what we do with that information.
On this first point on securing the data, what's an
external data protection officer? So we bring the people together
that can protect data best. There is such a lack
of professionals with experience in the world, so we teach

(04:45):
companies how to protect their data. We also interact
between the company and data protection authorities, especially in Europe,
and advise them on how to solve for demanding situations. Okay,
so you're really deep in there. How did you come
to this field? A few years ago, I was in

(05:06):
the middle of the computer revolution in Silicon Valley. What
do you mean you were in the middle of it?
So I went to high school in California. So you
were literally in the middle of it. In the middle, yeah.
So I got very interested in computer science and security
and privacy back when I was fourteen. Did something happen
when you were fourteen to get you interested in this?
At my high school, some students were contacted by the FBI, okay,

(05:31):
because they were doing things they weren't supposed to do
and phreaking and hacking, like early hacking days. So the
word hacking is actually good. It's the phreaking that's not good. So,
and just to clarify, phreaking is when people are getting
unauthorized access to the phone network. Yes. You were saying
the FBI traced some calls? The

(05:53):
FBI showed up in front of some of my schoolmates
and they had some explaining to do, because they were
using computers to do things that people didn't think they
could or would do. And that's all we do in security.
Your job is to decide what you could do with
the computer, or what you should do with the computer.
My job is what you shouldn't do. So since then

(06:15):
you've been on the journey that's got you now at
Data Protectors. What are some of the other types of
risks you've been exposed to, or problems you've seen, in
terms of trying to secure computer systems? Sure, so many
people don't know how to secure computer systems, not because
of evil, but because they just don't know. So they're
looking for people to teach them. And opinions vary. But
I've seen where data was used for bad for example,

(06:39):
for human trafficking, for example, to affect a balance transfer
within a bank. But I've seen more: computers and computer
security really affect people. For example, one of the systems
I was responsible for controlled the temperature of heart-lung
machines at hospitals. Imagine if that computer just shut down
for no reason, people could die. One other question that's

(07:01):
been on my mind every day, because I open up
my news app and I see the same story. I
see organization, company, or government entity X has lost millions of
personal records of category Y: social security numbers, phone numbers,
bank records, entire social media profiles. Why does this continue

(07:26):
to happen? Because companies generally see data protection as a tax,
and they like paying taxes as much as you and I.
So they do either the bare minimum needed, or they
are very sensitive to security, like twenty three and Me,
because they know they wouldn't have a business if there
was a breach. Then they invest more resources and make
it a higher priority. That's what it is. It costs them money,

(07:49):
and then many of them don't want to spend it.
So that's the primary one. The secondary one is, look,
as security people, we've got to be right a hundred
percent of the time. A hacker only has to be
lucky once, so processes have to be fixed. Well, I'm
glad someone like you is helpfully on the job. I
want to bring you in Gina. I'd love for you

(08:10):
to explain. We know a bit from Ariel about how
he got started. What was your introduction to computers and computing?
How did you end up in this field? I added
computers and computing as a second major when I was
an undergrad. Once I started with programming, I just really
liked the instant gratification nature of it, the ability to

(08:31):
change something, and also just build a whole world
that you wanted to build. I went to graduate school
at UC Berkeley, which also was kind of the heart
of the dot com boom there in Silicon Valley, and
was super excited about the potential of computing. I felt
like it was kind of a fundamental human good. It
was automating tasks that were drudgery, it was connecting people,

(08:52):
was giving more people voice. I just felt super good
about it. And then over time I just saw more
and more uses. Not that those other uses ceased to exist,
but I saw the funded uses, the pushed uses, drifting
more and more to surveillance and manipulation. Over time, I

(09:13):
just felt more and more compelled to try to do
something to change that direction. On the front end of
your career, you're hands-on with computing, and now you
are hands on with trying to shift the direction of computing.
But how does that early access and that early atmosphere
shape your perspective now? It's interesting. I went to a

(09:33):
twenty year reunion of my grad research group. I have
a lot of academic siblings who are very highly placed
at big companies, and some of them will even say
I'm basically just a network plumber. I mean, they're, you know,
incredibly talented systems building folks, but they feel very distant
from the impact of those platforms and systems on the

(09:55):
lives of individuals. I just can't do that. I've been
a system builder and I enjoyed that very much, and
I've been a security expert type person. But making the
car go faster and locking the car down into its current
course doesn't seem to make a lot of sense when
you don't agree with the way it's being driven. Can

(10:16):
you explain kind of your approach on this topic, and
especially around big data systems or algorithms. What is your
role in this? I think we're right to worry about
accidental breaches or hackers getting in and taking data, but
I'm honestly more concerned about things that I consider to

(10:36):
be unacceptable that are perfectly legal. Just today, it's
interesting, I read that France is banning analytics on
judges' decisions. It's interesting, right? That's public information. So the
pattern of decisions that judges make, they don't want people
doing analytics on that. It's the same kind of thing

(10:57):
when that happens to you and you realize it's not
just little pieces of data, but people are stringing it
together into big conclusions about your life. You feel violated,
and this kind of let us surveil you, let us
analyze you, but don't you dare do it to us
is a pattern I'm not at all happy with. So
I worry about the wild west of surveillance and manipulation

(11:22):
of individuals, and especially about big decisions that are made
about people's lives based on the little pieces of information
about them that make up big data. Do you think
that enough people understand the balance and the tradeoff that
we're making. I think there are decisions I've made historically

(11:43):
where I just didn't think about it. I was happy
to get the free storage from the service. I was
happy to get a discount or coupons from the grocer.
How do you see awareness of that balance or the
ability to challenge it developing over time. I think a
lot of people are uneasy with it, but don't see
an effective way to be a citizen of the modern

(12:03):
world without it, because we haven't demanded a different playing field.
The boundary for invasion of privacy really shifts over time.
A photo of me posted, is that invading my privacy,
you think? With facial recognition, you can find anybody. And
what if I'm just in a picture someone took someplace?

(12:25):
Do you think just your location information is personal data?
It's just longitude and latitude. But if you assemble that over time,
you see where you work, where you live. Do you
go to a religious institution, do you go to a psychiatrist?
So many things that you might think are not private
become private with more ability to analyze and link it with other types of information.
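
To make that point concrete, here is a minimal, hypothetical Python sketch, not anything described on the show: a few invented timestamped coordinates plus some crude time-of-day rules are enough to start labeling someone's home, workplace, and recurring visits.

```python
# Hypothetical sketch: how "just longitude and latitude," once timestamped,
# can be linked into a profile. The data and thresholds are invented.
from collections import defaultdict
from datetime import datetime

# Assumed input: (ISO timestamp, latitude, longitude) records from one phone.
pings = [
    ("2019-06-03T02:10:00", 40.7128, -74.0060),  # overnight -> likely home
    ("2019-06-03T10:30:00", 40.7484, -73.9857),  # weekday daytime -> likely work
    ("2019-06-09T09:05:00", 40.7309, -73.9973),  # Sunday morning -> recurring visit
]

def label(ts: datetime) -> str:
    """Crude time-of-day/weekday rule for what a location probably is."""
    if ts.hour < 6 or ts.hour >= 22:
        return "home"
    if ts.weekday() < 5 and 9 <= ts.hour < 18:
        return "work"
    return "other recurring place (gym, clinic, place of worship...)"

guesses = defaultdict(list)
for raw_ts, lat, lon in pings:
    ts = datetime.fromisoformat(raw_ts)
    # Round coordinates so repeated visits to the same block group together.
    guesses[label(ts)].append((round(lat, 3), round(lon, 3)))

for role, coords in guesses.items():
    print(f"{role}: {coords}")
```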

(12:49):
So I want to play
a game with both of you. I call it what's
the worst that can happen? And so I'd like your
imaginations and your experience to be brought to bear on a question.
I'll start with you, Ariel. There are theoretical
risks associated with data breaches and hacking, and then there's
things that actually have happened. What are the risks that

(13:11):
most concern you? Where are the multiple alarms ringing in
your business and in your life? That's a great question.
So, number one, most of our data, our name, in
the US even our picture, if it gets hacked, so
what? I mean, we used to have that in the
phone books. We used to have it in our high

(13:31):
school yearbooks. We printed it in a physical thing called
the Facebook. So we have points of data that are
out there. The points of data by themselves do absolutely nothing.
I can even give you a copy of your DNA,
but it doesn't mean anything to anyone because
there's no context. When there is context, and when the

(13:52):
data can be made to do things it wasn't meant
to do, for example, not a GPS location but GPS
locations over time, which show you how you're going
around the city of London, for example. So when that
is out there, not only can it be marketed to
provide you things air quotes, but it is also dangerous

(14:13):
because of two things that people don't normally think about.
The first one is anticipating what you're going to do.
So if a big company knows that there are
thirty-year-old single people in this building, they're more likely
to offer lattes there. But the bigger threat is there is
no technology today that would act to limit them trying

(14:37):
to change you or your preferences. So not just anticipating
what I want, but creating what I want, manipulating, right?
And it actually exists, right? We used to call it marketing,
but in the future it can actually go into the
psychological realms. You have more and more data. You can
literally get into a situation that you're being manipulated to buy,

(14:58):
to sell, or to do any other action, good or bad. Right,
that's pretty scary. You're playing a game very well, thank you.
It's the terrifying future. Gina, do you want to jump
in on what's the worst that can happen? What are
the grounded fears and concerns that you have based on
your experience in this area? I love this game, and
in fact I play this game exactly when I give
talks about this. The thing that I think in fact

(15:19):
is happening to you and is super important is big
decisions made about your life based on that big data.
Will you have access to credit? Will there be police
in your neighborhood? Will you even hear about a job
that's available. How will your performance on a job be rated?
Will your insurance rates go up? Will you have access

(15:42):
to a university? Big, big decisions about our lives are
being made with little pieces of information that we cannot
anticipate the risk of. Do your insurance rates go up if
you purchase plus-size clothing? What if you were purchasing it for
somebody else? Do your interest rates go up if you charge
marriage counseling, because you know that could have a big
financial impact on your life. There's lots of little things

(16:04):
that you don't have any way of anticipating the outcome
it could have. Well, follow up question on the insurance
rates and plus size clothing, can you explain that a
little further because I did not know that. Yeah, so
information about your purchasing habits can be shared with insurance companies.
Just like when you go to get a job, you

(16:26):
can voluntarily sign away rights for them to look at
your credit history, your social media history. When you're trying
to get into the country and get a visa, there's
many times you can say no, but if you want
that job or you want that visa, you might have
to say yes. And then you have voluntarily given people
access to all sorts of deep wells of information about you.

(16:50):
That's a perfect segue to my personal experience with this.
I went on this data detox journey where I basically
collected all the information that had been collected on me.
The big social platforms, search engines, credit agencies, data brokers.
I found people with profiles of me that I've never
heard of, and a lot of the information was inaccurate.

(17:11):
I was frustrated by that because I thought, well, you're
supposed to be big tech, you're supposed to know all
these things. But I was also relieved because I'm like,
thank goodness, they got that wrong. But then I'm like,
but they're going to make these big life decisions, Gina
that you're talking about. Do I want them making those
decisions based on accurate information about me or inaccurate information
like what is best for me? You know, Yeah, you

(17:33):
tell me, you guess correctly and it's private information, or
you guess incorrectly and you make an illegitimate decision about me.
And this is another fact. If you don't participate in
data collection, then people will make decisions about you based
on people that they determined to be like you for
whatever definition that is. Yeah, these are the shadow profiles

(17:54):
that some companies create to kind of mimic the population
that doesn't necessarily have an account with them, but they
can still be represented in a database. Sometimes we
call them avatars, even though they don't have a picture.
But there are some companies out there that create an
avatar that knows to some degree what you're going to buy.
Next week, next year, which is why the intent behind

(18:14):
the GDPR in Europe was to give people
the right to rectify their data, and also a declaration of where
companies got that data. You brought up GDPR,
Ariel, and this is the General Data
Protection Regulation, rules set across the EU that a lot
of us experienced through new pop ups on websites we

(18:37):
visited, forcing us to acknowledge a very legal-sounding
thing. What have been the practical results of
GDPR, you know, in terms of what you just described?
So here's the situation out there. GDPR has been
in effect for a year. There's over three hundred thousand
cases in Europe involving either complaints from individuals or data

(19:01):
breaches. There's more than fifty-six
million euros in penalties already levied in the first year.
Presumably it's going to be much, much higher. The law
has been created to some degree because they wanted to
get to a point where a potential penalty will be
painful for those organizations. All right, So we've got a

(19:23):
lot more complaint activity, We've got some fine levying activity.
Do you see a shift in actual behavior by companies? Yes, absolutely.
So I see behavior shift in two different areas. First,
in Europe, people are asking a lot more questions. The
individual citizenry is more aware and they're asking more questions.

(19:46):
And companies that before didn't have a data protection officer
now do. So there's over half a million data protection
officers in Europe right now. Full employment for data protectors,
well, there are only seventeen licensed professionals. So I'm hearing
a great job opportunity here. For those of you
who are looking for a new field to get into: data
protection, a growing market. Go ahead. Absolutely. But the exciting changes are

(20:09):
in the US. There are companies in the US that
have never worried about data protection to that level, but
now if they play in Europe, they have to play
by European rules. In the US, many companies have started
asking questions they never asked before. How do I protect
the data so I won't get fined? How do I
protect the data so I won't be in the newspapers?

(20:31):
And how do I protect the data so customers will
not stop buying from me? Well, that sounds like a
good story, and I'd love to hear from you, Gina.
We've gotten shaken a bit by the negative possibilities. But
what are you seeing in terms of a shift in
a better direction, Whether it's you know, how we put
some limits around the recommendations and algorithms and these big

(20:54):
life decisions that are going to affect so many of us.
Where is the oversight emerging from? I think the
GDPR is a good first experiment in this space. I'm
happy to see it. I think in the academic community,
in the practitioner community, there's a lot of discussion about
things we can do and that's a step in the

(21:18):
right direction. So things like explainable AI algorithms so that
it's not just a recommendation based on machine learning with
big data that can't be explained. You can link it
back to a particular data point and its weight in the decision.
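
As a rough illustration of what "link it back to a particular data point and its weight" can mean, here is an invented toy example in Python: a simple linear score whose per-feature contributions can be printed. The feature names and weights are made up; complex machine-learned models are exactly the ones where this tracing gets hard, which is the point being made.

```python
# Hypothetical explainability sketch: a decision you can trace back to each
# data point and the weight it carried. Weights and features are invented.
import math

WEIGHTS = {"income_thousands": 0.04, "late_payments": -0.9, "years_at_job": 0.2}
BIAS = -1.0

def explain_decision(applicant: dict) -> None:
    """Score one applicant and print each feature's contribution to the score."""
    contributions = {name: WEIGHTS[name] * applicant[name] for name in WEIGHTS}
    score = BIAS + sum(contributions.values())
    probability = 1 / (1 + math.exp(-score))  # logistic squashing
    print(f"approval probability: {probability:.2f}")
    for name, value in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
        print(f"  {name:>16}: weight {WEIGHTS[name]:+.2f} -> contribution {value:+.2f}")

explain_decision({"income_thousands": 55, "late_payments": 2, "years_at_job": 3})
```
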
Doing things like looking at the demographics of

(21:41):
the training data, there's ways to look for patterns of
bias in decisions. Algorithms and systems developed and tested on
a certain demographic are not going to automatically perform well
on another. One of the things that emerges is how
old this challenge is. I remember learning that the medical field,

(22:06):
in terms of both treatments and drug development, has a
very long history of using the male body as the
default human. Every recommendation, dosage, and treatment were all designed
for this kind of average male body, and it turns
out that's not the average body in the world. There
are many different types of bodies, and we're going back

(22:28):
and people are kind of filling in the blanks. When
I'm hearing you talk about opting in and having more
accurate systems, having blanks in the data set, it seems
to me that, A, we want a representation of as
many people as possible in this tool that's defining our whole future,
so long as, B, it's paired with a level of accountability, oversight,

(22:50):
correctability, you know, in terms of we're going to
find mistakes. No system's ever perfect, but do we have
a mechanism to correct those mistakes and serve people in
the process? But to count on the will of a company?
It doesn't feel like a good long-term plan for humanity.
That's exactly right. As human beings, we stink at
self-regulation. It's just not gonna happen. Plus, look, the principles

(23:11):
are simple. First principle is transparency. Here's what I'm going
to collect from you, and here's why. Then there's
accountability: actually do only the things I said I'm going
to do, and then rectification. You've got to know about
something that is wrong about your data. If you don't know,
you can't fix it, and then your ability to both

(23:32):
I think surveillance is
sold to people based on the ability to benefit their lives,
but then they don't necessarily get that in return. Like
for example, I was just at a conference AI for
Good in Geneva. We're talking about AI for all sorts of
potentially great things, like reducing waste in our food distribution systems,

(23:54):
or the ability to get better medical care, to reduce
inefficiencies in our transportation system. That still doesn't mean that it's
going to benefit the people that you surveilled and manipulated
to get it. I think we need better answers than
companies just out of the goodness of their heart, returning

(24:15):
to people the efficiencies that come from artificial intelligence and
machine learning and big data and algorithms. People think that
we're going to share the benefits, but I don't think
that we have a structure to really enable the sharing
of those benefits right now. I agree. I would like
to give us an example, a real example of something
that takes place today, and then I would like to

(24:37):
suggest a solution. Yes, this is perfect. My co-host
Ariel here, taking us to the next level. I
think the solution will actually come with the help of
big tech companies, like twenty three and Me, which now have
an incentive to request an overarching privacy framework in the
United States. Why? Well, the GDPR is only for Europe.

(24:57):
But in the US, the privacy laws that exist, and
there are many of them, more than a hundred and
twenty, exist at the level of a city, a county, or a state.
But companies like the ones I mentioned, we cannot
expect them to comply with something like seventeen thousand different
jurisdictional privacy laws. If every one of them is going

(25:18):
to be different from every other, there's no way to
comply with that. So they should champion, using their big budgets,
a situation where the US Senate and the US Congress come
together and makes a countrywide national privacy framework with a
national agency that regulates privacy because right now in the

(25:38):
US that's just the FTC and they have a day job.
I wanna remember some of the good things. I asked
you what the worst-case scenario was. That was
a fun game. I want to ask you, what's the
best-case scenario? How would we operate if you had
your say completely, or at least a bit more? Ariel,
what's the best that could happen? Well, the best solution would

(26:01):
be for everyone to be aware of his or her
value and also have their profile with certain bits, perhaps censored,
stored in a digital way for which they can give
access when they want the access to be given, and about
what they want the access to be given to. Right. Again,
not to say that everything about Europe is great, because they

(26:23):
certainly have their own challenges. In Europe, there's a new
law called PSD2. PSD2 allows people to say, okay,
my bank has such and such information about me. If an
insurance company wants to see my data from my bank,
they've got to ask me to approve or disapprove, or
automatically disapprove, the sharing of the data. To make it easy,

(26:48):
it's tied to people's cell phones, for example. So you
get a text that the company wants to get your data,
and then you approve. Then they can make something ideally
better tailored to you. Now, the distance to get there
in the US is a bit long. But that's one example.
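
Here is a small, invented sketch of that consent flow: a third party asks for a slice of bank data, the customer is prompted, and nothing is shared without an explicit approval. It illustrates the default-deny idea only; it is not the PSD2 specification or any bank's actual API.

```python
# Hypothetical consent gate: only release the requested slice of data, and
# only after the customer explicitly approves. Default is deny.
from typing import Optional

def ask_customer(requester: str, data_scope: str) -> bool:
    """Stand-in for the text message / app push the customer would receive.
    Here we just simulate a reply; in real life this is the human's decision."""
    simulated_replies = {("Acme Insurance", "transactions"): False}
    return simulated_replies.get((requester, data_scope), False)  # default deny

def share_data(requester: str, data_scope: str, records: dict) -> Optional[dict]:
    # Nothing moves without an explicit approval from the data subject.
    if ask_customer(requester, data_scope):
        return {data_scope: records.get(data_scope)}
    return None

bank_records = {"account_balance": 1234.56, "transactions": ["2019-06-01 coffee"]}
print(share_data("Acme Insurance", "transactions", bank_records))  # None (denied)
```
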
Gina, what's the best that can happen? Well, I think that
there's a lot of potential for optimizing our world. I mean,

(27:11):
our transportation system could use a lot of efficiency gains,
our food distribution system could use a lot of efficiency gains,
connecting people, getting people the information they need just in time,
helping us navigate our world find things. There's a lot
of power there. It's often powerful for both good and ill.

(27:33):
We don't look at both sides. We need a collective
will to turn technology to the uses we
want and prevent it in cases we don't. And really
that has to be done with legislation or some kind
of collective control. I'm worried about big decisions made algorithmically
about people's lives, but human decision making is no picnic either.

(27:54):
People make a different decision before lunch and after lunch
they make snap decisions based on crazy things. We haven't
exactly been following hiring law anyway. If you see a candidate,
you can make a biased decision based on gender or
race or disability status and not admit you're doing it.
But people at least do grow; they consider other information.

(28:17):
You can appeal to them. You can convince them to
move forward and progress. Artificial intelligence systems, machine learning systems
put inside a black box are institutionalizing the past. People
think they're really progressive. I see a great potential for
a mix of human decision making and automated decision making

(28:39):
where people are deployed not to rubber stamp decisions, because
that's a bad use of people. They just blank out
and don't really engage on every decision, but to debug decisions.
But that is going to mean incentivizing teams of people
to actually find problems, and no company is going to
be truly incentivized to find problems in their system. So

(29:00):
we need adversarial testing, We need access to data. We
need to ask the question, what is going to incentivize
this to get better? Because there's so many systems. I've
been doing a bunch of work in criminal justice systems,
and in criminal justice systems, if they're making the wrong decision,
you know what, they say, you're just complaining because you're guilty.
It's not like a bug in your iPhone that a
bunch of people report and eventually it goes away, even

(29:23):
though in your world it feels like you're the only
one that sees this bug and no one else does.
In things like criminal justice, there may be no incentive
for it to get better unless we force transparency and
accountability and adversarial testing and things like that, which we're
not doing. Well, I feel like we're making some progress.

(29:44):
I think that I definitely was frightened a bit by
some of the visions you've painted and by some of my
own experience and research in this area. I even think
privacy is often not the best word to describe what
we're really getting into. For me, it's words like permission, consent, sovereignty, agency,
self determination. Building those into the system as well is

(30:06):
a necessary evolution of this whole platform. Can we layer
in that level of accountability, transparency, adversarial testing so that
the benefits of these systems accrue to more people, and that
we're highlighting and reducing the downside, so we can maximize
all the good stuff that we know is possible? We
have a long long way to go and the world's

(30:27):
moving fast. This is not going to happen automatically. Listeners
should not be like, oh, there's a lot of ways
we know how to do this. It'll just happen if
we're patient. That's my point: where we are
used to computer systems improving in this way, they will
not improve unless we fight for it. I think there's
this famous quote like the arc of history is long,

(30:49):
but it bends towards justice. Big statement. I don't know
where it was originally stated, but I also don't think
it's quite true. I think people bend it towards justice,
And constantly, in every era of technology, in every
age of any society, you gotta fight for your freedom.
You've got to put in the work and not assume
that a firmware update is just gonna come along with

(31:10):
rights installed, with dignity installed, with respect installed. No,
we're gonna have to argue for that, and constantly agitate
for it and fight for it and ask better questions
even if the answers aren't already right there. Doing this
kind of conversation is definitely helping me see that picture
even more clearly. Ariel? Yeah, but there's one thing that

(31:31):
every company can do, and we should demand it of the company. Okay, yes. So,
first, life cycle. Everybody knows the term life cycle. How
does life end? With death. Companies that collect data frequently
forget the last step in the life cycle. So I've
seen companies that keep data, sometimes even credit card data.

(31:51):
that's twenty years old. What are you gonna do with
the twenty year old credit card data? You take that fact,
and you take the fact that, again, I hate to
be a cheerleader for the GDPR alone, but I see
your side job here. It's literally cheerleader for the GDPR.
But the thing is, the GDPR also requires companies to
make sure the data is always up to date. So

(32:13):
since data, once it's put on the internet, rarely stays up
to date, it's up to organizations to automatically, ideally, delete data.
When you collect the data, decide how long it's going to live.
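
A minimal sketch of that last lifecycle step, with invented data: each record carries a retention period chosen at collection time, and a purge routine deletes whatever has outlived it. This is illustrative only, not any company's real retention system.

```python
# Hypothetical retention sketch: decide at collection time how long data lives,
# then actually delete what has expired.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import List

@dataclass
class StoredRecord:
    payload: dict
    collected_at: datetime
    retention: timedelta  # chosen when the data is collected, not years later

def purge_expired(records: List[StoredRecord], now: datetime) -> List[StoredRecord]:
    """Keep only records still inside their retention window."""
    return [r for r in records if now - r.collected_at < r.retention]

records = [
    StoredRecord({"card": "****1111"}, datetime(2000, 1, 1), timedelta(days=365)),
    StoredRecord({"order": 42}, datetime(2019, 6, 1), timedelta(days=365)),
]
records = purge_expired(records, now=datetime(2019, 7, 10))
print(len(records), "record(s) remain")  # the decades-old card data is gone
```
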
The companies also get a lesser requirement to
spend money. I'm aware, for example, of one company in the
US that deleted some five yottabytes of data because they

(32:36):
had that much data historically. I know a lot about computers,
but yottabytes, I think I've heard that once in
my life. That's just a lot, right? It's just a
technical term for, like, a lot. Okay, well, I mean, look,
you think about it, there is a natural incentive, I think,
for organizations to just minimize liability. Data is seen as

(33:00):
the new oil in some circles, as the currency, as
this thing of value, but it's also a liability. Now,
this is a personal question for you, Ariel, because we
spoke briefly about this. I want to bring it back
down to the individual level and data. Because you, Mr.
External Data Protection Officer, Mr. I've Been in Computer Security Since
I Was Fourteen Years Old, you've used twenty three and Me.

(33:22):
I do, and we rarely actually have people on the
show talking directly about this service in particular. But I'm
just curious why you chose to use the service and
what you got out of it. When I was growing
up in Israel, I only had some twelve relatives
in my family tree. My grandparents never talked about Europe
before the Holocaust. They didn't want to remember. At a

(33:43):
certain point, I felt a little bit ungrounded, so I
used twenty three and Me and I found some hints
for the geographical distribution of my family. A year later, I
was in Poland and decided to go with investigators from
archives and to look up things, and, careful what you
wish for, today I have about thirteen thousand blood relatives

(34:06):
I was not aware of. There were second cousins living next
door or one street over that I didn't know existed,
and now they do and we're planning a reunion. But
it's hard to plan a reunion for that many people. Yes, that's
called a festival. That's right. Then we're gonna give shirts out.
So I have family in Australia, in the US, of course, Canada, Israel,

(34:28):
but I also have relatives in Uruguay, in Switzerland and
in many other countries. And now I have a family.
That's beautiful, thank you. But before I used them, I
did my own assessment. I won't disclose exactly everything
I did, but I certainly got to a comfort level.
And one of those things is the simple fact that

(34:49):
before they share your data with anyone, they ask your permission.
If you agree, then they share the data points that
you agree to share, health or otherwise, and just
with the people that you want them to share it with.
You have the control. Now, in the world of de-
identified data, in other words, when they collect two million

(35:11):
genomes and they find certain probabilities or discoveries or what
have you, I'm happy for my DNA to be used
for that. Maybe they'll find a better pill for me
and I can grow hair again. But before I used them,
I did my own due diligence. That's similar to my story.
I looked into it and I was like, Okay, I'm
good with this. There are parts of it that I'm

(35:32):
sure I maybe won't be so great with down the line.
I'll be vigilant and try to pay attention and look
out for that possibility. But for what I've gotten out
of it, it felt like a choice that I was
willing to make. And I think as we start to
wrap this up and think about what needs to happen,
it sounds like there's three levels of activism and engagement.

(35:53):
There's the individual. We should all be better about knowing
what we're signing up for, and to the extent that
we have a choice to not sign up, exercise that choice.
I know in some cases it's really more compulsory, but
where you can, exercise that. Then there's the companies, the
people who are actually holding all this information making their
decisions and who won't always make good decisions out of

(36:13):
the goodness of their heart, and so making sure that
they are more accountable, incentivized in a structured way. And
then there's our GDPR cheerleader over here, who will
never let me forget that. Government, at a
national level and potentially even international, as the Internet knows
no bounds, but certainly at national levels, needs to act

(36:34):
and step in because right now, without an enforcement mechanism,
it's hard for us to feel protected. It's actually hard
for companies, too, maybe, who want to do the right
thing, to have a simple way to try to execute that.
Am I missing something in my attempt to kind of
synthesize everything in a paragraph, as far as what else
our listener needs to know about how to get to

(36:56):
the next step? First of all, it's yours. If you
don't care about it and give it away freely, don't expect others to
protect it for you. The second bit is this: every
time you get a frequent flyer card, any kind of
loyalty card, yeah, think about what it is you're giving up.
There are some things for which sharing data is perfect,
but the reality of what you have is a
situation where your data is offered to roughly two million

(37:20):
websites every time you open a search. Decide what it
is you want, and go find out who has it.
It's not as hard as it was a year ago,
but you have to take the first step. Gina, is
there any extra advice you want to offer to our listeners?
Don't underestimate the power of showing up and asking questions

(37:40):
and writing and expressing your opinion and caring. There's not
nearly enough of that, and there's a lot of stuff
happening that really could use some eyeballs and some comments
and some outrage. Hear, hear. I would second that, and
I want to thank both of you for showing up,
for asking questions, for sharing your opinions, and for so

(38:03):
clearly caring. Thank you for doing this special podcast, and
thank you for giving these issues this platform and voice.
If you want to dig in more on today's topics and guests,
check our show notes. And if you enjoyed the episode,
share it with a friend, all your friends, and be
sure to leave a review. If you want more surprising

(38:26):
stories about how we're all related, search and follow Spit
on iHeartRadio or subscribe wherever you listen to podcasts.
Spit is an iHeartRadio podcast with twenty three
and Me. I'm Baratunde Thurston. You can find out
more about me at baratunde dot com or on
social media wherever Baratundes are found.