
June 6, 2019 28 mins

By now most of us understand the privacy consequences of all the data we handed over to social media and Internet companies. But what happens to the huge amount of health information we generate from health apps, DNA kits, doctors' visits, blood tests and fitness trackers? Some of it's carefully protected by law. Other data -- including intimate details about our lives -- can be sold to brokers who trade it like a commodity. How worried should we be?

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Every day you're generating data about your health. You might
not even be aware of it. Maybe your phone counts
how many steps you take. Maybe your watch measures your
pulse or your heart rhythm, or you use an app
to track your exercise or diet, and that doesn't even
count your medical data, the records that doctors and insurance
companies and pharmacies keep about all of us. All that

(00:25):
data goes somewhere, and it's valuable to someone. Welcome to Prognosis,
Bloomberg's podcast about the intersection of health and technology and
the unexpected places it's taking us. I'm your host Michelle
Fay Cortez. The amount of health data is increasing fast,

(00:47):
from medical records, to health apps and devices, to our
shopping habits and online browsing. Every day we leave digital
footprints revealing intimate aspects of our lives. That comes with
benefits and risks. But no one has sorted it all
out yet, and laws to protect people haven't caught up with
the advances in technology. Having all that data promises to

(01:10):
help researchers come up with new treatments, and it can
improve doctors' care. But the risk is that personal information
you'd rather keep to yourself could be exposed. Here's Bloomberg's
health reporter John Tozzi. Good afternoon, thank you for calling
Anthem Member services. My name is Kathy. How can I help? Hi, Kathy,

(01:32):
my name is John Tozzi. I'm a reporter with Bloomberg
News and I'm recording this for a story about medical
data and privacy. Um. I'm an Anthem member and I'd
like to request a list of who Anthem has shared
my personal information with. So you're a reporter with Bloomberg, correct? Okay,
and you are inquiring... I recently learned that I have

(01:52):
the right to ask my health insurance company what they're
doing with my data. It's one of the rights given
to me under a law called HIPAA. HIPAA stands for
the Health Insurance Portability and Accountability Act. It was passed
in nineteen ninety-six, and it's the main law in the United States that
governs what medical providers and insurance companies can do with

(02:14):
our healthcare data. HIPAA determines how medical data can be
shared and what happens if it's shared improperly. It also
gives people rights over their data, like the right to
get a copy of your medical record or to find
out how your data has been shared with other parties,
but HIPAA doesn't cover everything. There is the idea, and
it is extremely widespread that health information is inherently going

(02:39):
to be protected by some law somewhere, but it's not true,
not at all. This is Pam Dixon. I am the
executive director of the World Privacy Forum. We're a public
interest research group. She has been a privacy advocate for
twenty years. She told me that people often assume there's
some kind of automatic protection for health data. That's not

(03:00):
the case. People universally believe that their health data, no
matter where it is, has some form of legal protection
and is somehow magically confidential. HIPAA applies to the records
that your doctor, other medical providers, and your insurance plan hold,
But more and more data about our health isn't just

(03:21):
in medical records. HIPAA-covered data is a smaller and
smaller percentage of all of the health data that's out
there now, and it is so, so important for folks
to understand this, because much of the health data
that we're working with today is not covered under HIPAA protections.

(03:42):
Here's one famous example. Journalist Charles Duhigg reported that
Target used detailed profiles of customers to predict when women
became pregnant, and then the company sent them promotions for
baby clothes or diapers. The result was creepy in the
best case, and in the worst case, could have revealed
information they may not have wanted public. Increasingly, health data

(04:05):
is being collected by technology companies, data brokers, advertisers, and
other entities that are not subject to HIPAA, and
it's being used and may be misused in ways that
a lot of people don't understand. Think about the apps
on your phone. Maybe you have something to track your
steps or to log what foods you eat or when

(04:26):
you exercise. Unless those apps come from your medical provider
or health plan, they're not covered by HIPAA, and that
means that the companies collecting your data are far
less restricted in how they use it, and how they
use it may not always be transparent. A study published
in the journal JAMA Network Open in April looked at

(04:46):
thirty six top apps to help people with depression and
quitting smoking. Most of them were sending data to Google or
Facebook for marketing, advertising, or analytics, but less than half
of those apps disclosed that. The authors wrote most apps
offered users no way to anticipate that data will be

(05:06):
shared in this way. As a result, users are denied
an informed choice about whether such sharing is acceptable to them.
This is the kind of risk that has some people
really worried. Even though some privacy advocates think HIPAA's
protections should be stronger, they're a good start. It's the
world of data beyond HIPAA's reach that we need

(05:28):
to pay a lot more attention to. Because of the
lack of, uh, sort of a uniform standard across the
country with regard to data that isn't protected by
HIPAA, um, there are concerns about the privacy, particularly of
health data. This is Iliana Peters. I'm currently a shareholder

(05:48):
at Polsinelli, which is a national law firm. Iliana worked
for the federal government for about twelve years. She wrote
and enforced HIPAA regulations before she went to work for
a private law firm. Like Pam, she's concerned
about the growing volume of health data that HIPAA doesn't cover.
The information that your employer holds about you related to

(06:08):
your health would not be protected by HIPAA, um. The
information that you share with social media about your health
or the groups that you participate in on social media
about health issues is not protected. There are applications that
are direct to consumer. That means they are marketed directly
to consumers and have everything to do with you know,

(06:30):
weight loss, to disease management, UM to disease prevention, because
they're marketed directly to a consumer and don't ever interact
with a healthcare provider on their behalf or with a
health plan, that would not be covered by HIPAA, um.
So there's a, there's a huge amount of healthcare data

(06:51):
um, out there that isn't actually covered by a standard
set of legal requirements. Here are some of the ways
you might be revealing data without knowing it. You use
a credit card to buy a pregnancy test at a
retail drug store. You order new pants online, revealing your
waist size. You search Google for symptoms of anxiety. You

(07:12):
subscribe to a magazine about diabetes. You use an app
to track your morning runs. You take a direct to
consumer DNA test. You take an Uber to your therapist's
office at the same time each week. Just because information
about your health could be gleaned from these activities doesn't
mean it will be. The problem is, we often don't

(07:33):
have a very good idea of where this data ends
up after it's collected. Some of it could end up
in the hands of data brokers. Data brokers are a
multibillion dollar industry made up of thousands of companies that
you've probably never heard of. They compile information about people
and sell it to marketers. They collect information from public
records and even data that you might not realize

(07:55):
you're making, like your retail purchases, what groups you belong to,
online magazines and services you subscribe to, and information you
fill out in surveys or online registrations. They take all
of this information and make lists of people for marketers
to target. In testimony before the Senate Commerce Committee,
Pam, the privacy advocate, described how the data broker industry

(08:19):
tracks people by the diseases they have and the medicines
they take. There are lists of millions of people that
are categorized by the diseases that they have, ranging from
cancer to bed wetting to Alzheimer's. Terrible diseases, some of them benign,
some of them relating to mental illness. There are lists

(08:40):
of millions of people and what prescription drugs that they take,
and these lists exist entirely outside of HIPAA, outside
of any kind of federal
health protection. Pam told Congress about some lists that show the
darker sides of this business model. They included lists of

(09:04):
rape victims and people with genetic diseases. She found lists
for sale of people who had HIV and aids, of
people with dementia, and of people with alcohol or drug addiction.
There were lists of domestic violence victims and police officers'
home addresses. The list of rape victims cost less than
eight cents per name. Pam said that some of these

(09:27):
lists were taken down within an hour or two of
her testimony, but most of them have reappeared at some point,
and six years after her testimony, she says not much
has changed. The data broker dossiers are often described as
marketing lists, but Pam said that doesn't necessarily mean the
buyers are marketers, and it also doesn't mean that the

(09:47):
lists are used as they're intended. For example, employers or
insurance companies could also be buying and using this data.
There's no law against this. So all of this points
to a need for more protection. The laws we have
just don't reach far enough. But despite its limits, HIPAA
does provide a good framework for where to start. Here's

(10:14):
the good news. When data is covered by HIPAA, the
law gives people important protections. Healthcare providers and insurance plans
are barred from disclosing individually identifiable data under HIPAA, and
it goes further. As you might remember, the law also
grants people rights over their data. It gives people seven

(10:34):
different rights, and the rights are really important because before
HIPAA there were huge problems. Pam says it was really
difficult to get a copy of your own medical records
before HIPAA. Before HIPAA, good luck getting a consistent copy
of your health file. It wasn't a legal requirement anywhere,
so you, you can predict what was happening prior to HIPAA.

(10:56):
It was a disaster trying to get your health information.
It also gives you the right to know if someone
has subpoenaed your medical records, which might happen in a
nasty divorce case, for example. And it gives you the
right to request an accounting of disclosures that's the list
of who your doctor or health plan has shared your
medical records with. The list that I'm trying to get

(11:16):
from Anthem. HIPAA also sets the rules for what those
entities can do with your data. They can't just make
it public. They can't tell a reporter or your employer
or a family member about your diagnosis, your treatment, or
any other private information without your permission. HIPAA does allow
medical providers and health plans to release data if it's

(11:38):
de identified. That means removing information like your name, address,
precise zip code, and other details. This de-identified data
can be used for research. It can also be sold.
For example, when drug companies want to know which doctors
are writing the most prescriptions for their medications, they pay
data brokers who collect that information. Then pharmaceutical companies can

(12:02):
send their salespeople to doctors who are the highest volume prescribers.
The data they're buying doesn't have your name on it,
but it does represent you aggregated with other people, and
once it's de-identified, it's no longer bound by HIPAA's protections.
Some privacy advocates I talked to described this as a

(12:23):
violation of privacy. The fact that you can't control de-identified
versions of your data is really troubling to some people.
It's especially concerning because of the risk that some de-identified
data could be re-identified, that it could be
matched back to you as an individual. Most experts I
talked to said that this risk is real, but small. Still,

(12:47):
the odds of being re-identified have increased since HIPAA was
first passed in the nineteen nineties. Here's Pam. The world has changed.
So back then, I mean, the statistical chance of re-identifying
records was enormously low. Now re-identifying records
is a little bit easier because computing power has advanced

(13:10):
so much and there's so many more data sets that
allow for more identifiability. But there are also benefits to
making de identified data available. Medical researchers rely on it
to learn about how to improve care, public health officials
use it to track epidemics and trends in population health,

(13:31):
and as a journalist, I often cite research or findings
based on this kind of data, from how common certain
medical procedures are to how often a new drug is prescribed.
I work in privacy and I definitely have an opinion
on privacy. I'm, I'm for privacy. And something that was
very hard for me to learn, and it took years,

(13:53):
um, was the value of releasing data. Pam said she's
come to realize there are trade-offs between keeping data totally
private and using some de-identified pieces of it. If
you want to cure diseases, you're going to have to
study the disease, and you can't do that without information
about the disease. Information about that disease resides in people's

(14:16):
experience with that disease as patients. We might also benefit
directly from having more of our healthcare data digitized. To
learn about these benefits, I paid a visit to the
Commonwealth Fund. Welcome. Thank you. I was there to see
a man named David Blumenthal. I'm president of the Commonwealth Fund,

(14:39):
which is a national health care philanthropy based in New
York City, and our goal is to create a high-performing health care
system in the United States. David's office is in a
landmarked hundred and eleven year old mansion on Manhattan's Upper
East Side, overlooking Central Park. It used to belong to
the Harkness family, which endowed the Commonwealth Fund about a

(15:00):
century ago with money they made as investors in John D.
Rockefeller's Standard Oil Company. David Blumenthal is a big name
in healthcare. He worked as a primary care doctor at
Massachusetts General Hospital. He advised Senator Ted Kennedy on healthcare
and later worked for President Barack Obama as the country's
top health IT official. He helped implement a law

(15:21):
called the HITECH Act, which updated some HIPAA rules.
It also gave medical providers billions of dollars in federal
subsidies to digitize paper records. The HITECH Act was
intended to modernize America's paper based healthcare system. As recently
as ten years ago, a majority of doctor's offices in

(15:41):
the United States still used paper records. David's a big
believer in how the accumulation of digital healthcare data can
help people. As it grows, it begins to represent the
healthcare experience of millions, or even billions, of people, and

(16:01):
that is incredibly valuable, he says. Apps that draw on
patients' data could help them take better care of themselves.
They could prompt people to get flu shots or alert
diabetics when their blood sugar gets out of whack. David
says he sees the benefits of greater access to medical
data as a physician and as a patient. Though he

(16:23):
works in New York, he lives in Boston, and he
still sees doctors at Mass General where he used to work,
and its affiliated hospitals in the Partners HealthCare system.
He finds it comforting that he can walk into any
of the dozens of clinics or hospitals in the system
and they'll still have his records. I have seen and
used that connectedness with my own care, and it's

(16:47):
enormously reassuring, um, that you don't have to, you know,
your medicines will be known, the results of all your
tests will be known, and all that. That kind of sharing could
solve some big problems in the US health care system.
There's a lot of evidence that patients are harmed all
the time because their care is fragmented and not coordinated.

(17:10):
A specialist who doesn't know all the medications you're on
might prescribe a new drug that has a bad interaction
with one you're already taking. One study of more than
half a million patients with chronic illnesses like diabetes or
heart disease found that people who had more fragmented care
had higher costs, lower quality care, and more preventable hospital visits.

(17:31):
This is a real problem that a lot of people
in healthcare would like to solve. Policymakers are trying to
make the whole country's health care system work better together.
They're trying to encourage different electronic medical record systems to
talk to each other. They're also making it easier for
patients on government health insurance like Medicare to get access
to their health data. The goal is a health care

(17:52):
system that seamlessly relays important information that could save your life.
David gave me a classic example. You live in Austin,
but you get into a car accident in Chicago. Once
you get to the emergency room, maybe you're dazed or unconscious,
or you forget to tell the physician about an allergy. But
if digital records were more widely accessible, that might not

(18:14):
be an issue. The emergency room physician finds your Apple
phone and everything is on the Apple phone, or they
can access your record in a cloud because there's an
agreement to share that information, and so that increases the
reliability of your care, reduces the chance of an error,
reduces the chance of a, of a bad outcome. That's

(18:36):
the benefit. The risk is mostly privacy, and the sicker
people are, the less concerned they are about privacy. But
there are also downsides. Just as with app collected data,
more traditional medical data sharing has its drawbacks. The risk
is that nothing is ever truly private. As soon as

(19:00):
your information is available in electronic form, either on a server
or in the cloud, it is potentially vulnerable. David has experienced
this firsthand as a federal employee. His data was breached
in a hack of the government's employee database. I've given

(19:21):
up on the idea of privacy. It's just not feasible anymore.
It hasn't, that I know of, happened to my health data,
but it could, um, and I expect it might. Once data
is digitized and stored, there's a risk it might end
up somewhere you don't want it to. HIPAA requires medical

(19:43):
providers and health plans to tell you when your data
has been breached, and under the HITECH Act, if
a breach affects more than five hundred people, the companies have
to report it to the federal government, which publishes a list.
Since two thousand nine, when the reporting requirement went into effect,
HIPAA-covered entities have reported more than two thousand five

(20:05):
hundred breaches that affected almost two hundred million individuals' health records.
Health data breaches happen so frequently now that they rarely
make the news. They're routine. On average, there's a breach
of HIPAA-protected health data every thirty-one hours, and
that's only the data breaches that companies have detected and

(20:25):
that we know about. We know about them because the
law requires entities covered by HIPAA to tell us, but
under federal law, entities not covered by HIPAA generally don't
have to tell us when a data breach happens, though
state laws may require them to report breaches. They also
aren't bound by any of the other requirements of HIPAA.

(20:45):
They're mostly bound by the promises they make to you
in their terms of service, those long passages of legalese
that you click through after you download an app
or sign up for a new service, and that's where
a lot of the privacy concerns about health data are growing.
There's not only the risk that your data might get
breached in an illegal hacking operation or stolen by a

(21:06):
crooked employee. There's also the risk that it might get
shared or sold in a way that's not necessarily illegal
but isn't completely transparent, either. Facebook and Amazon can do
anything they want with your data, or any other
company that's not a covered entity can do anything they want
unless they have assured you in that fine print that

(21:29):
they won't. But since none of us read that fine print,
we'll never get around to suing. So under HIPAA, we have
certain rights, the right to get a copy of our data,
the right to know how it's being shared and when
it is shared improperly, and it requires healthcare providers to
keep our identifying data close, to not disclose it without

(21:49):
our permission. We don't have those rights over the data
we give to some app we download, or a new
fitness device or a social media service. We don't have
those rights over our credit card purchasing data
or our online searches. Partly because we don't have those rights,
sometimes our names and contact details wind up for sale

(22:10):
on data brokers' lists labeling us as diabetics or dementia
sufferers or victims of domestic violence. Right now, the law
doesn't do a very good job of making companies be
really clear about what they're doing with our data and
making sure customers are okay with it. So what should

(22:30):
we do? I think it's a really good question, and
it's a tough question. Here's Iliana Peters, the attorney and
former HIPAA official. Trying to decide what's best for all
industries with regard to the privacy and security of data
is extremely difficult. I think, certainly there are some things
we can all agree on, and maybe that's where we
need to start. Certainly, I think individual rights is one

(22:52):
of those things, you know. I think everybody should have
rights to their own data and should be able to
be at least participatory in how their data may be, um,
used or disclosed, why it should be deleted, how that
should happen, um, you know, when they can get copies
of it, how that should happen. One possible model for

(23:12):
people looking to improve privacy policy in the United States
is a new law that recently took effect in the
European Union. It's called the General Data Protection Regulation, and
it strengthens privacy protections for consumers. It covers all sorts
of personal data, not just healthcare. The law makes companies
get more explicit consent from people about the data they

(23:33):
want to collect. It also gives people a right to
get a copy of their data, and it's supposed to
give them more control over what happens to it. The
United States doesn't have anything like it yet, and there's
no clear path to passing a new umbrella privacy law
in the US anytime soon. That means that even companies
trying to do the right thing don't have good standards

(23:55):
to follow. Pam Dixon, the privacy advocate, said we could start
by creating a set of standards that companies adhere to
voluntarily. That would give consumers more trust in how their
data is being used. So ideally, what I'd like to
see at a minimum, is some kind of structure that
allows for, um, privacy standards to be built. Is there

(24:19):
a privacy standard we could write for health data outside
of HIPAA? I think there is, and I think we
could find a lot of agreement amongst the stakeholders. As
I said, I think there's a lot of people who
want to do the right thing. It's just there's not
a standard yet. In the meantime, what can we do
as individuals to have more control over our data? First,

(24:40):
you can exercise the rights you already have under HIPAA.
Pam recommends everyone get a copy of their medical records
from their providers. If someone tries to steal your identity
later on, it will be important to have your original files.
If you have kids, get copies for your kids too.
You can also pay attention to what you're agreeing to

(25:01):
when you start using a new app or service. Here's Iliana.
I read everything before I click 'I accept,' but I
realize that I may not be the typical user. Pam
also recommends simply asking companies what data they're collecting and
what they're doing with it. You know, sending an email
to, um, an app developer and asking what happens is

(25:23):
always a great idea. I do that all the time.
If they don't email me back, I delete the app.
I'm a reporter, so maybe I'm biased about this, but
I think asking questions is a good way to show
the people we're trusting with our data that we're paying attention,
that we care about what happens to it, and that
we want some control. I spent about twenty minutes on

(25:44):
the phone with my insurance company. Most of the time
I was on hold. Hello? Yes, thank you so much
for patiently waiting. She was really friendly, and

(26:05):
eventually she gave me the address of the privacy office
where I could send an email to request an accounting
of disclosures, one of my rights under HIPAA. I wrote
to them in April. At the end of May, they
sent me a letter that described how my health information
was released. Anthem said they're required by law to send
my claims records to a database run by the state

(26:26):
Health Department. The letter also said that my name, date
of birth, and contact information were exposed in a cyber
attack in twenty fifteen. Anthem was hacked in a breach that
compromised data on seventy-nine million people. It was the
largest recorded health data theft in US history. Anthem paid
a sixteen-million-dollar settlement last year over potential HIPAA

(26:50):
violations related to the breach. The company did not admit
liability as part of the settlement, and just in May, two
Chinese nationals were indicted in the crime. The Justice Department
called them part of an extremely sophisticated hacking group operating
in China that targeted US businesses. We got in touch

(27:10):
with Anthem about this. A spokeswoman there said the company
is committed to safeguarding customer data and there's no evidence
that the information stolen in the cyber attack resulted in
fraud against customers. So I know my data is out there,
along with millions of other people's. I don't feel great
about it, but at least I know. I'm more worried

(27:32):
about what I don't know. And that's it for this
week's Prognosis. Thanks for listening. Do you have a story
about healthcare in the US or around the world? We
want to hear from you. Find me on Twitter at

(27:54):
the Cortez or email M Cortez at bloomberg dot net.
If you were a fan of this episode, please take
a moment to rate and review us. It really helps
new listeners find the show, and don't forget to subscribe.
This episode was produced by Lindsay Cratterwell. Our story editor
was Rick Shine. Special thanks to Drew Armstrong. Francesca Levy

(28:15):
is head of Bloomberg Podcasts. We'll be back on June
with our next episode. See you then.