
July 6, 2023 30 mins
Close to 84 million people use some type of healthcare app, and many don't know that health-related mobile and web apps are not bound by HIPAA, leaving room for risky data-sharing practices. Virtru is a company that works in data security and privacy protection. On today's episode, we're talking to Mishi Choudhary, the General Counsel for Virtru, to learn ways you can protect your information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:08):
Close to eighty four million people use some type of healthcare app, and many
don't know that health-related mobile and web apps are not bound by HIPAA,
leaving room for risky data sharing practices. Virtru is a company that works in
data security and privacy protection. On today's episode, we're talking to Mishi Choudhary,
the general counsel for Virtru, to learn what ways you can protect your

(00:30):
information. So welcome to the show. Thank you for spending some time with
us today and answering all of our questions about the legality of these healthcare apps
and mobile apps. Very happy to be here. Unfortunately, we don't have
the kind of legal protections we think that we do when we use these various

(00:50):
apps we rely on for health information. I think one of the major assumptions we've
had is that, just like when we go to a clinic and we are covered by HIPAA,
which protects our sensitive information, we all assumed that when we rely on
these apps for our mental health or just to track various metrics in our bodies,

(01:15):
we would also be protected. But unfortunately, we really as citizens don't have
that right, and we need to work a little harder to demand better regulation
and better laws. Yes, I agree with that. Now, let me
clarify with you, if that's okay, because I know certain hospital systems have their
own app and the doctors will say, this is how we communicate with you.
We'll send you your test results through this app, and, you know,

(01:38):
you can schedule your appointment through this app. Are those protected under HIPAA?
So that's a great question. Actually, that's a gray area right now,
and the apps themselves are not covered. And that is why the effort
right now is to tell the doctors as well as the hospitals who

(01:59):
are asking patients that this is the way they are collecting information, to clarify
to patients whether the collection and sharing of information they are doing
is covered by HIPAA or not. They have to
clarify. However, the law right now says that apps are not covered,

(02:23):
but it's only the face-to-face interactions with doctors and hospitals that are covered.
That's the status of the law. Some of the people who think that the
gray area means we need to be more watchful of how we interact with our
patients are leaning towards ensuring that they follow the protocol, but the law does

(02:44):
not require them to do so. Oh wow, that's crazy. It is,
right, and that's why the US Department of Health and Human Services had to
update their guidelines on cell phones and health information, confirming that the HIPAA Privacy
Rule does not apply to most health apps. And we were all under that

(03:05):
assumption, well, that the federal law creates a privacy rule for our medical
records during the flow of healthcare transactions. But then, once we got this clarification,
as lawyers as well as patients, the entire interaction and effort became to
push towards more clarity and ensuring that our expectation about our data face to

(03:30):
face also translates online. Wow. Now, I know one example of
this going south, I guess, or going sour, involves period trackers, menstruation
trackers, and I know there was already a case about it. Do you
know the details of that case? There have been a few such apps where,

(03:53):
of course, a lot of women now rely on tracking ovulation cycles as well as
other information which period tracking apps enable. And there have been a few instances
in different states where we have at least noticed that, because these apps are
not covered by the law protecting that information, either law enforcement or others

(04:17):
have asked for information about those, and because of the heavy implications of them,
there has been a lot of concern. In fact, the House Oversight Committee
also submitted letters to data brokers and health companies to understand what information and documents
were being collected and sold from that reproductive health data. That has happened quite

(04:42):
a bit, and that's why there is also a lot of call for not
only clarification, but for such apps to not be sharing any kind of data for
advertising. Where I work, Virtru, our company is a cybersecurity
company and makes a secure data platform. What one of our engineering teams did,

(05:05):
and it was a women-led engineering team, was to use our technology as the base
technology to create an app which protected end-to-end privacy, did not collect
any data, did not share any data, and we were, of
course, very proud of making such an app. And I think that's why

(05:26):
we want to push such products and make demands for such products, where
anybody who wants to rely on an app for their ovulation cycle, for understanding
what their body is going through, can do so without worrying about the fact
that either law enforcement will have access to that data because they're in a certain
state, or some advertiser is going to bombard them with various kinds of advertisements

(05:51):
that they think are the right ones to target such a user. So I
think, at least in the market, because now people are realizing what this means
and how invasive certain of these technologies can get, that's why there
are products emerging in the market which can offer us that kind of

(06:14):
protection. But the law still falls much behind where we would expect it to be.
Yes, definitely. Now, you're talking about advertisers wanting to buy the information.
That just sounds crazy to me too. Like, what do they use
the information for? Does it have my name attached? I
mean, how much information are they getting? So it really depends upon which app

(06:39):
we are using. But what we have seen is collection of metadata, which
is about your usage of the app. For
example, if you were on a mental health app, that could mean
information about how long you were on the app, how long your sessions are,
when you log in, what time you send messages, your approximate

(07:03):
location, how often you open that app, and that's just the metadata.
Then there is, of course, the intake questionnaire that needs to be filled out on
some prominent services in order to be matched with the right provider. These
questions also cover extremely sensitive information: gender identity, sexual orientation, age, mental

(07:26):
health history, sleep habits or medications, etc. And sometimes, at least,
we have seen evidence of the fact that these intake answers are also found to
be shared with some analytics companies. These apps have been found to share data
with analytics companies and data brokers, and Facebook or Google have also been found to be

(07:51):
among the recipients of other information shared by these apps. And they usually
justify this by saying that the data is anonymized, but we've also seen that such
anonymized data can be easily connected to a person when combined with other information.
And that is why I think there is increased attention being paid to privacy

(08:13):
plus also the security of some of these apps: even if they're not deliberately
sharing data, whether they're secure enough that other people cannot dip into that data.
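The re-identification concern described here is often called a linkage attack. A minimal sketch with entirely invented data (no real app, directory, or person is referenced) shows the idea: an "anonymized" export that keeps quasi-identifiers like ZIP code, birth year, and gender can be joined against a public directory that still carries names.

```python
# Toy illustration of a linkage attack: "anonymized" records that retain
# quasi-identifiers (ZIP, birth year, gender) are joined against a public
# list mapping those same attributes to names. All data here is invented.

# "Anonymized" export from a hypothetical health app (names removed).
health_records = [
    {"zip": "22901", "birth_year": 1988, "gender": "F", "condition": "anxiety"},
    {"zip": "23220", "birth_year": 1975, "gender": "M", "condition": "insomnia"},
]

# Publicly available directory (e.g., voter rolls) with the same attributes.
public_directory = [
    {"name": "Jane Doe", "zip": "22901", "birth_year": 1988, "gender": "F"},
    {"name": "John Roe", "zip": "23220", "birth_year": 1975, "gender": "M"},
]

def reidentify(records, directory):
    """Join on the quasi-identifiers to attach names back to 'anonymous' rows."""
    index = {(p["zip"], p["birth_year"], p["gender"]): p["name"] for p in directory}
    matches = []
    for r in records:
        name = index.get((r["zip"], r["birth_year"], r["gender"]))
        if name is not None:
            matches.append((name, r["condition"]))
    return matches

print(reidentify(health_records, public_directory))
# Each tuple links a named person to a supposedly anonymous health condition.
```

The sketch works because combinations like ZIP plus birth date plus gender are unique for a surprisingly large share of people, which is why a bare "the data is anonymized" claim deserves scrutiny.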
Yeah, yeah, that's been, I think, the concern since the Internet
really became more and more predominant: making sure that any and all

(08:35):
information that's across it is secured from hackers and people that, you know, wish
people harm. And then we have these apps, and I think we have
a tendency with the apps to forget that they're just as vulnerable, you know,
because we have this false sense of security that, oh, well, it's only

(08:56):
on my phone, no one can, you know, they're not going to hack
my phone specifically. So yeah. And how long have you been working on
virtual crimes and being an advocate for that? Oh, that's a very long
time now. And I would say that I started getting very interested in technology

(09:20):
in about two thousand four and five, the initial part of my legal career.
And I will say that when Facebook started and it was only available to
some of the Ivy League institutions, I got an account as a law student,
and it was very fascinating and interesting. But I also wrote a paper in

(09:41):
law school exactly about collection of data and what it means to be watched all
the time by various people. So I think that my path to getting into
this was already set even when I was in law school. And so now,
for a very long time, as I said, since two thousand five
and six, the view mostly has been: if we don't understand or control

(10:03):
technology, then other platforms and technologies will control various aspects of our lives.
Today, you and I both rely on several aspects of the internet, apps,
etc., and they've really made our lives much easier, whether it's
paying bills, doing banking or shopping, or, as we are

(10:26):
talking about, tracking our health data. I think my interest has always been
that the convenience and comfort of technology should be available to customers without worrying
about who's in the middle and whether they are being watched or any data is

(10:46):
being collected which can be misused in various ways later. And my entire career
has been about digital rights, because technology is pervasive and it's permeated every single
minute I spend today in my life. But I also don't want to be
worried about rights and harms and other such things. And I thought, so,

(11:09):
well, I'm a lawyer. I can worry about these things on behalf of
society and work with other people to ensure that we can build products and also
change policy and law to ensure that I get to share all the pictures I
want, but without worrying about where they will end up. Exactly, exactly. Now, to that
point. I mean, even as you talk about technology being pervasive, even this

(11:33):
interview, I'm doing it using technology. I don't have to use tape the
way people did, you know, even before my time in radio and everything.
You know, I wouldn't be able to do my job at all without
technology. Everything is on and through a computer, and it does
make for convenience. I feel like, because these newer generations have never known life

(11:56):
without technology, they don't seem to be nearly as concerned about sharing their information.
And they're like, healthcare data? Fine, look at my healthcare data.
You know what I'm saying? They're kind of nonchalant about it. Can
you express to people who may have that attitude what the dangers are

(12:16):
of not having, you know, obviously there's identity theft, but there are other
dangers as well when not having this data secured, correct? Correct. I
think that's a great way to think about it. I will say one thing:
in my work, I also was always under the assumption that the young,

(12:37):
as you rightly say, are such digital natives that their pictures are on
various social media platforms, even at an age where we didn't even know how
to use many of these things. But I think that they really like to
control what data they share and with whom. So if I have the agency

(12:58):
and the option to say, I want to put out a TikTok video, but
I do not want someone else to be recording information or sharing it with somebody
I do not know, that distinction is becoming much clearer even in
the younger generation's heads. They want to have all the advantages, but they also
want the privacy and the security part of it, which you and I perhaps

(13:22):
have had much more sensitivity about. Now, the way it happens
is, because of the pervasive nature of technology today, things might seem like, oh,
I'm just sharing a video, it's only a dancing thing, it's only
a bunch of stuff here and there about me. But they are also joining
the workforce, and we also see a lot of analysis being done based on

(13:45):
what information is found about people online. We've also seen impacts on insurance costs if,
for example, a lot of health data is not
being protected and is being shared with third-party companies which should not
have access to such data. That's why, in the first place, we decided

(14:07):
to have HIPAA. We decided to have the Privacy Rule. For people who
perhaps went through certain addictions, there is the confidentiality rule we are supposed to
maintain for all of time. That can impact job prospects; that can impact
the price of insurance for each person. Nowadays, we all get

(14:28):
something which is tailored specifically to us. So if people are getting access to
such information about other folks, a lot of these decisions, how they are
being made and what is impacted by these decisions, is also going to be determined
based on the information people have shared online, and that's why we are now

(14:50):
more concerned about this information. Gender identity, sexual orientation, these are
all sensitive questions that determine a variety of viable options in our life:
what we might want, how we want to live, where we want to live.
These days, all of these things matter a lot, and that's why

(15:11):
I think that being aware of what this might lead to is
important. I will also say one thing: technological progress moves very fast,
and I do not know if you've already spent some time, like everybody, on
ChatGPT and how people are using a variety of information to create something. We

(15:37):
do not know what might be offered tomorrow. And because most technologies are data
hungry and they're based on the data collected by, for, and about people,
that's why we do not know what it might actually lead to in the future.
I don't think ten, fifteen years ago we were able to predict that social

(15:58):
media was also going to be that divisive. We all thought it was democratizing
a lot of interaction. It was also connecting people in ways that we had
never been connected. All of those positives exist, but we've also seen the downside
of technology, which was not predictable at that time. And that is why

(16:18):
not only the current terms do we need to be watchful for, but there might
be certain things in the future that we may not even have thought about, which
may come and present themselves in a way that impacts certain aspects and decisions in
our lives which we would have preferred to have more control over. Yeah,
the thing that keeps running through the back of my head is a virtual witch

(16:41):
hunt, if you'll pardon that expression. But I mean, you know,
back in the day, the Salem witches. I remember one of the tests of
being a witch, if you were suspected, was they're going to hold you under water,
and if you save yourself, then you're definitely a witch. But if
you die, well, then maybe not. Well, now you're dead. And
I can see that being something that occurs with this virtual information. You know,

(17:07):
I hate to be that way, but we're humans, and we have never
done everything right, you know what I'm saying? And I don't see why
anybody believes that that's going to change anytime soon. And so, you know,
the next big mistake in humanity might be a virtual witch hunt, where, based on
whatever data was shared. You know, you may have shared it, like
you said, innocently, but now it's a bad, taboo thing or whatever,

(17:33):
and, like you said, maybe you don't get jobs, or maybe you don't
get healthcare, or maybe, you know, maybe it gets worse than that.
I hope not, but you know, that's just a scary, scary thought.
And I don't think people think that through. I agree. I think
human beings are much better at actually inventing and creating tools than using them wisely.

(17:56):
So we come up with these tools, and then we put them
to uses which perhaps only our minds can come up with. So I also
think that the kind of changes that have happened in the past fifteen years have
been on an unprecedented scale throughout human history. Yeah, like two thousand seven

(18:22):
is when the iPhone comes, and we're in twenty twenty-three, and it seems as
if we have lived several lifetimes already. Oh yeah, yeah. And it
took time for all of us to first wrap our heads around the fascinating things
this technology was doing. And then, I mean, I have to say that

(18:42):
just from having phone conversations with people internationally, to moving to a video conversation
like the way we do these days, there was a time when that was
really not possible, right? And all of that stuff has happened, from the
fact that we were all on Facebook or some other thing, to now saying,

(19:04):
oh, that's just for some people, because every other week we think we see
something much cooler that emerges where everybody gravitates to. And we've also seen, as
I was earlier saying, from how good we thought it was for democracy, that
the Arab Spring was fueled by social media, to the fact of how divisive

(19:25):
these things have become for our democratic systems. It has happened so fast.
So it's also a little hard to blame ourselves and say, oh, we
are not watching for all the downsides, we are not being careful about all
of this. And that's why I feel that the responsibility lies so much more

(19:48):
on our policymakers, our regulators, our lawmakers, as well as people like
myself who work in this area, to say, you know what, we
spend a lot of time thinking about these issues. That's why we need to
force others to think about it, create more public education, and ensure that
the law is not so far behind, that policies are not so far behind, that

(20:11):
we are putting all the burden on an individual, so that I can blame you and
say, why were you not careful about what you shared? Why were you
doing X, Y, Z? As you rightly said, your entire job is
facilitated by technology. So why can't we force all our thinking in a direction
where products are such that they protect our privacy? They give us the

(20:36):
convenience. The law and policy are made in such a way that we are
protected without having to worry about the legalese or the complicated terms and
conditions all the time, and people can actually go about their
lives doing what is more important, because not everyone has time to think about these

(21:00):
matters. And that's why, when I joined Virtru, one of the most
interesting things for me was to enable and help a company which was building a
product where we were never collecting any data. As a lawyer, it's
very refreshing for me when my customers come and ask, can you tell

(21:21):
me what data you are collecting and with whom you are sharing it? And our
simple answer is, we have no idea. We don't collect any data.
We don't look into anything. We don't have access to anything you do using
our products. This is not our business model. We
don't need your data. And it's refreshing, and I think that's the direction we

(21:42):
need to move into: to demand better products, better law, better policy,
so that you can record this without worrying about who's listening to our conversation in
the middle. Oh yeah, oh, for sure. And with the invention of
AI, I mean, me personally, I've already seen plenty of information out there

(22:02):
that suggests, with three seconds of a person's voice, they can recreate your voice
in AI, and it sounds like you, and then they can try to scam
your family. Right, yeah. And it's like, that puts you at
risk, it puts me at risk, because we've been talking way more than three
seconds, you know. And it's like, how do you
protect that, though? I mean, obviously you're in the

(22:26):
business of protecting information, but we see all the time where there's constant breaches
of that. Do you think that the government needs to be spending more money
towards developing more protective types of technologies? I mean, are we behind the curve
in that? So, um, the answer is yes and no.

(22:49):
So we have seen far more interest and investment right now from the current White House,
as well as various other initiatives that have been announced, to concentrate on cybersecurity,
to concentrate on data-centric security as well as a model of zero trust.
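The zero-trust model mentioned here means no request is trusted just because of where it comes from; every single request has to be verified. A minimal, hypothetical sketch of that idea is per-request message authentication, where a service checks a signature on each request rather than trusting the network; the key and message format below are invented for illustration only.

```python
import hashlib
import hmac

# Shared secret provisioned out of band; in a real system this would come
# from a secrets manager, never a hard-coded literal.
SECRET_KEY = b"demo-key-not-for-production"

def sign(message: bytes) -> str:
    """Client side: attach an HMAC-SHA256 signature to every request."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, signature: str) -> bool:
    """Server side: verify every request, regardless of network origin.
    compare_digest avoids leaking information through timing differences."""
    expected = hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

request = b"GET /records/42"
sig = sign(request)
print(verify(request, sig))             # True: a correctly signed request passes
print(verify(b"GET /records/99", sig))  # False: a tampered request is rejected
```

The design point is "never trust, always verify": even traffic from inside the corporate network carries and proves its own credentials on every call.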

(23:11):
We've also seen much more awareness in society, as well as demand for better
rights, all across the world. Europe has obviously been in the vanguard of
coming up with regulations, etc., on these matters, but we are seeing a
lot of awareness here in terms of protection currently. Yes, we definitely need

(23:36):
to do a lot more. And I will also say, until all of
these things happen, both in terms of investment from the government, better laws,
better policy, I do think that there is some responsibility we as individuals have
to shoulder. And one of the things which I generally tell people is that

(23:57):
they do need to practice good cyber hygiene. One of the things, as part
of cyber hygiene, is that we can keep all our apps updated. Many apps
have security vulnerabilities, and once a security vulnerability is discovered, it is
patched, and we need to update to a newer version so that one

(24:21):
can benefit as soon as possible when those vulnerabilities are patched. The second thing,
and I keep saying this to everybody, is about passwords. Right now,
we are still stuck with passwords, so a password manager can definitely help,
because we just cannot be expected to generate unique passwords and remember

(24:41):
all of them, etc. So I always say, use a password manager if
you don't already. And then, for that one password manager, choose something strong
and unique, like a long sentence, a combination of letters
in a sentence, a dialogue from your favorite movie or a poem or your book,

(25:03):
etc. And that does help quite a lot, along with the usual two-factor authentication,
because then something comes on your phone and you confirm certain things. And
the other thing which I would say is that many apps we have to keep
on a strictly need-to-know basis. Many of your apps don't
need access to your camera, microphone, images, or location unless it's really necessary,

(25:27):
so we can not give that access. We can limit the ad tracking,
and we should always ensure that we are not connecting our social media accounts to
the apps by using them to sign in. I'm sure you get a mental health
app and then it says, do you want to sign in with your Facebook account or

(25:48):
Google account? One needs to be a little careful about how you establish that
sign-in. And these days there are easier ways to understand how to control your
own privacy settings. And if we were to spend just five minutes thinking about
that, it will go a long way in ensuring that our data collection is

(26:08):
limited and that we are at least doing what we can, what is under our control,
to protect ourselves online. I love that, I love digital hygiene. I've
never heard that term before, but I love it, and the tips you gave were
incredible. We're almost out of time, but I wanted to ask you,
is there anything else we need to talk about that I just don't know to

(26:30):
ask you about? No, you've been asking quite a few interesting and very helpful
questions, because that's exactly the kind of information we need to share. And
I will say that my deep interest is that getting people the healthcare they need
is, of course, a great thing, and the move to telehealth and medical apps

(26:55):
and other online health services, for everything from therapy to vaccine registrations, etc.,
is a welcome step to make it simpler and easier. But it has also
made apparent shortcomings in our privacy laws and how we protect patients. And patients
shouldn't have to trade away their privacy to benefit any corporate interests or any

(27:21):
kind of other interests for access to life-saving treatment. So if we can
just demand more privacy-protecting products as well as policy and regulations, then we
are going to continue seeing progress in the direction from which all of us can
benefit. I definitely agree with that. Is there any place that people can

(27:42):
go to get more information about this? So there is a project from the
Mozilla Foundation, a nonprofit which researches these things, which I
quite like. What they do is they rate a lot of health apps on
privacy and security every year, and it's presented in a very simple manner:

(28:04):
there are emojis to tell you how good privacy is, how good security is,
so you can go there and look at how the apps you are using
rate on that. I will obviously say that if you are concerned
about email security or sharing of documents that are sensitive, whether health

(28:26):
or financial documents, please check out Virtru. We are big believers in
practicing what we preach and in making products that protect privacy. That's another place.
And nowadays I am also seeing a lot of good material, in easily accessible

(28:47):
ways, coming out of nonprofits like the ACLU, Public Citizen, and the EFF, who are promoting a
lot of the privacy and security aspects of technology. Nice, that's awesome. I
think we've covered the subject matter pretty well. Do you agree? Yes,
I think so. Well, thank you so much for joining us today, and hopefully

(29:07):
we can have you back on the show later to talk about more interesting ways
to protect ourselves online. Sounds very good, Rebecca. Thank you for
your time. This has been a great pleasure. I hope you've enjoyed today's
show. Thanks for tuning into the show on your favorite local radio station.
If you have an idea of a topic you would like to hear on the
show, you can send me an email. The address is RebeccaHughes at
iHeartMedia dot com. That's R E B E C C A H U G

(29:30):
H E S at iHeartMedia dot com. Also, you can now listen to
this show or past shows through the iHeart app or on iHeart dot com. Just
search for Virginia Focus under podcasts. I'm Rebecca Hughes with the Virginia News Network,
and I'll be here next week on Virginia Focus.