Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha, and welcome to Stuff
I Never Told You, a production of iHeartRadio,
and we're thrilled to once again be joined by Bridget Todd,
the wonderful, the marvelous, the magnificent, and the very busy.
Speaker 2 (00:28):
We have missed you, Bridget Yes.
Speaker 3 (00:30):
I have missed you all too, my favorite authors. It's
nice to be in the company of some literary ladies.
Speaker 4 (00:43):
That sounded like a foreign language, like who are you
talking about?
Speaker 2 (00:50):
We just bring on guests. I'm here, yes, yes. Are you,
do you enjoy writing?
Speaker 5 (01:01):
I used to enjoy it.
Speaker 3 (01:03):
I was the kind of person who would labor over
every sentence, every word choice, and so writing a paragraph
would take me forever. It's one of the reasons why
I'm a podcaster, because I feel it's so much easier
to say it verbally. Once I started podcasting, I was
like writing, who I don't know her? Say it verbally.
But I used to enjoy it, Like I used to
(01:26):
like send pitches to freelance editors and stuff, but yeah,
much less lately.
Speaker 5 (01:33):
How about you?
Speaker 3 (01:33):
What is your relationship to writing? The both of you
now that you're going to be published authors.
Speaker 1 (01:41):
I love it. We've talked about it on the show.
Samantha and I have both done NaNoWriMo, which I
know makes a lot of, like, editors roll their eyes,
but done that, and I have recently, during the pandemic,
written a ton of fan fiction, like an actually shocking
amount, perhaps, which I've started publishing to, I'll say, rave reviews.
Speaker 4 (02:08):
It's true, she has fan art. I do.
Speaker 5 (02:11):
People made art.
Speaker 2 (02:12):
I got it.
Speaker 5 (02:13):
People like it, Yes.
Speaker 2 (02:15):
People like it, So I do love it. It is
something I enjoy.
Speaker 1 (02:17):
But I guess one of the things about NaNoWriMo,
and fan fiction to a lesser extent, is I was
kind of like you, Bridget, with the editing, and I
would get so in my head and like, this isn't perfect,
this isn't perfect. But NaNoWriMo is like, I don't care,
get it done, it has to happen. And then
you go back and maybe you get stuck in the
editing again. But it was a real nice thing for me
(02:40):
to do, getting out of my head and
just writing it.
Speaker 2 (02:44):
And now we're reading some of that stuff on SMNTY.
Speaker 1 (02:47):
And then with fan fiction, it's nice because it's like
literally people like use this as an argument against it,
and I'm like, no, this is an argument for it.
Like it's a hobby and it's fun. Like, I'm not
worried about, like, whether people will enjoy it.
Speaker 2 (03:01):
I like it.
Speaker 1 (03:02):
It's not gonna go anywhere, and that's fine, But I
just like doing it and I like writing, and it's
definitely kind of therapeutic for me.
Speaker 4 (03:11):
Right, I don't do a lot of writing anymore. My
form of writing has always been fiction. But I'm also
one of those where it's the mood. The way my
mood fluctuates is how it goes. And the more depressive
I am, the more likely I'm going to write, and
the more depressive I am when I write, the more likely it's
gonna be sad. So I try to stay away from
(03:31):
that, because I'm more fictional than anything else. I think,
for the most part, doing this type of work was
a whole different beast. Doing things like when we write
scripts and outlines and researching has been such a whole
different beast that it's kind of the only way I
do it now. I just kind of space out. I'm like, yeah,
I'm done. I'm not touching a pencil, pen, or a
(03:53):
typewriter or a keyboard. Typewriter, I'm that old. That's how
long it's been.
Speaker 5 (03:58):
Or the quill pen.
Speaker 4 (04:02):
Look, guys, I love typewriters. I love them so much.
I wish I could have like several, because I like
I would collect them, kind of like Tom Hanks does
if I had the money.
Speaker 5 (04:11):
I had one in college. I thought I was so hip.
I didn't use it. It was just
to, like, have and make me feel cool.
Speaker 3 (04:21):
You know, Annie, to your point about being able
to have outlets to write where you're not so in
so in your head about it. Like, that's something that
I think is really important about, like having a craft
that you're honing.
Speaker 5 (04:34):
I have a post-it note on my laptop that
I'm looking at right now that says.
Speaker 3 (04:38):
Do it. Because it's like, for me, so often
need to do something perfect or to produce something really
good just keeps me from doing anything. And so I like,
I think that we need to have outlets where the
point is doing it. The point is like actually putting
pen to paper or finger to keyboard and not really
(04:58):
being in your head and.
Speaker 5 (04:59):
Worrying about like is it perfect?
Speaker 3 (05:01):
And so I think, like, yeah, things like
National Novel Writing Month, having things like that that
kind of get you in the practice of just, like,
just focus on writing, execute, just.
Speaker 5 (05:10):
Write, don't worry about it. Edit it later. Can be
really really powerful.
Speaker 6 (05:15):
Right.
Speaker 4 (05:15):
They have a scriptwriting month too for people who
need it. Speaking of which, Annie has also other works
that are not necessarily published, but they have been out,
because she's done the Twelve Days of Halloween and
all of that. So she's got work out there, she has.
I feel like she's not known, but she
definitely has credit.
Speaker 2 (05:35):
I was just waiting for you to.
Speaker 4 (05:36):
Say it. I know, you need someone else to, so I got to.
Speaker 3 (05:40):
It's so funny how like it's so much easier to
like hype up a friend than it is to hype
up your own work.
Speaker 1 (05:49):
Yeah, oh my god. It just feels and I think
this is a lot of like conditioning we have. It
just feels like, as a woman, like, oh, I'm
drawing too much attention to myself, or, oh, like, what
if it's not that good?
Speaker 4 (06:01):
Right, like, it's not that good anyway, so you don't want
to talk about
Speaker 3 (06:04):
It, and no one wants to seem like a bragger, but you should be bragging.
We should all be bragging. My friend Meredith Fineman wrote
a book called Brag Better that's all about the importance
of women bragging. Brag on yourselves, everyone, Annie and Sam and
listeners.
Speaker 1 (06:21):
Yes, yes, well, speaking of, I know you've been up
to a lot of stuff, Bridget. Do you want to
give us any highlights of what you've been up to?
Speaker 5 (06:30):
What's happened?
Speaker 3 (06:30):
Well, we have relaunched season four of my podcast There
Are No Girls on the Internet.
Speaker 5 (06:34):
We launched it I think last week.
Speaker 3 (06:37):
That's right. So it's been one of those,
speaking of, like, doing it crappy and just, like, getting
it out.
Speaker 5 (06:42):
It was I don't know.
Speaker 3 (06:44):
I mean, I've been doing this podcast for years now,
but like got a little bit in my head and
was like paralyzed.
Speaker 5 (06:50):
With an inability to just like do the thing.
Speaker 3 (06:53):
Had some stuff to work through there, but you know,
it got done and people can listen to it.
It's ongoing now, you can check it out. So that
was really exciting. We wrapped another podcast with Next Chapter
Podcasts called Beef, all about historical beefs and rivalries, which
is a kind of side interest of mine. Listen to
(07:13):
all eight episodes. It was super fun to make. My
favorite was.
Speaker 5 (07:16):
Probably Ann Landers and Dear Abby. Did you know they
were sisters? I didn't know that.
Speaker 3 (07:22):
Yeah, they were like sisters who had this intense rivalry
because they were both advice columnists.
Speaker 5 (07:28):
Fascinating stuff. Definitely check it out. Yeah, exciting times.
Speaker 4 (07:34):
That's interesting. I wonder what, like, family events were like, because
you know, those were the times that you sit together
and kind of mull over whatever troubles you might have,
boyfriend troubles, and the both of them trying to give
you advice. How annoying would that be?
Speaker 5 (07:45):
I know, can you imagine my head?
Speaker 4 (07:50):
I'd be like, Okay, one was bad enough, but both
of you have to put in your two cents and
sound like your therapist. That's even worse.
Speaker 5 (07:57):
God, you know, Christmas dinners... I love that family.
Speaker 2 (08:01):
Geez, was their advice, like, similar?
Speaker 3 (08:04):
No, that was something, they had very distinctive advice styles.
One of them, her advice was, like, much
more kind of, like, traditional. The other's, her
advice was much more, like, progressive. They had
different attitudes, like, they accepted things like, you know,
gay rights and, like, having gay couples write
(08:26):
in for advice. They accepted those things on very different timelines,
like they had very different styles of advice. Yeah,
and it's fascinating. Uh, people should, like, if
Speaker 5 (08:36):
You were gonna, if you were interested.
Speaker 3 (08:37):
That's the episode, I feel, but that's the one that
I feel like is where I learned the most about advice.
Speaker 5 (08:44):
Well, I'll say one more thing.
Speaker 3 (08:45):
This is like, I'm going on way too long, but
I'm fascinated by it. Advice columns kind of
predated this, like, Internet-era idea of people publishing things
and then the readers being able to write in. Before
the days of the internet, in print, the only place
where people would publish something, and then the readers would
(09:05):
then submit letters being like, oh, I don't know about that,
or submit letters at all, and, you know, get advice,
was advice columns. And so the whole concept of a
readership being in conversation and in dialogue with a print
publisher, which is now, like, part of the Internet, really
got its start in advice columns. So fun fact, that's random.
Speaker 2 (09:25):
So were you Abby?
Speaker 4 (09:27):
Your team is Ann, you said? Ann Landers?
Speaker 5 (09:30):
I... what a good question. Do I identify with a team?
Speaker 4 (09:35):
If there's a beef, there's a team.
Speaker 5 (09:36):
Yeah, if there's a beef, there's a team. Exactly.
Speaker 3 (09:38):
At times I identify with both. I think that Dear
Abby definitely struck me as the pettier of the two,
and so I definitely saw a lot of myself in that.
So at times I feel I see both.
Speaker 5 (09:52):
I see myself in both of them.
Speaker 3 (09:53):
Put it that way. Oh, that's great. Yes, I
definitely need to check that out.
Speaker 1 (10:01):
Yeah, because for some reason, my Google updates all the
time to this day, like, give me these kinds of
questions people are apparently still asking advice columnists, and
when I read them, I'm like, this is wild,
this is a thing that is.
Speaker 5 (10:14):
Oh my god, I read advice columns.
Speaker 3 (10:16):
It's like one of my favorite things.
I read advice columns every day. It's funny because I
was just talking to someone about an advice column I
read yesterday.
Speaker 5 (10:23):
I think it was in.
Speaker 3 (10:24):
Slate where a woman had set her single coworker up
with a friend of hers and they went on a
date at a bar, and the coworker was like, well,
we can't go out again. And she was
like, why? And it was because her friend at the
bar ordered a glass of plain milk.
Speaker 5 (10:42):
And he was like, I can't see her again.
Speaker 3 (10:44):
And so the letter writer was like, how could I
convince him that he's being ridiculous and like, let her
order whatever she wants.
Speaker 5 (10:49):
She's a catch.
Speaker 3 (10:51):
The weird petty stuff that people write into advice columns,
it just I'm endlessly fascinated by it.
Speaker 2 (10:57):
Yeah.
Speaker 4 (10:58):
I feel like the update to that is the whole Reddit
thing, which I adore obviously.
Speaker 5 (11:05):
Oh, I love them. I love them. And, like, there
are podcasts where people just read them. Yes, I'm addicted.
Speaker 4 (11:11):
That's the entire podcast is they just read it and
then they talk about it. I'm like, wow, that's simple.
Can we do that? Yeah?
Speaker 5 (11:19):
No, like, six pages of research.
Speaker 4 (11:20):
Oh no, no, I need twelve pages of things and reading
horrible things, thank you. Yeah.
Speaker 1 (11:28):
But it also just underscores like how people are doing
all kinds of things that I'm shocked by all the time.
Speaker 3 (11:36):
Yeah, it really is, it's so true. And, like, reading advice
columns and all of the, like, 'Am I the...' posts, like, humans
are fascinating. We are, we have fascinating motivations. Our
behavior is really fascinating. Like, humans and
our behavior and our range of behaviors will
Speaker 5 (11:55):
Never cease to surprise me.
Speaker 3 (11:56):
I'll put it that way, both in negative ways and
positive ways.
Speaker 6 (11:59):
Yeah, yes, yes, yes. Well, I guess kind of, like,
segueing into technology and how it's changed how things are happening
(12:23):
and how we operate.
Speaker 1 (12:26):
The topic you've brought today is so important, and I
know it's a question that a lot of people have,
So what are we talking about today?
Speaker 3 (12:34):
So this is an important topic. You know, it's
been almost a year since the Supreme Court overturned Roe
versus Wade. And after that happened, pretty immediately, one of
the biggest issues that came up was this connection between
technology and abortion access. You know, like should I delete
my period tracker app? Like what like can I google abortion?
Am I going to go to jail? And so namely
this idea that all of the data from our smartphones
(12:57):
could and would be used to prosecute abortions. So, you know,
back in the day, in the days before Roe in
nineteen seventy three, you know, it was a rough landscape
for abortion access. Obviously, we've made some really good progress,
you know, because of the expansion of things like pill
based abortion and you know, the ability to get pills
(13:18):
by mail. But in the days before Roe in the seventies,
we didn't all carry, like, GPS devices in our pockets
or have this vast virtual network of mass surveillance that
could help criminalize people who need abortions and the communities
that help them.
Speaker 5 (13:32):
And I am.
Speaker 3 (13:33):
Sorry to say it, but Google, the thing that most
of us use for our email, for our maps, and for
some of us, our cell phones, if you use an Android or
a Google phone, is a huge part
Speaker 5 (13:45):
Of that network. And so today I want to talk
about some.
Speaker 3 (13:48):
Of the ways that Google has been collecting data related
to abortions, what Google has publicly said, what they're actually doing,
and some recent movement in news in that arena.
Speaker 4 (14:00):
You know, I haven't really been thinking about it, because
we've talked about, again, as you were talking about, the
period tracker and then Facebook, because we already know some
of the headlines that it's made in what they've done
and how they have helped the anti-abortion movement so
much in the whole right-wing conversation. But I
really haven't thought about Google, even though I should, because, like,
obviously we know Google mines a lot of data, and
(14:22):
there's still conversation about how much information they keep and
they spread. But I haven't really thought of it on,
like, par with what's going on with abortion. So can
you kind of explain what we're talking about here?
Speaker 3 (14:36):
Totally. So to what you just said, Sam, I've noticed
the same thing. I think that when it comes to
a lot of online harms, people think of Facebook, people
think of Twitter. Of course, Google is an interesting case.
I do think that anecdotally, Google has been able to
skirt a lot of public scrutiny about the way that
they handle some of this stuff in a way that
(14:56):
other platforms have not been able to skirt that kind
of scrutiny.
Speaker 5 (15:00):
I think a lot about this. I think it's because.
Speaker 3 (15:02):
Google offers a useful product for the world, right like, like
Google does make platforms and tools that are helpful to us. Facebook,
I feel like, is different in that, like, all they
really offer is, like, Facebook, WhatsApp, and Instagram.
Speaker 5 (15:21):
Some of those are, you know, globally, some of.
Speaker 3 (15:22):
Those platforms are like really useful, but here in the
United States, I feel like if Facebook disappeared, the tool
that they offer is not really as much of a
global good as Google. And so I think that for
that reason, I think that maybe that's why Google doesn't.
Speaker 5 (15:37):
Come to mind when you think about these kind of harms.
Speaker 3 (15:39):
But the reality is that Google is really a
major player when it comes to the way that our
data can be misused, specifically to criminalize abortions. And so
shortly after Roe was overturned by the Supreme Court almost
a year ago, Google said that it would proactively delete
its location data when people visited, quote, particularly personal places,
including abortion clinics, hospitals, and shelters. Their statement said,
(16:03):
today we're announcing that if our systems identify that someone
has visited one of these places, we will delete these
entries from location history soon after they visit. This change
will take effect in the coming weeks. And so it's
really that last part of their statement that is important.
Like what does the word soon mean? Like my definition
of soon and Google's definition of soon might be two
(16:25):
different things. What does this will change will take effect
in the coming weeks? It's been almost a year, Like,
is that technically that is a matter of weeks? So, like,
is that what Google is defining as a couple of
weeks wouldn't be how I define it? And so this
was something that Google publicly agreed to do.
Speaker 5 (16:43):
They didn't.
Speaker 3 (16:43):
They will tell you that they just like decided to
do this as a public good on their own.
Speaker 5 (16:49):
That's not exactly true.
Speaker 3 (16:50):
There definitely were like digital rights groups behind the scenes
pressuring them to make this choice. But at any event,
they did, and they enjoyed a lot of positive, glowing
press for doing this. However, it's been almost a year
year and Google has not really followed up in any
kind of a meaningful or consistent way in their promise
to delete this location history if they think that someone
(17:13):
is going to an abortion clinic. And that is really
concerning because we do live in a landscape where abortion
is so easily criminalized and your digital footprint can be
used to create evidence that you have tried to illegally
access information about abortion.
Speaker 1 (17:28):
Right And the thing is like a lot of times
you're using like maps or GPS to get to any place,
like to get to an abortion clinic. And I will
say about Google, like, there are a lot of things
that you don't realize it's tracking, even if because I'm
somebody who's like I go in and I turn off,
Like you're not listening to my voice, you're not doing this,
(17:49):
you're not doing this, it's still doing it, Like I
can't figure out how, but it is. And once I
found a map of like everywhere I'd been, even though
I don't really use GPS that much, and I think
it's because of like emails I get it's like, oh,
you're going on this trip or oh you're doing this
thing Like I'm trying to figure it out but I can't.
But it was creepy, like you can go and find
a map of everywhere you've been, and it is frightening
(18:12):
when it can be used in this very scary landscape,
in a case against abortion or something, in a
case about abortion. So, like, sometimes when you
talk about stuff like this, it can feel very, as
we've said several times on this show when you've been
on like oh that's an Internet thing.
Speaker 2 (18:31):
I don't see it, so it's not.
Speaker 1 (18:33):
Real, but it very much is, and there is data
around this whole thing right totally.
Speaker 5 (18:41):
Totally. So this is not an Internet issue.
Speaker 3 (18:43):
This is not an abstract issue or a hypothetical, far
away down the line, maybe one day issue. This is
an issue that's happening to people in their real lives
right now. So, just to set the scene, according to
Google's own transparency report, they have already been subpoenaed several
times for users' data. In the second half of twenty
twenty one, they received eighteen thousand and thirty seven subpoenas
(19:05):
and twenty three thousand, nine hundred and twenty four search
warrants for user information. So it's pretty reasonable to conclude
that prosecutors in states where abortion is illegal might already
be making requests for personal data from Google to prosecute
those looking for abortions. Also, in terms of like where
we've already seen this happening, we already know that people
(19:25):
have already been charged with obtaining abortions or helping
somebody obtain an abortion, with their digital footprint
Speaker 5 (19:30):
Being used as evidence.
Speaker 3 (19:32):
You might remember that when Roe was overturned Mark Zuckerberg,
when he was asked about it, he said that he
was hoping that encrypting Facebook's messages would help protect people
from quote overbroad requests for information, but that did not
stop the company from handing over user information. In June,
before Roe was overturned, Facebook turned over the private messages
(19:54):
of a mother and daughter facing criminal charges for allegedly
carrying out an illegal abortion, and so, well.
Speaker 5 (20:00):
This is happening now. It is not abstract.
Speaker 3 (20:04):
People are currently embroiled in legal situations where
they're facing jail time because tech companies, like, gave
over sensitive information that they had from folks who were
allegedly looking up information about abortion or talking about abortion.
And so the reason why we have some insight into
(20:25):
this is because of a new report from the digital
rights organization Accountable Tech. They did a couple of different
experiments to get a sense of how Google is tracking
and whether or not they're actually deleting this location information,
and they're just not. So they said that they were
going to and they are not. Like there's not any
(20:47):
other way to interpret kind of what's going on here.
Speaker 5 (20:50):
You know.
Speaker 4 (20:51):
And as you're talking about it, I just realized I
get monthly information about where I went this month,
like, as if this is like, hey, here's your monthly review,
look at you, you traveled these places, as if to celebrate.
And now that I'm thinking about it, like, wait, what
if I don't want to remember where I went the
last month? What is happening? And I don't remember ever
(21:12):
saying, yes, I want this information, yes, I want it
to be recorded, and yes, I want it given
back to me.
given without me even prompting or asking for it, I
can't imagine what they actually have. And honestly, I can't
imagine how they decide what to delete if they're doing
specific locations, like how that would even come up, as
well as the fact that, yeah, if they have pinpointed
(21:33):
specific locations that would be easily trackable and easily used
against someone, right.
Speaker 3 (21:39):
Totally. And Sam, your point is such a good one,
and I think it, like, provides a good,
zoomed-out understanding of the issue, right? Because I get
those too. Like, I've talked about this on my own
show, about the Spotify Wrapped, which I always enjoy getting.
But all of these ways that we have been conditioned
(22:00):
to believe that surveillance is a good thing, Like oh,
I want to get a summary of all the different
places I've traveled in the last month, and it's,
you know, all these pictures of where I was set
to music, that might feel exciting to get. And I'm
certainly susceptible to being excited by information about how I've
lived my life, but that is digital surveillance, and
(22:20):
the way that we've been conditioned to think it is
not just commonplace to have our devices and our platforms
tracked and surveilled by big tech companies and used for
profit by those companies, but that we should be happy
that it's actually like cool and fun to see this
information about ourselves and get these little, you know, summaries
(22:41):
of what we listened to, where we went, what things
we bought, what we ate all of that, Like, there
really needs to be a fundamental reassessment of the relationship
between users and tech companies when it comes to our data,
because we deserve privacy.
Speaker 5 (22:56):
Privacy should be the standard.
Speaker 3 (22:58):
But even if you dress it up with a little
music or a little newsletter or whatever, that is still
surveillance, and we should ultimately be questioning that, like,
that fundamental relationship between user and platform. That's my soapbox
about surveillance. Like, I get upset about it because
it gets me too. Like, even though I know
all of this stuff and I feel very strongly about it,
(23:18):
when I get a pretty, you know, breakdown of what
I listened to at the end of the year, I
like it. And so it's so insidious how we've been
trained to like things that ultimately may not be good
for us.
Speaker 1 (23:30):
Yeah, yeah. I was thinking about this the
other day, because when I was in third grade, I
wrote this, we were supposed to tell a story, and
I wrote this story called The Right to Be Forgotten.
I was like, what was going on with me? Great title,
I know. But now, I have to say, it became a case
against Google, where people were like, I don't want to
(23:53):
be remembered in certain ways where you can search me
on the Internet, and it was a case with the EU.
And so I revisited it, like, a couple of years
ago as an adult, and I did, like, the Right
to Be Forgotten.
Speaker 2 (24:07):
And people afterwards were like, what are you talking about?
Speaker 1 (24:10):
That kind of idea of, like, maybe I just
don't want people to know there are certain parts I
just want to be private.
Speaker 2 (24:16):
Oh my god, she is a time traveler, everybody.
Speaker 3 (24:20):
It is wild that you're saying this. Literally yesterday I
did an interview with a UCLA Internet researcher named Olivia
Snow, and we were talking, we were specifically talking about
the right to be forgotten when it comes to platforms.
I cannot believe that this is a concept that you
have been you know, grappling with since you were a child.
Speaker 4 (24:42):
She does this a lot. There are things that pop up
and she's like, that was my idea. I was like,
what, at ten years old?
Speaker 1 (24:52):
When I was nine, I was doing that was like
the year I wrote out like all the like top
things I wanted to do before I died, Like.
Speaker 2 (24:59):
I don't know what was going on. It's okay.
Speaker 1 (25:06):
Well, going back to that study you mentioned, the Accountable
Tech thing, they did run a couple of specific experiments, right? Yeah,
it's really interesting.
Speaker 3 (25:19):
I'm going to kind of gloss over it, but folks
should definitely check out the study.
Speaker 5 (25:22):
You can find that online.
Speaker 3 (25:23):
But essentially they did, I think, three different experiments where
they had a staffer buy an unused, totally new Android
smartphone and then start a new test Google account, and
they accepted all the default privacy settings, and they had
the staffer travel from one state to another to different
(25:44):
abortion providers. So in one, they traveled from Columbus,
Ohio to Pittsburgh, Pennsylvania, and wound up at, like, Planned
Parenthood of Pittsburgh, to see, you know, later how
that data would be handled. In one test, a staffer,
when they logged into the test account that they made
on her browser... so, like, she took her...
Speaker 5 (26:04):
She took the phone to the clinic.
Speaker 3 (26:05):
And then when she logged on in the browser, I
think thirty days later, under the Web and App Activity
tab, after she completed this test, she found a Google
Maps search query from the abortion provider that she visited
stored on her account. So, like, they're just not deleting
the data. Like, there's no other way
to interpret that information. And they did these tests in
(26:27):
three different ways, and each time there was some information
that continued to be stored on these Google accounts. And so,
according to Accountable Tech, by retaining both location search query
and location history data, Google jeopardizes the health, safety, and
legal status of the users who visit reproductive care facilities
in states where abortion is criminalized.
Speaker 5 (26:45):
If prosecutors in a state with.
Speaker 3 (26:46):
A restrictive abortion law receive a tip about someone seeking
an abortion, a subpoena would likely force Google to hand
over this sensitive data. And again just to underscore that,
Google said that they were going to delete this data
soon after a person visited what they said was
one of these sensitive sites, including abortion providers.
(27:07):
And so in this test that Accountable Tech ran, thirty days
is apparently how Google was defining soon in this situation.
Speaker 5 (27:15):
Right?
Speaker 3 (27:16):
Like I would argue that a month is enough time
to have deleted this, but perhaps Google disagrees. And so
it does really feel like this is a situation where
they're I mean, I'm no lawyer, but misleading the public
saying publicly they're doing one thing and then doing another
thing in actuality.
Speaker 5 (27:34):
That could get people in serious legal trouble, you know.
Speaker 1 (27:38):
Right. And that's the thing, is like, I know
we're going to talk about this later, but if a
company is telling me a thing, you know, I think
a lot of people use the word, like, they call
me naive, which I don't think is fair. But like,
if I'm like, oh, I trust that because that's what
they said, and then it could be used in a
criminal case despite what they said, like, that feels both
(28:04):
wrong and illegal, but also it's very scary. Like, that is
a very frightening scenario.
Speaker 3 (28:11):
Exactly, and I think it really underscores how scary and
dire and complicated things can be for someone who is pregnant, right,
Like, it can already be a scary, intense time.
But adding on to this vibe of, you can't trust
these companies who say they are going to do
X to keep you safe, a little bit more of, you
(28:33):
can't trust what they're saying, it just adds to this
complex web where it's difficult to know who you can trust.
Speaker 5 (28:40):
And so I should note that today, according to
Speaker 3 (28:44):
The Washington Post, most criminal charges for abortion stem
from a human telling the authorities, not just from your
digital footprint being scraped by Google or something like that.
But we know that your digital footprint can be used
as evidence. And so you know, we live in this
climate where vindictive ex-lovers, nosy neighbors, or even
just self appointed vigilantes who may have no connection to
(29:07):
someone who is pregnant at all, all of these people
can be threats. In Texas, a man filed a wrongful
death lawsuit against three women for allegedly helping his then
wife obtain pills that allegedly were used to induce
an abortion last year. If you're someone who is looking
for an abortion in a state where it's been criminalized,
not only do you need to think about who you
(29:28):
can trust, who has access to this information about you
who's in your community again, like even just like your
nosy neighbor next door who doesn't like you for no reason.
But also on top of that, you need to think
about your digital security. You need to think about how
you are accessing information and data online, and you need
to think about whether or not Google is being upfront
(29:48):
about what they're publicly saying about how they use that data.
It is an incredibly high burden that nobody should have
to deal with. It's already hard enough to be in
this situation. On top of it, you should not, as
a regular person, have to be parsing lies from a
platform like Google to make choices for your life and
for your health.
Speaker 1 (30:21):
I know that you came on and talked about it,
I think pretty soon after Roe versus Wade got overturned.
Speaker 2 (30:28):
But kind of the like.
Speaker 1 (30:30):
Unfortunate tips that people need to know about, like you know,
different as I was saying, Gmail being used, but like
having a separate Gmail account and using a private browser
and all these things. Like, it's like you were saying,
it's a lot to ask of someone to know that
they need to do this.
Speaker 3 (30:53):
Yeah, it's... we shouldn't be asking this of people,
like, you shouldn't have to be a digital security expert
just to make choices for your health. You shouldn't have
to know how to parse corporations' public PR speak from
what they're actually doing. Like that is a lot for
someone to have to navigate. And so right after Roe
was overturned, we did an episode of There Are No
(31:14):
Girls on the Internet with a computer scientist and digital
security expert, doctor Jen Golbeck. If you know that name,
maybe you've seen her popular TikTok series educating people on
how to be more secure when navigating abortion information online.
I should tell you, like, I am not a lawyer,
I am not a digital security expert, so I want
to be clear about that. But I wanted to, just
if it's possible, play a little clip of what she
(31:36):
told me, because she is the expert. I don't want
to like summarize what she said. So you know, you've
mentioned a couple of like really great tips for folks
if you're if you're you know it, looking for abortion
pills and you want to do it, you know in
a way that you're going to be less likely to
be tracked, you know, using a tour browser, using incognito
mode when you search using.
Speaker 5 (31:55):
Public Wi Fi.
Speaker 3 (31:56):
Are there other tips that you want to shout out
for folks if they might need
this information.
Speaker 5 (32:01):
So that's all important stuff, I would say for sure.
Speaker 7 (32:03):
The most important one is that you are not paying
with a credit card or a debit card connected to
your name, So figure out how much your medication is
going to cost. Use cash to buy, like, a Visa Vanilla
gift card, which you can get anywhere for that amount,
and then pay with a Visa Vanilla gift card. So
much of how we're tracked is through credit card number,
(32:26):
so definitely do that. And the other way that we're
really easily tracked is through email address. So set up
a fresh email address that you are only using to
buy this abortion medication. Proton Mail is the one site
that I recommended for this. It's free, it's encrypted, it's
really good and secure. You can just set up an
email address, use it to buy your medicine, don't use it
for anything else. If you do that, gift card fresh
(32:48):
email address on something like Proton Mail. You know, I
love Gmail, I use it right, but they track the
hell out of.
Speaker 5 (32:53):
You on Gmail.
Speaker 7 (32:55):
So Proton Mail email address, Vanilla gift card. You get
eighty percent of the protection from tracking just from
those two measures, So you know that's easy and accessible
to anybody.
Speaker 5 (33:07):
Definitely do that.
Speaker 3 (33:09):
So yeah, but she doesn't speak to, like, map
location data. And I think that's an interesting point, that
when we were having conversations about how to stay secure,
it was really like searching things, paying for things. But
I guess it stands to reason that in twenty twenty
three you might be using Google Maps to access just
like where am I going if I need to go
(33:30):
to go get an abortion, like where am I headed?
And it kind of reminds me of like the throwback idea,
how before the ubiquity of things like GPS, you had to.
Speaker 5 (33:39):
Just know where you were going.
Speaker 3 (33:41):
And certainly there are places that I've been multiple
times in my life I could get to through muscle memory.
Now with GPS, I'm like, oh, how do I get there?
And yeah, it's just a good reminder of how much
these platforms like Google have become commonplace in our life,
and how they've gotten between us and the things that
we need to do, and.
Speaker 5 (34:02):
We don't even we don't even necessarily really think.
Speaker 1 (34:04):
About it, right? It's like a part of your
every day. It's like, I'm not going to get
out a map. I often remember this time I had
to, like, print out a MapQuest page to go
meet a friend, and I could never find her, and
my dad and I got in a huge fight about it.
Speaker 2 (34:20):
I was like, this is what the MapQuest said.
Speaker 1 (34:24):
But now it's just, like, part of our everyday
life, and we don't think about it until it goes away,
and then you realize, like, how much you rely on
that kind of stuff. And that's true with this, the
clip that you just played, of, like, you know, email
and using your credit card and your debit card, and
it kind of bums me out because it felt
(34:45):
so much like how I went about downloading music illegally
for a while. And this is, like, a health procedure,
like a legal health procedure, and we're treating
Speaker 2 (34:55):
It like I'm on LimeWire in the middle.
Speaker 4 (34:58):
Wow, throw it back.
Speaker 8 (35:02):
Yeah.
Speaker 4 (35:02):
I'm thinking about how people get the cease and desist from
the FBI for downloading movies today, and even maybe a knock
on the door. They go to that level just for
a movie. Don't get me wrong, you know whatever capitalism
it is where it is. But all the stuff that
I'm thinking about as we're sitting here talking about it,
I'm like, Yeah, I use GPS just to go home,
(35:23):
and half the time it's because I want to get
traffic information. I need an ETA, and I need to
know which was the best route and all these things,
and Google Maps is my way to go, obviously, for
all of that. It's not just for the direction, and
sometimes it's to tell me that something's closed or open
half the time. And I can't imagine not using it
at this point because I'm so reliant on that for
(35:43):
my information, because I don't listen to live broadcasts, like
radio, when they used to do, I guess they still
do, traffic and weather and such, but I don't. I
literally say, hey, Google, what's this? Something talks back to
me, okay. But you know, just how quickly that becomes
the solution. And then hearing her talking about these processes,
it reminds me of trying to do like a spy movie, yeah,
(36:06):
where we have to go get a burner phone with
a burner credit card and make sure that we take
out the chip just in case that that gets followed.
I'm like the links that people are gonna have to
go to and most of the people who are having
to access this type of care do, as you said,
do not know how to do this, probably can't afford
to do some of these things, may not even have
(36:26):
access, because, we know, that Vanilla card, I don't buy
those because there's a three-dollar fee at the very least,
and that pisses me off.
Speaker 3 (36:33):
Yeah, no, you're not angry about the wrong thing, right? No,
it's a totally legitimate thing to be angry about.
And I think, like, you know, I don't want to
make it sound like I am on, like, a, like,
I am so much more digitally secure than you, like,
I don't use these things.
Speaker 5 (36:49):
I have a smartphone. I could not find... I cannot.
Speaker 3 (36:52):
Get anywhere without Google Maps. I use these tools regularly.
I'm not suggesting that people who are not actively doing
something that is potentially criminalized or illegal should stop using
Google Maps, should stop using Gmail, should do these things. Right,
that's not necessarily realistic in twenty twenty three. I don't
(37:13):
live my life that way, but I do think it
is worth stepping back and just questioning what we give
up when we get these conveniences in our life and
just having an awareness of that, because yeah, I want
to be able to find the quickest route home, I
want to be able to avoid traffic. I want to
know if there's a police officer with a scanner there,
(37:33):
or a toll or whatever, or a road closure. That
is convenient. That is a convenient way
to live modern life in twenty twenty three. But it's
not just something that I'm being given without a cost.
And so I think that we should be really aware
and have a really good sense of what those costs are,
because it can be easy to think that there is
no cost and there's no such thing as a free lunch.
Speaker 5 (37:54):
There is a cost.
Speaker 3 (37:55):
You know, my dad is someone who does not trust GPS,
so he doesn't have... when he gets in the car
to go on a trip, he's got those old
Speaker 5 (38:03):
School big road atlases under his.
Speaker 3 (38:06):
Seat still in twenty twenty three, God love him, and like,
but it's like, as inconvenient as that is,
he's gaining something. So it's really
about all of us making a cost-benefit
analysis of what we are giving up versus what we
are getting and not thinking that like, oh, we're just
getting this cool new way to have a modern convenience
(38:28):
in our life that's just coming at no cost for us.
Speaker 5 (38:30):
There is a cost and we should be aware of it.
Speaker 4 (38:33):
And that's the thing is, like the conversation is, should
there be a cost? Why can't we have access to
these amazing things? Yes, the privacy thing maybe, But even
on top of that, the fact that there is a
chance of being persecuted and prosecuted for a thing that
shouldn't even be a law, you know, to begin with,
and having conversations like people who are going after people
and who they're going after, and we know that these
(38:55):
laws typically go after marginalized people as well as
those in the lower socioeconomic status. And
it's that that really is the burden of it all,
is that what this is doing is waging a war
and going after and persecuting people who in themselves are
already down and out. I guess it is the best
way to put it, or already put at the bottom
(39:18):
end of the pole, like they're not having the opportunities
or the ability to even defend themselves. Some of the
cases that have come forward as abortion cases were miscarriages
that have been taken out of context to say they
tried to get an abortion, and a lot of them
are refugees or in immigrant status. I've seen
so many stories of that, and they have no way
of defending themselves because they don't get legal help, typically
(39:41):
the good legal help, we'll say it that way, or
even a hearing. And when we talk about people who
are being held for immigration violations, like we've seen so
many conversations, well not enough because it doesn't get publicized
where convictions and things of that nature are coming out
from that, or going after people who cannot get good representation.
(40:01):
And again, these laws, even though they do affect everyone
and we should all be talking about it, we
know that those who are truly affected are those who can't
represent themselves and advocate for themselves. And that's the whole conversation,
is what is happening, who is doing this?
And why can't we be safe from this? Why can't
we, like, this shouldn't be one of the cost benefits.
It shouldn't be one or the other. It should be
(40:22):
that we have the rights to do this and feel
safe at the same time.
Speaker 3 (40:26):
I mean, like you put it so well, privacy is
a right. Digital privacy is a right, especially when it
comes to things that are sensitive, like our health information
or health data, or visiting places like shelters or clinics. We
should have an expectation of privacy around those issues. If
you are someone who is already burdened by being a
(40:46):
marginalized person in society, if you are already burdened by,
you know, maybe not having stable housing, or
having difficulties in your life, on top of that, you
shouldn't have to know what a Tor browser is just
to make safe decisions for yourself because of the whims of
Speaker 5 (41:01):
A company like Google.
Speaker 3 (41:02):
But right now, that's the landscape that Google is creating,
and they have the power and the ability to change that.
They have said they are going to not do that,
and yet they just aren't. And so I completely agree
with you. I think that Google needs to decide if
they are going to be in the business of creating.
Speaker 5 (41:20):
A world where privacy is a right.
Speaker 3 (41:24):
And people have the expectation of privacy or not, and
if they're not going to, don't say otherwise. I think
that's kind of one of the reasons why I get
so angry about this, is like they're able to enjoy
this public perception of proactively trying to keep people a
little bit safer while not doing that. So why'd you
just say it if you weren't going to do it?
You know.
Speaker 1 (41:45):
Yeah, and they've been making all this money. I can
only assume they just really make a lot of money
off our data.
Speaker 3 (41:50):
So they're like... oh, I will talk all day.
Let's just say your assumption is correct. Your assumption is very,
very correct. And yeah, I think that, like, we need
to fundamentally change our understanding when it comes to the
relationship between users, our data and platforms. Like the fact
that Google it's so extracted, they take so much from us,
(42:14):
they can lie and misrepresent what they take from us
and how they take it, and they can make billions
of dollars off of it. Something is wrong with that equation.
We need to fundamentally rethink the relationship that users have
with platforms.
Speaker 8 (42:25):
Yes, and there have been some attempts recently to kind
of change the situation.
Speaker 5 (42:45):
That's right.
Speaker 3 (42:46):
So just last week, nearly a dozen Senate Democrats
wrote to Google with questions about how it deletes users'
location history when they have visited these sensitive locations like
abortion clinics, expressing concerns that the company may not be
consistently deleting the data as promised. These included Senators Amy Klobuchar,
Elizabeth Warren, and Mazie Hirono asking for answers from Google
about the types of locations they consider to be sensitive
(43:08):
and how long it takes for the company to automatically
delete visit history. Again, Google said it was going
to be, quote, soon. Accountable Tech found that it hadn't
happened thirty days after that visit. So, you know,
I think these senators are right to at least get
some clarity about, well, what do you consider soon? And
just a few days ago, Google was sued by an
(43:29):
anonymous complainant claiming that Google unlawfully collects health data, including
abortion searches, on third party websites that use Google tech.
Jane Doe, who is the complainant, and her legal representation are
looking to get the case certified as a class action
suit and claims that her private information was intercepted by
Google when she used the scheduling pages on Planned Parenthood's
(43:51):
website in twenty eighteen to search for an abortion provider.
So I think it's interesting that she is trying to
pursue this as a class action litigation because I do
think there is a class of people, like a large
group of people who are facing harm because of Google,
and it's not individuals, it is all of us. We
are a collective harmed group. And so I really am
(44:13):
interested in the fact that this isn't just.
Speaker 5 (44:15):
One person suing Google.
Speaker 3 (44:16):
She is trying to do it as a class action
lawsuit because it is representative, in my opinion, of a
collective harm.
Speaker 4 (44:23):
That's interesting, because we know that there's some class action
suits happening with Facebook, and it's a pretty big deal
because there's a couple with billions of dollars involved.
We know that the EU has gone after a couple
of companies as well, I believe Twitter, for billions
of dollars in fines possibly. But that's what's going to
change anything. We know that it's not actually about humanity,
(44:44):
it's about the money, and if it's going to cost
them money, then they're more likely to do something, as
long as they don't have to pay out whatever it
may be, and just ten dollars or a few pennies to
millions of people is a lot of money, so that's
actually a smart way to go.
Speaker 5 (44:59):
I think. Yeah, to just do a quick plug.
Speaker 3 (45:01):
If you are a US resident who used Facebook between
May twenty fourth, two thousand and seven and December twenty second,
twenty twenty two, you can file a monetary claim as
long as you do so before August twenty fifth, twenty
twenty three. And so, you know, for a
company like Facebook, it probably won't be more than,
like, a slap on the wrist for them. It probably
(45:22):
won't be something that they really like feel. But if
you are in the group that I just mentioned, I
absolutely am in that group, and I'm filing for my claim.
I think that anybody listening who was in that group
should file for their claim, because that is the only
way that these companies have... that's the only kind of,
I mean, it makes me sad to say, but like,
(45:42):
we don't have a lot of recourse for companies like
Meta and Google. There's not much that the average
user can do that will actually make them feel this.
And one way that we have is hitting them in
their pocketbook. And so, yeah, get your settlement, even if
it's ten dollars, Get your ten dollars and buy yourself
a coffee whatever.
Speaker 4 (46:00):
Knowing that it's building up, we're just building up
the cost. And that's the thing is like we see
in so many things we've talked about civil suits when
it comes to like rape cases and sexual assault cases
and why they're important, even though so many people will be like, oh,
you're just trying to go after the money. No, this
is a big deal because we know this is a
form of punishment that is more likely to happen rather
(46:21):
than the guilty not guilty judicial level of punishment. We
know this, So having things like this is really important
because it does make a stand, whether it's big or small.
Same thing with the Dominion lawsuit that happened with Fox.
It was a big deal, like people were mad that
they settled, but that cost really did hit them in
the end. And these types of things, this conversation,
(46:43):
it's going to be a precedent of what can
be filed later on, because obviously things are building up.
As long as Google holds out and doesn't actually delete,
they're going to get more cases. If this goes forward
and wins, they're going to get more piled on. And
that's a good thing. A sad thing that it has
to get to that point, but it is a good thing.
So it's something that definitely we should watch definitely.
Speaker 3 (47:05):
And you know, one question that I have is how
is it legal for Google to say one thing about
their privacy policies and do another.
Speaker 5 (47:13):
I again, I'm no lawyer.
Speaker 3 (47:16):
I don't know how it is legal, though, Like I
fully do not understand how Google can mislead the public
about data privacy practices, how that is legal under the
Federal Trade Commission. I'm not the expert.
Speaker 5 (47:27):
Don't understand it.
Speaker 3 (47:28):
Maybe it's not legal, but whether or not it's illegal,
it is certainly unethical. It is unethical to say one
thing and do another, and it is certainly unsafe. And
the bottom line is that people deserve privacy and it
shouldn't be up to the whims of the people who
run Google to decide whether or not we get it
or how.
Speaker 5 (47:45):
We get it.
Speaker 3 (47:45):
We really need to rethink this schema where the
people, like tech leaders at Google, are the ones
who decide whether or not we get privacy. People deserve privacy,
especially when it comes to their health information, full stop, period,
end of sentence.
Speaker 4 (48:01):
Right. And I wonder how much the government has done,
because they've really taken away a lot of any kind
of power against these companies, because we've seen the
amount of updated agreements in order to use anything. Like,
I think I got it for TikTok. Well, TikTok,
we know they're under fire, we had that episode, but they
had a new agreement. I know Samsung continues to do
(48:23):
it every few months, and the amount of privacy is
going away. Like, it's interesting how quickly they're changing it.
And there's nothing you can do because you're already sucked
into that system, and that's what Google is doing.
And because of the way that the government has allowed
these companies to be territorial and to continue to
gather that data. I'm not a conspiracist, but I think
(48:44):
there's a bit of conspiracy behind this and about how much
control they want to have over the individual citizens, and
that's concerning.
Speaker 3 (48:51):
Yeah. And I think, I'm with you, you don't
sound like a conspiracy theorist at all. And I think
the first thing that we can do is really understand
what's being asked of us. And it's not easy
because they don't make it easy.
Speaker 5 (49:05):
Like when you.
Speaker 3 (49:06):
Get that twenty five page thing that you have to
scroll down just to like log onto your phone.
Speaker 5 (49:12):
Come on, be real, Like, who is reading that?
Speaker 3 (49:14):
Like, I don't fault anybody for being like, oh, I'll
just hit agree. But it shouldn't be that way. You
shouldn't have to be a trained lawyer or trained in
understanding tech speak and have to parse a thirty page
privacy agreement that you have to click through just to
get to your email, just to exist safely online with
(49:35):
privacy and dignity, like.
Speaker 5 (49:37):
People deserve that.
Speaker 3 (49:39):
We have created, and we've accepted and tolerated, a
system that is so burdensome to the average citizen,
and it should not be.
Speaker 2 (49:46):
That way right right, And.
Speaker 1 (49:50):
There is so much going on right now in terms
of technology, and tumultuous times in technology, and understanding those
kinds of things, and you, Bridget, are amazing at explaining
those kinds of things. So we always love having you here.
But you're also, like, breaking down these issues, although not
(50:12):
with us, on your own show.
Speaker 3 (50:17):
Thank you for that little introduction. So yeah, I don't
know if other folks feel it, but I think in
this particular moment in time, you know, we always talk
about this sort of like the future of tech and
what's next, it feels like we are in that moment today,
right when it comes to platforms, with the state of
Twitter and, like, what new platforms are going to, you know,
(50:38):
pop up and where we're all going to spend
our time digitally, when it comes to conversations about the
rise of AI, and when it comes to the increasing
threat of you know, expanding tech surveillance, like we talked
about today, it feels like a very weird time for
technology and the Internet and where we're going in the future.
And it feels like all that future conversation is actually
here today, and so we have a new
(51:01):
season on There Are No Girls on the Internet exploring
how increasingly it feels like this future of technology is
happening now, precisely because it is so important that the
voices of people who are traditionally left out of these conversations, women,
people of color, queer folks, disabled folks, working folks are
not left out of those conversations.
Speaker 5 (51:18):
And so if you want to.
Speaker 3 (51:20):
Parse what it all means, where we've been and where
we're going, and what we need to know to make
sure that our voices are centered in these conversations, please
check out the new season of My Pod.
Speaker 5 (51:29):
There are no girls on the internet where we are
doing just that.
Speaker 1 (51:32):
Yeah, definitely, listeners, go check it out. It does feel
like we're in the, is it Moore's curve? It feels
like we're in the, like, real upward part.
Speaker 5 (51:41):
For... yeah, kind of? Sure? Good? Like, what is it?
Speaker 4 (51:45):
Like?
Speaker 5 (51:45):
Good? Gee, is it geometry? What is it, like, curves and stuff? Good
math reference. And wait a minute, wait, math?
Speaker 1 (51:55):
You know that might not even be the correct term,
but I think it is. And if it's not, then
that's really funny.
Speaker 4 (51:58):
So whatever, Yeah curve.
Speaker 1 (52:05):
Yeah. So, we always, we love having you, we've missed you,
and we always take up so much of your time,
so I appreciate it.
Speaker 2 (52:12):
We could just talk to you forever about all kinds
of things.
Speaker 3 (52:15):
Oh my god, the pleasure, the pleasure is so mine.
This is like, yeah, I could talk to you guys,
you all, all day, because it's just so nice to
connect on these issues, and it's nice to be
able to talk about them with you two.
Speaker 2 (52:26):
Yes, yes, it really is. Well, where can the good
listeners find you? Well?
Speaker 3 (52:31):
As I said, you can check out my podcast, There
Are No Girls on the Internet, wherever you podcast, we are there. You
Speaker 5 (52:36):
Can follow me, I'm still on Twitter,
Speaker 3 (52:38):
Kind of at Bridget Marie can follow me on TikTok
at Bridget max Pods, I can follow me on Instagram
at Bridget Marie and DC and I would love to
have you on any of those platforms.
Speaker 2 (52:48):
Yes, and also beef, Oh yeah, beef.
Speaker 5 (52:51):
Don't sleep on Beef.
Speaker 3 (52:53):
It's a little bit different than my usual content, because
so much of my content is, like,
heavier and based around the Internet and, like, just, like,
here's what you need to know. If you're trying to
have a good time and just nerd out on some
historical rivalries, who's not? Definitely check it out, it's super fun.
Speaker 1 (53:08):
See what's going on in the advice column world exactly. Yes. Yes, well,
thank you, thank you, thank you so much again for
being here, listeners. If you would like to contact us,
you can. Our email is Stephanie mom stuff at
iHeartMedia dot com. You can find us on Twitter at
mom stuff podcast, or on Instagram and TikTok at Stuff I
Never Told You, also on YouTube.
Speaker 2 (53:27):
We do have a book coming out.
Speaker 1 (53:28):
You can pre-order it at Stuff You Should Read
Books dot com. Thank you as always to our super producer, Christina,
our executive producer Maya and our contributor Joey.
Speaker 2 (53:37):
Thank you, and thanks to you for listening. Stuff I
Never Told You is a production of iHeartRadio.
Speaker 1 (53:41):
For more podcasts from iHeartRadio, you can check
out the iHeartRadio app, Apple Podcasts, or wherever you listen to
your favorite shows.