Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey this is Annie and Samantha.
Speaker 2 (00:06):
And welcome to Stuff Mom
Speaker 3 (00:07):
Never Told You, a production of iHeartRadio.
Speaker 4 (00:18):
And today we are bringing back another episode that we
did with Bridget Todd, who is one of our favorites.
We always love talking to her and hearing what topics
she's brought to us. This is one I thought was
really fascinating about the first lady of the Internet and
the Linna image and the history of that and how
(00:39):
the model that provided this image got no credit, got
no compensation, but it was so foundational to a lot
of the early Internet and it's just something that really
stuck with me and I still think about, especially as
we're seeing all these advances right now.
Speaker 1 (00:56):
So I thought, oh, why not.
Speaker 3 (00:58):
Bring that one back. Enjoy this classic episode.
Speaker 1 (01:07):
Hey, this is Annie and Samantha, and welcome to Stuff Mom
Speaker 3 (01:09):
Never Told You, a production of iHeartRadio.
Speaker 4 (01:20):
And today we are once again thrilled to be joined
by the fabulous, the fantastic Bridget Todd to welcome Bridget.
Speaker 5 (01:27):
Thank you so much for having me. It's always such
a joy when I get to start my week talking
to you all.
Speaker 1 (01:33):
Yes, I feel like you have an extra glow.
Speaker 6 (01:36):
Maybe it's because you've been like soaking in all of
the sun on the beautiful beaches abroad. Actually, I've been
stalking you on Instagram and I'm like, how is this
woman always traveling?
Speaker 1 (01:46):
And I miss and I'm sad that I'm not.
Speaker 2 (01:48):
I'm just kidding. Oh you we actually so.
Speaker 5 (01:51):
I was in Mazatlán, Mexico for the eclipse, to be in the path of totality. It was actually one of those trips where we'd invited all of our friends, we were like, we're gonna.
Speaker 2 (02:02):
Get a big house on the beach. It's gonna be amazing.
Speaker 5 (02:04):
And then all of our friends are in and then
one by one, by one by one, I'm just there alone, essentially.
Speaker 1 (02:10):
No enjoyed it.
Speaker 5 (02:14):
I enjoyed it. I love Mexico. It is one of
my favorite places. It was my first time in Mazatlán.
Ten out of ten completely recommend.
Speaker 3 (02:22):
Okay, did you see the eclipse in totality?
Speaker 2 (02:26):
I saw the eclipse in totality.
Speaker 5 (02:28):
It was my first time ever being in the path
of totality of an eclipse. Have you have either of
you ever experienced this?
Speaker 3 (02:35):
No? Yes? Once?
Speaker 2 (02:36):
What were your thoughts? Annie, I am dying to know.
Speaker 4 (02:43):
Oh my gosh, if this was a different podcast. We
would go into a whole separate thing because I had
like a relationship issue that was happening on this day
and kind of a drama situation. So a lot of
times when I look back at it, a lot of
the pictures I'd taken, I was like, oh wow, we were fighting,
but it was also a work event that I was at,
(03:09):
so there was that layer. But it was beautiful. It
was so cool. It sounds silly, but I love space; like, the stars are my favorite thing.
So it was really really cool to see. It was
not quite what I expected, because the glasses. Samantha and
I were joking about this recently, but the glasses feel
(03:30):
so funny because you're like looking around like they're not working,
and then you like find the space you're supposed to
look at. I was just like, very very happy to
see it, honestly, like all that drama I was working aside,
I remember thinking, this is really cool that I get
to see this, and I'm really happy that I get
to see this.
Speaker 2 (03:50):
Yes, that was what I remember too.
Speaker 5 (03:54):
I burst into tears, and the next day I woke up in the middle of the night in a panic because I was worried that I would forget what it looked like being in totality like that. That's how it was; I had never seen anything like it. Anybody who listens to There Are No Girls on the Internet is probably so sick of
(04:15):
me talking about this eclipse, and I am fully
making seeing this total eclipse like my personality.
Speaker 7 (04:21):
But it's.
Speaker 5 (04:23):
But yeah, and I'm already planning where I will go
for the next one. So I guess I will see
y'all in I think, what is it, Spain?
Speaker 2 (04:31):
So that's that's the for you one.
Speaker 5 (04:34):
The next one that you can see, you can go to Spain and see, I think in twenty twenty six. Okay. So, but then the one that
you're referring to, Sam is like, don't quote me on
any of this, but that's supposed to be like the
big one, the big one that we will probably be
able to see in our lifetimes, and I think it's
in parts of the African continent.
Speaker 2 (04:54):
I want to say, Morocco. Don't quote me on that either.
Speaker 6 (04:57):
Yeah, I did see at one point that it was something you would be able to view in the US. That's the next time you'll be able to see it. I don't know if it's like the actual, like, the totality, as you say, but I don't know because I know nothing about this.
Speaker 1 (05:11):
That's the only I know the date, in no kindness.
Speaker 2 (05:14):
For this past eclipse.
Speaker 5 (05:15):
Earlier this month, we had done so much planning, including
like looking at farmer's almanacs to see what the weather
and cloud coverage is like this time of year. And
that's how we settled on Mazatlán, Mexico, because it was
the place that is closest to us on the East
Coast in the United States that was most likely to
not have cloud coverage in April, because you could see
(05:37):
it from Vermont and upstate New York and Texas, but
a lot of those places in April might be cloudy.
And so I have friends who are in Vermont and
upstate New York who were like, Oh, We're just.
Speaker 2 (05:45):
Gonna see it from our house, and I'm like.
Speaker 3 (05:47):
Oh, will you.
Speaker 5 (05:48):
Then on the morning of the eclipse in Mazatlán, Mexico,
we'd been there for a week. Every single day it's
like a beautiful, cloudless, blue sky day. So I wake
up on April eighth, the day of the eclipse, and
it's cloudy, the first cloudy day we've had in Mexico
for the entire week we have been there. Luckily, during
the eclipse time, the clouds did part, so we did
(06:09):
get to see it. But there would have been a
lot of feelings had we not been able to see it.
Speaker 4 (06:14):
It's a lot of pressure to put on a trip
like that. Honestly, it's pressure.
Speaker 1 (06:19):
To put on the eclipse. I mean, it exists. It's
not their fault.
Speaker 5 (06:25):
So you can contact, like, the manager of the.
Speaker 2 (06:29):
Sky to be like, actually we didn't get a good view.
Speaker 6 (06:33):
We were like, okay, for y'all who are Christians here
tell this God.
Speaker 5 (06:39):
Truly, we might get canceled. We were getting a little superstitious, like, the things that we were doing to try to ensure a good sky.
Speaker 2 (06:48):
It was getting a little out there. We'll just leave it at that.
Speaker 1 (06:52):
He brought a shaman in. I guess we're going in.
Speaker 4 (06:55):
Oh my gosh, Bridget, I want to ask so many
questions about this later on my Oh well, I'm very
glad that you got to see it.
Speaker 3 (07:06):
It is.
Speaker 4 (07:07):
It is amazing, like truly. And over on the other podcast I do, Savor, we did an episode on, like, weird companies making money off of the eclipse with their products, and I have heard from so many people about the foods they made for the eclipse, and it's brought me so much joy. Oh yeah, like Totali-tea, like tea,
(07:30):
oh oh my gosh, so many things like this. So
I feel like we have a couple of we have
some years to brainstorm things like this.
Speaker 1 (07:39):
Next giant celebration, but keep that.
Speaker 3 (07:43):
In the back of your head, you.
Speaker 5 (07:44):
Know, Oh, next time, definitely doing eclipse themed food party
or dinner party or something.
Speaker 2 (07:49):
I love that.
Speaker 3 (07:51):
Yes, there are so many puns.
Speaker 4 (07:54):
I will hold myself back for now, but I have to say I am very, very excited to talk about the topic you brought today, Bridget, because it is a thing that I love, like the history of something I think a lot of people don't question the history of, and it's fascinating, and I didn't know about it.
Speaker 3 (08:12):
So can you tell us what we're discussing today?
Speaker 5 (08:15):
I feel the exact same way, And today we are
talking about the Lenna image.
Speaker 2 (08:19):
Is this something that either of you had ever heard of?
Speaker 3 (08:21):
I do not know.
Speaker 1 (08:23):
I had not known, So.
Speaker 5 (08:25):
Even if you're listening and you're like, what is the
Lenna image. I've never heard of this image. I've never
seen this image. Even if you don't know this story
and you don't feel like you've ever seen this image before,
you kind of do know this image, because, as Linda Kinstler puts it in a really meaty piece for Wired that I'll be referencing a few times in this conversation, she writes: whether or not you know her face, you've
(08:45):
used the technology it helped create. Practically every photo you've ever taken, every website you've ever visited,
Speaker 2 (08:51):
Every meme you've ever shared.
Speaker 5 (08:53):
Owes some small debt to Lenna, and it really is
exactly as you were saying, Annie, one of those stories
that is foundational to the Internet and technology that you
don't necessarily think of, don't necessarily think of how it
came to be, and especially I think it's one of
those stories that says a lot about technology. You know,
(09:15):
here on Sminty, We've had plenty of conversations about this.
I've had many conversations about this on There Are No Girls on the Internet, about how things like misogyny can sort of be baked into the foundation of technology.
And I think that is one of the reasons why
tech is so often perpetuating misogyny, not because it's some
sort of an unfortunate bug, but because this misogyny can
(09:37):
be sort of foundational in some ways. And I think
this image really is a good example.
Speaker 2 (09:42):
Of what I mean.
Speaker 5 (09:42):
And I think, especially as we're having conversations about the
rise of things like nudify apps and AI-generated adult content creators, we're seeing what is kind of becoming a marketplace that is men making money off of the bodies and/or the labor of women without their consent, certainly
certainly without their compensation. And I think this situation in
(10:06):
the Lenna image, where the image of a woman went
on to create this entire field of technology without her consent,
can perhaps really tell us something about where we're headed
in twenty twenty four.
Speaker 4 (10:17):
Yes, absolutely, especially when you consider where it comes from,
which I know we'll talk about. But also, yeah, these
conversations we're having now about, like, actors perhaps not giving their consent to being used in certain ways. And honestly it extends to all of us: if you've posted an image online, right, not consenting to an image getting used in a certain way.
(10:39):
But so much about this history is fascinating because it
feels so standardized, which is odd. Can you tell us
about that?
Speaker 2 (10:48):
Totally?
Speaker 5 (10:48):
So for folks who don't know, the Lenna image is
literally an image of this woman, Lenna Forsen. She
is a woman from Sweden who in the seventies was
a model. So this kind of sensual image of her
wearing a tan hat with a purple feather flowing down
her bare back, staring kind of seductively over one shoulder.
(11:09):
That image of her was published in Playboy in nineteen
seventy two. She was essentially a playmate. That image would
go on to become what's called a standard test image.
Speaker 2 (11:18):
So big caveat here. I am not an engineer.
Speaker 5 (11:22):
If I say something and you're an engineer listening, and you're like, that's not totally correct:
Speaker 2 (11:27):
I am not an engineer.
Speaker 5 (11:28):
But here is a definition of what a standard test
image is that I found from Kaggle dot com, which
is like a developer community site. They say a standard
test image is a digital image file used across different
institutions to test image processing and image compression algorithms by
using the same standard test images, different labs were able
to compare results both visually and quantitatively. The images are
(11:50):
in many cases chosen to represent natural or typical images
that a class of processing techniques would need to deal with.
Other test images are chosen because they present a range of challenges to image reconstruction algorithms, such as the reproduction of fine detail and textures, sharp transitions and edges, and uniform regions. So basically, to put that in layman's terms,
(12:11):
a standard test image is like a test image that
tests to make sure that the technology is working as
it should be, or like rendering the way that it
should be. Lenna's image is not the only common standard
test image. There's also one that is like a bunch
of different colored jelly beans on a table. There's another
one that's called peppers that's just a bunch of different
(12:32):
colored, like red and green peppers, like jalapeño peppers. So
this is just a thing that becomes a way for
technologists to test that the image generating technology is working correctly.
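For readers curious what "testing image processing and compression algorithms" against a shared reference image can look like in practice, here is a minimal sketch. It is not from the episode, and the filename, quality settings, and choice of PSNR as the metric are illustrative assumptions: it simply re-encodes one reference image as JPEG at a few quality levels and reports how closely each result matches the original, which is the kind of visual and quantitative comparison a standard test image makes possible across labs.

```python
# Minimal sketch (illustrative, not the episode's method): compress a shared
# reference image and compare each result to the original with PSNR.
# Assumes Pillow and NumPy are installed and "test_image.png" (hypothetical) exists.
import io

import numpy as np
from PIL import Image


def psnr(original: np.ndarray, compressed: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10 * np.log10((255.0 ** 2) / mse)


# Load the shared reference image once, as RGB pixels.
reference = Image.open("test_image.png").convert("RGB")
reference_pixels = np.asarray(reference)

# Re-encode it as JPEG at a few quality settings and measure how much detail survives.
for quality in (90, 50, 10):
    buffer = io.BytesIO()
    reference.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    compressed_pixels = np.asarray(Image.open(buffer).convert("RGB"))
    print(f"JPEG quality {quality}: {psnr(reference_pixels, compressed_pixels):.2f} dB PSNR")
```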
Speaker 4 (12:43):
I do think this is very interesting for a lot
of reasons. But if you have like jelly beans and peppers,
those are things to be consumed. And then when you're thinking about where they got this image of Lenna from, like, how did this happen? How did this image become this
standard testing thing?
Speaker 2 (13:03):
So this is actually a pretty interesting story.
Speaker 5 (13:05):
The story of how Lenna's Playboy picture became this standard test image that is everywhere and very ubiquitous starts with computer and electrical engineer Alexander Sawchuk. According to the newsletter for the Institute of Electrical and Electronics Engineers, or the I triple E, as I have found out it's sometimes called.
I triple E, as I have found out it's sometimes called.
I was talking to somebody about this and I was like, oh,
(13:26):
the I.
Speaker 2 (13:27):
E E ee, and they were like, it's just the
I triple E.
Speaker 1 (13:32):
I E. Actually the I E exactly.
Speaker 2 (13:39):
So it's the summer of nineteen seventy three.
Speaker 5 (13:42):
Alexander Sawchuk was an assistant professor of electrical engineering at the University of Southern California, working alongside a grad student and the manager of the SIPI lab.
Speaker 2 (13:50):
As the story goes, he's like.
Speaker 5 (13:52):
Frantically searching around the lab for a good image to
scan for a colleague's conference paper. He had just sort
of gotten bored with their usual stock test images because
they mostly had come from like nineteen sixties TV standards.
Speaker 2 (14:04):
And were just a little bit dull.
Speaker 5 (14:06):
He wanted something glossy and sort of like fresh and dynamic,
but he also wanted to use a human face specifically.
Just then, as the story goes, somebody happens to walk
in holding the most recent issue of Playboy magazine.
Speaker 2 (14:19):
Why this person was.
Speaker 5 (14:20):
Bringing Playboy magazine into his workplace, I cannot.
Speaker 2 (14:24):
Tell you how good you just.
Speaker 1 (14:27):
Come into your world institute?
Speaker 6 (14:29):
Cool?
Speaker 5 (14:29):
Yeah, Like, I mean, I do think that that sort
of like gives you a sense of like the dynamics
that we're dealing with, Right, that somebody just happens to
walk in with the most recent Playboy.
Speaker 2 (14:39):
Under their arm.
Speaker 5 (14:41):
Right. The engineers tore away the top third of the centerfold so they could wrap it around the drum of their Muirhead wirephoto scanner, which they had outfitted with analog-to-digital converters, one each for the red, green, and blue channels, and an HP twenty one hundred minicomputer.
So all of that to say is that they effectively
(15:01):
cropped this image so that you can't see the model's bare body in the image, so it's just a picture of
her from the shoulders up looking over her shoulder.
Speaker 2 (15:09):
It's still like quite a seductive.
Speaker 5 (15:10):
Photo, but the full photo has her like bare booty
in it. She's wearing what looks like a feather boa
and like thigh high.
Speaker 2 (15:18):
Stockings looking over her shoulder.
Speaker 5 (15:20):
So back in the seventies and eighties, this image was
really sort of like used in very limited cases. You
could only really see it on dot org domains. It
was pretty limited to like engineers. Then in July of
nineteen ninety one, the image was featured on the cover
of Optical Engineering, alongside that other test image of the
different colored peppers.
Speaker 2 (15:40):
Funny enough, I took a look at that cover.
Speaker 5 (15:43):
It's all black and white, so I'm like, ah, I
think they're trying to demonstrate that, like these images had
all these different dynamic colors, but both of them are
rendered in black and white, kind of rendering that meaningless.
So this is when Playboy gets wind of this, and
they are not happy because it's basically copyright infringement, which
this is not related to the story, but I always
(16:04):
have to add this whenever it comes up. How litigious
Hugh Hefner and Playboy were. I always think this is
very rich because, as y'all probably know, Hugh Hefner made
an entire lucrative industry off of images of Marilyn Monroe
that she did for a calendar company, for which.
Speaker 2 (16:21):
She was only paid fifty dollars.
Speaker 5 (16:23):
Many years after that photo shoot, Hugh Hefner bought those
photographs from the calendar company and republished them without Marilyn Monroe's
consent or permission in nineteen fifty three. That was the
first ever issue of Playboy. Hugh Hefner paid five hundred dollars.
She got fifty dollars. Right, So, whenever I read about
how litigious Playboy is, which, they're very litigious, I always
(16:44):
have to chuckle at that. Oh, like, you don't
want somebody profiting off of your intellectual property, but had
no problem profiting off of a woman's body without compensating
her fairly or even her consent.
Speaker 1 (16:57):
Interesting, this is like par for the course for him.
Speaker 2 (17:01):
Oh my god, don't even get me started with Sue Hefner.
We will be here all day.
Speaker 6 (17:08):
The things that came out after he died, which I'm like, Wow,
he had a pretty good, like powerful handle on people
not talking until he died.
Speaker 2 (17:16):
Oh my gosh.
Speaker 5 (17:17):
I was listening to an episode of Celebrity Memoir Book Club, where they read a lot of ex-Playmate and ex-Playboy Bunny memoirs. Some of the things that they write about,
I'm like, oh my god. Like even even Lenna in
an interview, she talks about how in the seventies, after
this photo shoot, she was invited to go to the
Playboy mansion and the quote is something like, they made
(17:40):
it clear in the invites that I would have to
spend time with Hugh Hefner while he was in his
dressing robe, and I said, no, thanks, I mean.
Speaker 3 (17:50):
She already knew.
Speaker 7 (17:52):
Was like yeah, yeah.
Speaker 5 (18:06):
So Playboy threatens to sue these engineers, and at this point,
the engineers, it sounds like, had like grown very fond
of using this image, so they fought back.
Speaker 2 (18:14):
Eventually Playboy backed down.
Speaker 5 (18:15):
Because, as a Playboy vice president put it, quote, we
decided we should exploit this because it is a phenomenon.
So yeah, by his own words like, oh, let's exploit this.
Speaker 7 (18:26):
Yeah.
Speaker 2 (18:26):
No, talk about the fact that this is two groups.
Speaker 5 (18:30):
Of men fighting over who owns this image of a woman,
in one case being used in a manner that is
completely without her consent or control.
Speaker 2 (18:39):
It just it already from the beginning.
Speaker 5 (18:40):
It just feels to me like men fighting over how
they can use a woman's representation that I think is
so foundational to some of the conversations we're having about
technology like AI right here in twenty twenty four.
Speaker 4 (18:52):
Absolutely, and she did become pretty foundational, right?
Speaker 5 (18:59):
Oh, absolutely. So this is when the image of Lenna
really becomes super popular. The whole drama about the cover
catapults this image into what you might think of as
like early Internet virality or popularity. This was in nineteen
ninety five. The use of the photo in electronic imaging has been described as clearly one of the most important events in its history.
Speaker 2 (19:20):
It is truly hard.
Speaker 5 (19:21):
To overstate how ubiquitous this one image is in technology.
There is this fascinating interactive piece by Jennifer Ding at
the Pudding.
Speaker 2 (19:29):
The piece is so cool.
Speaker 5 (19:30):
It's like one of those interactive pieces that has a timeline.
Speaker 2 (19:32):
Definitely check it out.
Speaker 5 (19:33):
But in that piece, Ding actually includes a freeze frame
of the show Silicon Valley on HBO, where in the
background there is a.
Speaker 2 (19:43):
Poster with the Lenna image on the wall.
Speaker 5 (19:45):
Right, So this image is also included in scientific journals
just all over the place. Ding found that within the
dot edu world, so like websites related to education, the
Lenna image continues to appear in homework questions, class slides,
and to be hosted on educational and research sites, ensuring that it will be passed down to new generations of engineers.
(20:06):
So this became so popular that Lenna herself is often
called the first Lady of the Internet.
Speaker 3 (20:13):
Wow.
Speaker 4 (20:13):
Yeah, I kind of like her taking that picture having
no idea that this is what would happen, which, yeah,
I mean, I guess that speaks to the next question,
why did this image take off the way that it did?
Speaker 2 (20:25):
Well? If you asked David C.
Speaker 5 (20:27):
Munson, who is the editor in chief of the I triple E Transactions on Image Processing, he
said that the image happened to meet all of these
requirements for a good test image because of its detail,
its flat regions, shading, and texture. But even he will
not leave out the obvious fact that it's also a
(20:47):
picture of like a seductive, sexy young woman.
Speaker 2 (20:51):
Duh right, Like that's that's definitely part of it. He says.
Speaker 5 (20:54):
The Lenna image is a picture of an attractive woman. It
is not surprising to me that the mostly male image
processing research community gravitated toward an image that they found attractive,
and so I do think there's something about these highly
male dominated spaces where it's not just that there's a
lot of men that it's like their worldviews, their interests,
(21:16):
their perspectives, their biases that are really taking up a
lot of space in these in these spaces. I just
think that men feel like these spaces are theirs, and
that they are free to decorate those spaces with the
pretty women that they think they feel like they should
be able to use without their consent or compensation. I
just think that, like Annie, you mentioned earlier that the
(21:38):
other test images are these things that you consume, right,
like peppers or jellybeans. There's another famous one of a
baboon that's like has different colors on its face. It's
interesting to me that it's these things that are not human,
things that are like animals or that you consume, that throwing a sexy young woman into that mix didn't, maybe, seem like a huge departure for these guys.
Speaker 4 (22:00):
Yeah, and again, when we think about things in the
realm of AI or even I know I've complained about
this many times, but in the worlds of fandom or gaming.
Speaker 3 (22:11):
It's like that.
Speaker 4 (22:12):
It's like, you can come into our world on our
terms and you wear what we want you to wear.
You are here because we let you be here in
this male dominated space. But you're gonna do what we want.
It's not up to you, and that's the only way
that you can be in this world. But that being said,
there has been some pushback recently ish right, Bridget.
Speaker 2 (22:37):
Yeah, So one thing about what you just said.
Speaker 5 (22:40):
When I was researching for this episode, some of the
different engineers who had contributed to this image's popularity were quoted when they actually met the actual, real Lenna at a conference that she was invited to. They were like,
I can't believe she's a real person. A part of
me was like, you didn't even see her as a
real human. They just saw her as an image, a picture that they had been
(23:03):
consuming for decades, and they had so removed her from
being a real, breathing human that meeting her in real
life was like they were surprised that she was real.
And I think that really speaks to the sort of
fandom element that you were talking about, This idea that
like you can come in if you are a fantasy and
in some ways not even a real human.
Speaker 2 (23:25):
You know what I'm saying. You're like, do I.
Speaker 4 (23:28):
Ever, Yeah, like, don't say anything that I don't like, Like,
keep quiet and look the way I like.
Speaker 3 (23:35):
Then you can be here. But oh you're a real person.
Oh no, I don't want you hear it at all. Yeah.
Speaker 5 (23:43):
So you're exactly right, Annie. All of this happened, but
it was not without pushback. Around like the twenty tens,
people started publicly asking whether or not this image of
a woman from Playboy should be so foundational to technology, especially in education settings, you know, given conversations about
the need for more women in these spaces and how
(24:05):
to make these spaces more inclusive and more diverse.
Speaker 2 (24:08):
That's really around when you start.
Speaker 5 (24:09):
Hearing like people in public being like, wait a minute,
maybe this isn't so cool. In twenty fifteen, Maddie Zug, who was then a student at the Thomas Jefferson High School for Science and Technology right here in DC where I live, who I should say now is a product safety engineer at Apple who focuses on preventing tech-enabled abuse and stalking and harassment on Apple platforms.
Speaker 2 (24:27):
So like go Maddie.
Speaker 5 (24:29):
Maddie sounds like she was cool in high school and
is cool now. So Maddie wrote this op ed basically
asking the question of like, should I, as a high
school student at a STEM high school, be given an
image from Playboy as part of my education in technology
and STEM? She writes, I first saw a picture of
(24:49):
Playboy magazine's Miss November nineteen seventy two
Speaker 2 (24:52):
A year ago.
Speaker 5 (24:53):
As a junior at TJ, my artificial intelligence teacher told
our class to search Google for Lena Söderberg, not the
full image, though, and use her picture to test our
latest coding assignment. At the time, I was sixteen and
struggling to believe that I belonged in a male dominated
computer science class.
Speaker 2 (25:10):
I tried to tune out the boys' sexual comments.
Speaker 5 (25:12):
Why is an advanced science, technology, engineering, and mathematics school
using a Playboy centerfold in its classrooms? Her piece ends
with saying it's time for TJ to say hello to
an inclusive computer science education and say goodbye to Lena.
So Maddie was not the only person who was like,
maybe this image shouldn't be the thing that all of
our education is centered around. In that piece for Wired
(25:34):
I mentioned, they talked to several women in technology who had very similar stories.
Speaker 2 (25:38):
This one is actually pretty funny.
Speaker 5 (25:40):
Deanna Needell, a math professor at UCLA, had similar memories
from college. So in twenty thirteen, she and a colleague
staged a quiet protest. They acquired the rights to a
headshot of the male model Fabio Lanzoni and used that
for their imaging research. So they kind of like turned
it around, like, oh, you're going to use a sexy woman,
Well we'll use a sexy man.
Speaker 2 (26:01):
What do you think about that?
Speaker 3 (26:03):
I love it.
Speaker 5 (26:05):
So in that piece they actually track down and speak
to the real Lenna, who also called for her image
to be retired. She says, I retired from modeling a
long time ago. It is time I retired from tech, too.
We can make a simple change today that creates a
lasting change for tomorrow. Let's commit to losing me. And
there's actually some news on that front, because as of
(26:27):
April first of this year, the I triple E officially retired the use of the Lenna image and announced they will no longer be using that image in their publications. Ars Technica points out that this is kind of a
really big deal that will likely have a ripple effect
in the space.
Speaker 2 (26:42):
Because the journal.
Speaker 5 (26:43):
Has been so historically important for computer imaging development, it'll
likely set a precedent removing this image from common use.
In an email, a spokesperson for the I Triple E
recommended wider sensitivity about the issue, writing: in order to raise awareness of and increase author compliance with this new policy, program committee members and reviewers should look for
(27:04):
inclusion of this image, and if present, should ask authors
to replace the Lenna image with an alternative.
Speaker 3 (27:10):
Yeah.
Speaker 4 (27:11):
I love that from Lenna herself, like, let's commit to forgetting me. That's such a great line. But it speaks volumes, as you've been saying, Bridget, to our attitude towards women on the internet and towards consent on the internet. And so when we're thinking about this,
(27:33):
which was foundational, what do you think about the legacy
Speaker 3 (27:39):
Of this image?
Speaker 5 (27:40):
Yeah, I love that question. You know, when I was reading about how this image came to be, I was imagining a very different time, right? It's the seventies. People aren't
necessarily having a lot of public, loud conversations about the
power dynamics of who's in the room. And who's not
in the room where a lot of this technology is
getting built. And it really made me think of, like, Wow,
(28:02):
the seventies, that probably was such a different time. But
here in twenty twenty four, we are having those conversations,
loud voices, are publicly having those conversations. There are women
and people of color, and trans folks and queer folks
and all kinds of folks who are building and making
the technology that shapes our world today. And so in
twenty twenty four, it almost feels like we are pretending
(28:25):
that we're still in that nineteen-seventies, "we didn't really know, who could have foreseen it" world, when in fact
we're not really in that world. People are asking the questions,
people are raising the alarm, And I guess I don't
think it should be several decades after AI technology becomes
ubiquitous for people to start asking the question about how
(28:46):
traditionally marginalized people like women are being used and represented
and perhaps exploited without their consent in these spaces. I
think it provides a really interesting precedent for what's going
on here in twenty twenty four.
Speaker 2 (28:58):
And Jennifer Ding put it really well.
Speaker 5 (29:01):
She writes: to me, the crux of the Lenna story
is how little power we have over our data and
how it is used and abused. That threat seems disproportionately
higher for women, who are overrepresented in Internet content but
underrepresented in Internet company leadership and decision making. Given this reality,
engineering and product decisions will continue to consciously and unconsciously
(29:24):
exclude our needs and concerns, Right, And so I really
agree with that, that this Lenna story really is a
story about power dynamics and who is represented in technology
and who just sort of has their needs exploited or erased.
Speaker 2 (29:38):
Right, Like.
Speaker 5 (29:40):
Men wanting to consume the bodies of women is like
foundational to the Internet. It's like why we have the
Internet the way that we have it. And I think
we know that now. It's like an objective fact about
the Internet and technology. I don't think we can keep making technology that is not honest about that, because if we're
(30:01):
not being honest about that, we can never fix that,
we can never question that, we can never have that
be a dynamic that we stop perpetuating with technology.
Speaker 4 (30:09):
Yeah, and I think it's like going back to the
point about being in a classroom setting and being shown
explicitly like, this is how women are viewed in the space.
This is what built a lot of what we use
today, and that we're still talking about it is telling in itself,
and especially when we're seeing that perpetuate in all of
these tech spaces where it still feels in a lot
(30:32):
of ways, even though women and marginalized people have built
those spaces that like, you're the guest here, and you're
only here because we're opening our gates a little bit
to let you in, but otherwise, yes, get out.
Speaker 5 (30:49):
And I just think that's a dynamic we need to
be questioning in twenty twenty four. And I think so
like something about the use of this image, its ubiquity
in education spaces, I find so telling. But also even
if you're not studying to be an engineer or something,
I think there is a dynamic that says that if
you are a person who is traditionally marginalized, you're not
(31:10):
a decision maker, you're not a power holder, you're not
doing or making anything that anybody needs to care about,
and the entire dynamic is that we use you. In fact,
Ding actually points this out in her piece. She says, while social norms are changing around nonconsensual data collection and data exploitation, digital norms seem to be moving in the opposite direction. Advancements in machine learning algorithms and data
(31:33):
storage capabilities are only making data misuse easier, whether the
outcome is revenge porn or targeted ads, surveillance, or discriminatory AI.
If we want a world where our data can retire
when it's outlived its time or when it's directly harming
our lives, we must create the tools and policies that
empower data subjects to have a say in what happens
(31:53):
to their data, including allowing their data to die. And
so I think, even if you're not somebody who is a techie, this dynamic should concern you, a dynamic that just
says we consume, we exploit, we make money from you,
and you don't get to have a say about it.
Speaker 2 (32:07):
That's the dynamic that I think this
Speaker 5 (32:08):
Lenna image really did usher in, without really even necessarily
meaning to.
Speaker 6 (32:24):
I think there's a big conversation here on like the
power of capitalism within the tech industry and what makes money.
I can't help but think, like with the Lenna image,
the fact that this toxicity was used to make more
profit and more power within this industry. It took forty
to fifty years for it to even have a conversation
(32:44):
about like, let's change it, let's retire it. But the
fact that it had that much pushback because they didn't
care enough and they wanted to build on this toxicity
because they knew it could make money is the most
concerning thing to me. And then the powers that be are saying, yeah, yeah, we're definitely gonna control this, and then they just go after an app instead of the root of the problem. It seems like
the biggest part of the conversation because even in the
(33:07):
AI world, with new apps coming through, new programs coming through,
and they're all competing with each other, they don't want
to let go of the toxicity. But that's what's making
the money, which is really really concerning.
Speaker 5 (33:19):
Yeah, and I mean, like, if there was one "so what" of why I wanted to have this conversation, Sam, that is exactly it: that it is about money. It
is about capitalism. It is about making money off of
people's own exploitation and selling that exploitation back to them
to make more money. And it's just a really toxic dynamic.
(33:40):
That I believe is harming us and making the people
who have created that dynamic rich all the while they
get to be like, Oh, it's not a big deal
for you.
Speaker 2 (33:50):
Actually, this is going to be really good for you.
This is going to be convenient for you.
Speaker 5 (33:53):
And I don't know, Like I woke up this morning
when I was trying to decide, like what I wanted
to talk to you all today about, and one of
the ideas that I had that I scrapped was just
this feeling that being on the internet just doesn't feel
fun anymore. Anytime I go on a website, anytime I
google something just to find out information, it feels like
(34:13):
a scam.
Speaker 2 (34:14):
It feels like exploitation.
Speaker 5 (34:16):
I feel like I am one click away from somebody
getting my social Security number. It feels like AI generated garbage.
And I just think we have hit the wall of
that feeling. I can't imagine that I'm alone in this.
I think the feeling of showing up online today
in twenty twenty four feels exhausting, and I think part
(34:37):
of it is because it feels like we are being
bled dry by people that we have already made rich
from our own exploitation.
Speaker 2 (34:44):
Do you ever feel that way?
Speaker 6 (34:46):
Oh absolutely. I think, because getting on TikTok, the first opening video, I'm sure you've seen it, is that content manager who is like, I'm here for the safety
of TikTok.
Speaker 1 (34:55):
Have you seen this?
Speaker 2 (34:56):
I have not.
Speaker 1 (34:57):
She's been there.
Speaker 6 (34:58):
She is for safety in some like she has a
very specific time. Yes, Susie someone she is very white
and she's very redheaded. I was like, okay, so we've played into the xenophobia. She's like, look, I'm
a white person. I'm gonna help you out here.
Speaker 2 (35:11):
Don't worry. Don't worry.
Speaker 1 (35:12):
I got this.
Speaker 6 (35:14):
But that's the first thing that I'm seeing, so like,
you know, urging TikTok users to talk to the government
because they voted this in and this is real bad
and all this and whatnot. And I'm just like,
all right, it's gonna go away next. This is now
my attitude because also I'm very tired. But also I
just got an email saying that AT&T had a breach. They have
(35:36):
your stuff. But good news, since you don't have a bill with us, you didn't get any pertinent information. But I literally think every month I have been getting an email that says some of my information has been breached, and it's nothing
that I have done. It is literally everything from my insurance,
my dental insurance, my healthcare provider, my internet, which I'm like,
(36:00):
what the hell, my phone subscription, my cell phone, which
I'm like, I'm starting to get back to that.
Speaker 1 (36:05):
I think I want a landline.
Speaker 6 (36:07):
At this moment, y'all, with each of those things popping up, I'm like, I have to use that information in order for me to have healthcare. So y'all let my healthcare information get out, and they have my Social Security number.
Speaker 1 (36:19):
There's nothing I can do about that.
Speaker 6 (36:21):
As many times as I can change my password, the
next email I'm getting is telling me that I've got
a data breach of my information.
Speaker 1 (36:29):
So what is the point.
Speaker 6 (36:31):
Like, at this point, the only way is to rewrite
my identity and to never get online again, which would
be really hard for my job.
Speaker 5 (36:40):
Yes, Like, if you have a phone in your life,
if you vote, if you drive, these things that we
are required to do to participate in public life should
not just be avenues for somebody to make money and
scam us.
Speaker 2 (36:56):
But yet it feels that way.
Speaker 5 (36:58):
And you know what, Sam, I have actually not seen
the TikTok that you're referring to, because I have not
opened my TikTok app in days because it's starting to
feel like QVC and I cannot take it anymore.
Speaker 2 (37:09):
Like whatever happened.
Speaker 5 (37:10):
To spaces on the Internet that were supposed to feel
like safety or exploration or fun or community or connection.
I hope that somebody out there listening is like, Bridget, you're old and unhip. We have those spaces, they exist. Tell me about them. I want to know about them. But I think we really got to get back to, like, to those principles
(37:32):
of the Internet feeling like something other than being taken
for a ride on which you are the chump.
Speaker 6 (37:38):
Right, And I will say a lot of people have
felt like Discord and Reddit have been like brought in, but we already know Reddit kind of has its problems. And then I think there's a new lawsuit with Discord with its problems and its terms of service
changing as well.
Speaker 2 (37:51):
I'm like, what totally happened?
Speaker 6 (37:53):
So there's literally no one protecting the individual. Like, there's no protection for us at all, but they want to take things away from us, which is like the least of our worries, or they're just like, sorry, you can't sue us.
Speaker 5 (38:11):
Yeah, I think everybody is feeling that, but I think
it is particularly dangerous for people who are traditionally marginalized
because, yeah, it's just the expectation.
Speaker 2 (38:21):
That, oh, it's totally fine if.
Speaker 5 (38:24):
People make apps that nonconsensually undress women using AI,
why wouldn't they be able to advertise on Facebook or
Instagram or Twitter.
Speaker 2 (38:33):
They got to make money. That's a business.
Speaker 5 (38:35):
Like how easy it is to erase the human people
at the heart of this dynamic, erase their concerns, erase
their needs, erase their harm because men got to make
money off of it.
Speaker 6 (38:47):
I'm sick of it, right? Or it's tradition, literally like, yeah, this image has always been here, we need to teach it as, it's historical
Speaker 1 (38:54):
Now.
Speaker 6 (38:54):
It was definitely not exploiting somebody or taking advantage of somebody,
or using humiliating content because.
Speaker 1 (39:02):
She wasn't humiliated, I don't think.
Speaker 6 (39:03):
But like in the idea of it being forever and ever and ever, of like your seductive picture being used for IT people, which is a whole different conversation
in itself.
Speaker 2 (39:14):
Yeah, I mean, so Lenna, the real life Lena.
Speaker 5 (39:16):
And again there's a really interesting Wired article that has
an interview with her. She doesn't feel like she was exploited.
She's actually really proud of that image, even as she
recognizes that it's like time for it to be retired.
Speaker 2 (39:26):
However, she does wish.
Speaker 5 (39:29):
That she had been fairly compensated for what would go
on to be her like non consensual contributions to tech
when she.
Speaker 2 (39:35):
Took that image.
Speaker 5 (39:36):
There's no way that, as a, you know, young Playboy Playmate in nineteen seventy one or whatever,
you would have a sense of like, well, if this
goes on to make me the first lady
of the Internet, I better have compensation and protections. No way, right,
So in that Wired piece they say it makes sense
that she would feel this way. Unlike so many women
in tech, Lenna has at least been acknowledged, even feted
(39:59):
for her contribution. And she did that work and the
people started using that photo in this neat new way,
and now she has this kind of immortality woven into
the design of the machine. This is from Marie Hicks,
a historian of technology and the author of Programmed Inequality.
All of this happened for a reason. Hicks writes, if
they hadn't used a Playboy centerfold, they almost certainly would
have used another picture of a pretty white woman. The
(40:20):
Playboy thing gets our attention, but really what it's about
is this world building that's gone on in computing from
the beginning. It's about building worlds for certain people and
not for others.
Speaker 6 (40:31):
I find it interesting, dude, that they invited her to the conference. Like, I'm wondering what the purpose was, other than to, like... because it obviously wasn't to ask her questions about tech and how she did this thing, because they did not even consider her human, as we know. It was just literally to ogle her in real life.
Speaker 2 (40:51):
Yeah, I was thinking about why they did that.
Speaker 5 (40:54):
I don't know. Part of me wonders if it
was like an attempt to be like, oh, we need
to acknowledge the way that this woman's image was so
foundational to our technology, but then like not really doing it,
like still sort of treating her as like a booth
babe or something like.
Speaker 6 (41:11):
I don't know, right. I just find all of that interesting, on this level of, again, not what she was doing this for. She came in with, like, whatever her ambitions were in being this model and whatnot, and then all of a sudden being told
Speaker 1 (41:27):
You're being used.
Speaker 6 (41:30):
As an example for computers, like for test images for computers, and not only will you see this, but your grandkids will also, like, if she has children, any of those things, and your family members, forever.
Speaker 5 (41:42):
I'm like, whoo, who would have ever thought that that's
how that image would go on to be used in history?
And I really think like this is where we are today,
and this is like why I wanted to talk about
this is that I think, like the idea, the concept
of images being shared online, the way we understand that
in twenty twenty four, the fact that this image of
Lenna became so foundational to that concept without her consent,
(42:06):
you know, perhaps without, like, proper credit for the way
that she actually was foundational to that and building out
this entire universe around it that is mostly like controlled
and protected and profited off by men and nobody stopping
to ask about the ramifications of that until decades later.
I just think it really establishes like a concerning precedent
(42:28):
for where we're going right now with AI in twenty
twenty four. And it doesn't have to. We can learn from what we did with that Lenna image if we
ask the right questions, if we center the right perspectives
and the right voices, and so Yeah, I don't want
to wait until twenty forty to be like, oh, should
we have been talking about the ways that women and
girls and other marginalized people are being exploited and used
(42:49):
to make technology companies money.
Speaker 2 (42:52):
I don't want to ask that question when it's too.
Speaker 6 (42:54):
Late, right, And here's like the big conversation is, shouldn't
we alsoe that big companies and big tech companies and
big companies that are developing are purposely leaving out marginalized
people because they like the old ways and that it's
only making a certain amount of people money.
Speaker 2 (43:11):
Yes, that's exactly it.
Speaker 5 (43:12):
I think that I would argue that's exactly what's going on.
I mean, in twenty twenty four, there are so many loud,
thoughtful voices from women and people of color who are
really talking about AI in some interesting and thoughtful ways.
So they exist, they are out there. This is the
tale as old as time when it comes to technology.
It is not that they are not there. It is
(43:33):
that they are being, whether intentionally or unintentionally, marginalized, sidelined, silenced,
pushed aside to make room for voices who are just
repeating the status quo, who are just saying like, well,
I'm trying to get rich, so who cares how this harms somebody, who cares about whether or not this goes on
(43:53):
to exploit.
Speaker 2 (43:53):
And I think that's really, it's really
Speaker 5 (43:55):
Like, it's a little bit of a complicated cultural dynamic and cultural shift that I think we really got to break.
Speaker 3 (44:02):
Yeah.
Speaker 4 (44:03):
Yeah, And it's really sad, going back to your point
Bridget of like the Internet not being a place of
joy anymore, because so many times it was marginalized people
who made those spaces because they couldn't find them anywhere else,
and then these companies come in and are like, Okay,
well we can make money, and then it doesn't become
(44:26):
a joyous space anymore. It becomes a very toxic, a
toxic place. And so like hearing this story and seeing
how so much of what we use still is based
on something that was.
Speaker 3 (44:42):
A guy walked in with the Playboy.
Speaker 4 (44:44):
magazine, like, it's bad that that doesn't feel so out of place in what we're talking about in our current time.
Speaker 5 (44:55):
Yeah, and again, I mean, I opened up our conversation
with this, but I guess I'll close with it too. I know when I say this, people think I sound alarmist or extreme, but I mean
it the way that I mean it. I think that
these things are features, not bugs. I think we got
to be honest about the ways that things like misogyny
and exploitation, particularly when it comes to marginalized people, has
(45:17):
been foundational to technology and the Internet from the very beginning.
Speaker 2 (45:22):
I love the Internet. I love technology. It is why
I do the work that I do.
Speaker 5 (45:26):
But I think that until we are honest about the fact that these things are features and
Speaker 2 (45:30):
Not bugs, we will never get anywhere.
Speaker 5 (45:32):
And so I think that it really starts with having
honest conversations about where we started so that we can
get to a place that actually feels a
little bit better for everybody.
Speaker 4 (45:41):
Yes, yes, well, thank you so much as always, Bridget.
every time you come on. I'm like, we could talk
for hours about this and this and this.
Speaker 5 (45:53):
Invite me back for an episode, just dragging Hugh Hefner.
Speaker 2 (45:56):
Yeah, I'll be here for it.
Speaker 1 (45:58):
I think we need to do this.
Speaker 6 (46:00):
I don't think about this for a minute. Back into
the magazine world and jumping into like all of that.
Speaker 5 (46:06):
Don't even get me started. I mean, like, this is like spoiler alert: I totally had this wrong for so long in my life. I was like, oh, Hugh Hefner was a champion for free speech and civil rights and blah blah blah. Then I grew up and learned, and I was like, actually, he wasn't such a good
Speaker 6 (46:20):
Guy, right. I mean, we really fed into the "but read the articles."
Speaker 1 (46:25):
They're so good.
Speaker 2 (46:26):
Oh my god, they've corbell.
Speaker 4 (46:29):
Yes, yes, oh yes, please come back for that, Bridget.
Speaker 3 (46:36):
In the meantime, where can the good listeners find you?
Speaker 5 (46:39):
Well, you can listen to my podcast, There Are No Girls on the Internet. You can follow me. I'm not really on social media that much anymore, but you can try to find me there. I'm on Instagram at Bridget Marie in DC. I am on Bluesky at Bridget Todd, on Threads at Bridget Marie in DC, sometimes on TikTok.
Speaker 2 (46:55):
I'm easy to find. Google me, you'll find me.
Speaker 1 (46:59):
Yes, Google, that's a flex.
Speaker 3 (47:02):
It's true though. Our listeners are smart. They can find
you and listeners.
Speaker 4 (47:06):
If you would like to contact us, you can. Our email is stuff media mom stuff at iHeartMedia dot com. You can find us on Twitter at mom stuff podcast, or on TikTok and Instagram at stuff mom never told you.
We're also on YouTube. We have a TeePublic store
and a book you can get wherever you get your books.
Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Thank you and thanks to
you for listening. Stuff Mom Never Told You is a production
(47:27):
of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.