
April 24, 2024 46 mins

Bridget Todd shines a light on the Lenna image, an image that became foundational to the internet and has an enduring legacy. The story of how this image became so widespread without the consent or fair compensation of the model in question highlights problematic attitudes around women in tech spaces.

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:05):
Hey, this is Annie and Samantha, and welcome to Stuff
Mom Never Told You, a production of iHeartRadio. And today
we are once again thrilled to be joined by the fabulous,
the fantastic Bridget Todd. Welcome, Bridget.

Speaker 2 (00:25):
Thank you so much for having me.

Speaker 3 (00:27):
It's always such a joy when I get to start
my week talking to you all.

Speaker 4 (00:31):
Yes, I feel like you have an extra glow. Maybe
it's because you've been like soaking in all of the
sun on the beautiful beaches abroad. I've been stalking you
on Instagram and I'm like, how is this woman always traveling?
And I miss you, and I'm sad that I'm not.

Speaker 2 (00:46):
I'm just kidding. Oh you all should have come.

Speaker 3 (00:49):
We actually — so, I was in Mazatlan, Mexico for the
eclipse, to be in the path of totality. It was
one of those trips where we'd invited all of our friends.

Speaker 2 (00:59):
We're gonna get a big house on the beach.

Speaker 3 (01:01):
It's gonna be amazing, and then all of our friends
are in and then one by one by one by one,
I'm just there.

Speaker 2 (01:07):
Alone, essentially.

Speaker 4 (01:09):
No, but you enjoyed it?

Speaker 3 (01:12):
I enjoyed it. I love Mexico. It is one of
my favorite places. It was my first time in Mazatlan.
Ten out of ten completely recommend.

Speaker 5 (01:21):
Okay, did you see the eclipse in totality?

Speaker 2 (01:25):
I saw the eclipse in totality.

Speaker 3 (01:26):
It was my first time ever being in the path
of totality of an eclipse.

Speaker 2 (01:31):
Have either of you ever experienced this?

Speaker 6 (01:33):
No?

Speaker 5 (01:33):
Yes? Once?

Speaker 2 (01:35):
What were your thoughts, Annie, I am dying to know.

Speaker 5 (01:41):
Oh my gosh.

Speaker 1 (01:42):
If this was a different podcast, we would go into
a whole separate thing because I had like a relationship
issue that was happening on this day.

Speaker 5 (01:51):
And kind of a drama situation.

Speaker 1 (01:55):
So a lot of times when I look back at it,
a lot of the pictures I took, I was like,
oh wow, we were fighting. But it was also a
work event that I was at, so there was that layer.
But it was beautiful. It was so cool. I know it
sounds silly, but I love space, like I love —

(02:15):
The stars are like my favorite thing, so it was
really really cool to see. It was not quite what I
expected, because of the glasses. Samantha and I were joking
about this recently, but the glasses feel so funny because
you're like looking around like they're not working, and then
you like find the space you're supposed to look at.

(02:37):
I was just like very very happy to see it, honestly.
All that drama I was working through aside, I remember thinking,
this is really cool that I get to see this,
and I'm really happy that I get to see this.

Speaker 2 (02:49):
Yes, that was what I remember too.

Speaker 3 (02:52):
I burst into tears, and the next day I woke
up in the middle of the night in a panic
because I was worried that I would forget what it
looked like being in totality like that. Like, that's how —
like, I had never seen it, I'd never seen
anything like it. Anybody who listens to There Are No Girls
on the Internet is probably so sick of me talking

(03:13):
about this eclipse, and I am fully making seeing this
total eclipse my personality. But yeah,
I'm already planning where I will go for the next one,
so I guess I will see y'all in, I think —
what is it, Spain?

Speaker 2 (03:30):
So that's the one.

Speaker 3 (03:32):
The next one that you can see —
you can go to Spain and see it, I think, in
twenty twenty six.

Speaker 2 (03:38):
Okay, so, but then the one that you're referring.

Speaker 3 (03:40):
To, Sam, is like — don't quote me on any of this,
but that's supposed to be like the big one, the
big one that we will probably be able.

Speaker 2 (03:46):
To see in our lifetimes.

Speaker 3 (03:48):
And I think it's in parts of the African continent.

Speaker 2 (03:53):
I want to say, Morocco. Don't quote me on that either.

Speaker 5 (03:56):
Yeah.

Speaker 4 (03:56):
I did see at one point, like, you would be
able to view it in the US. That's the next
time you'll be able to. And I don't know if it
is like the actual, like, totality, as you say,
but like I don't know, because I know nothing about this.
That's the only thing I

Speaker 2 (04:10):
Know, the date. And no kidding, for this past eclipse

Speaker 3 (04:13):
Earlier this month, we had done so much planning, including
like looking at farmer's almanacs to see what the weather
and cloud coverage is like this time of year. And
that's how we settled on Mazatlan, Mexico, because it was
the place that is closest to us on the East
coast in the United States that was most likely to
not have cloud coverage in April because you could see

(04:35):
it from Vermont and upstate New York and Texas. But
a lot of those places in April might be cloudy,
and so I have friends who were in Vermont and
Upstate New York who were like, Oh, We're just gonna
see it from our house, and I'm like, oh, will you.
Then, on the morning of the eclipse in Mazatlan, Mexico —
we'd been there for a week, and every single day was
a beautiful, cloudless, blue-sky day — I wake

(04:56):
up on April eighth, the day of the eclipse, and
it's cloudy — the first cloudy day we'd had in Mexico
the entire week we had been there. Luckily, during the
eclipse time, the clouds did part, so we did get
to see it, but there would have been a lot
of feelings had we not been able to see it.

Speaker 5 (05:13):
It's a lot of pressure to put on a trip
like that. Honestly, it's a lot of pressure to put on the eclipse.

Speaker 4 (05:18):
I mean, it exists.

Speaker 2 (05:20):
It's not their fault.

Speaker 3 (05:24):
So you can contact, like, the manager of the
sky to be like, actually, we didn't get a good view, and —

Speaker 4 (05:31):
We were like, okay, for y'all who are Christians here,
tell this to God. Truly, we'd get canceled.

Speaker 2 (05:39):
We were like.

Speaker 3 (05:40):
Getting a little superstitious, like the things that we were
doing to try to like ensure good sky. It was
getting a little out there.

Speaker 2 (05:49):
We'll just leave it at that.

Speaker 4 (05:50):
He brought a shaman in. I guess we're going in.

Speaker 1 (05:53):
Oh my gosh, Bridget, I want to ask so many questions.

Speaker 2 (05:57):
About this later.

Speaker 5 (06:02):
Oh well, I'm very glad that you got to see it.
It is.

Speaker 1 (06:05):
It is amazing, like truly. And over on the other
podcast I do, Savor, we did an episode on like
weird companies making money off of the eclipse with their products,
and I have heard from so many people about the
foods they made for the eclipse, and it's brought me
so much joy. Oh yeah — like totali-tea, like tea, oh,

(06:29):
oh my gosh, so many things like this. So I
feel like we have some
years to brainstorm things like this.

Speaker 4 (06:38):
Next giant celebration, but keep that.

Speaker 5 (06:41):
In the back of your head, you know.

Speaker 3 (06:43):
Oh, next time, definitely doing an eclipse-themed food party
or dinner party or something.

Speaker 2 (06:48):
I love that.

Speaker 5 (06:49):
Yes, there are so many puns.

Speaker 1 (06:52):
I will hold myself back for now, but
I have to say I am very, very excited to
talk about the topic you brought today, Bridget, because it
is a thing that I love: the history
of something I think a lot of people don't question.
And it's fascinating, and I didn't know
about it. So can you tell us what we're discussing today?

Speaker 3 (07:13):
I feel the exact same way, And today we are
talking about the Lenna image.

Speaker 2 (07:17):
Is this something that either of you had ever heard of?

Speaker 5 (07:20):
I did not. I'd never heard of it.

Speaker 2 (07:23):
So even if.

Speaker 3 (07:24):
You're listening and you're like, what is the Lenna image,
I've never heard of this image. I've never seen this image.
Even if you don't know the story and you don't
feel like you've ever seen this image before, you kind
of do know this image, because, as Linda Kinsler puts
it in a really meaty piece for Wired that I'll
be referencing a few times in this conversation, she writes:
whether or not you know her face, you've used the
technology it helped create. Practically every photo you've ever taken,

(07:48):
every website you've ever visited, every meme you've ever shared
owes some small debt to Lenna. And it really is
exactly as you were saying, Annie, one of those stories
that is foundational to the Internet and technology, that
you don't necessarily think of, don't necessarily think of how
it came to be, And especially I think it's one

(08:08):
of those stories that says a lot about technology on
you know, here on Sminty, we've had plenty of conversations
about this. I've had many conversations about this on There Are
No Girls on the Internet, about how things like misogyny
can sort of be baked into the
foundation of technology. And I think that is one of
the reasons why tech is so often perpetuating misogyny, not

(08:31):
because it's some sort of an unfortunate bug, but because
this misogyny can be sort of foundational in some ways.
And I think this image really is a good example
of what I mean. And I think, especially as we're
having conversations about the rise of things like nudify
apps and AI-generated adult content creators, we're seeing what
is kind of becoming a marketplace that is men making

(08:55):
money off of the bodies and/or labor of women
without their consent, certainly without their compensation. And I think
this situation in the Lenna image, where the image of
a woman went on to create this entire field of
technology without her consent, can perhaps really tell us something
about where we're headed in twenty twenty four.

Speaker 1 (09:15):
Yes, absolutely, especially when you consider where it comes from,
which I know we'll talk about. But also, yeah, these
conversations we're having now about like actors perhaps not giving
their consent to being used in certain ways. And honestly,
it extends to all of us: if you've posted
an image online, right — not consenting to an image,
to an image getting used in a certain way. But

(09:37):
so much about this history is fascinating because it feels
so standardized, which is odd. Can you tell us about that?

Speaker 2 (09:46):
Totally?

Speaker 3 (09:46):
So for folks who don't know, the Lenna image is
literally an image of this woman Lenna — Lena Forsén. She
is a woman from Sweden who in the seventies was
a model. So this kind of sensual image of her
wearing a tan hat with a purple feather flowing down
her bare back, staring kind of seductively over one shoulder.

(10:07):
That image of her was published in Playboy in nineteen
seventy two. She was essentially a playmate. That image would
go on to become what's called a standard test image.

Speaker 2 (10:16):
So big caveat here. I am not an engineer.

Speaker 3 (10:20):
If I say something that — if you're
an engineer listening and you're like, that's not totally correct —

Speaker 2 (10:25):
I am not an engineer.

Speaker 3 (10:26):
But here is a definition of what a standard test
image is that I found from Kaggle dot com, which
is like a developer community site. They say a standard
test image is a digital image file used across different
institutions to test image processing and image compression algorithms. By
using the same standard test images, different labs were able
to compare results both visually and quantitatively. The images are

(10:48):
in many cases chosen to represent natural or typical images
that a class of processing techniques would need to deal with.
Other test images are chosen because they present a range
of challenges to image reconstruction algorithms, such as the
reproduction of fine detail and textures, sharp transitions and edges.

Speaker 2 (11:05):
And uniform regions.

Speaker 3 (11:06):
So basically, to put that in layman's terms, a standard
test image is like a test image that tests to
make sure that the technology is working as it should be,
or like rendering the way that it should be. Lenna's
image is not the only common standard test image. There's
also one that is like a bunch of different colored
jelly beans on a table. There's another one that's called

(11:29):
peppers that's just a bunch of different colored like red
and green peppers, like jalapeño peppers. So this is just
a thing that becomes a way for technologists to test
that the image generating technology is working correctly.
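A minimal sketch may make the quantitative side of that concrete. Assuming the Pillow and NumPy libraries are available, and with "test_image.png" as a placeholder file name rather than an actual standard test image, two labs starting from the exact same image can compare a compression algorithm's output with a shared metric such as peak signal-to-noise ratio (PSNR):

# Minimal sketch: comparing compression quality against a shared standard
# test image. Assumes Pillow and NumPy; "test_image.png" is a placeholder.
from io import BytesIO

import numpy as np
from PIL import Image


def psnr(original: np.ndarray, compressed: np.ndarray) -> float:
    """Peak signal-to-noise ratio in decibels; higher means less distortion."""
    mse = np.mean((original.astype(np.float64) - compressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(255.0 ** 2 / mse)


# Every lab starts from the exact same pixels.
img = Image.open("test_image.png").convert("RGB")
original = np.asarray(img)

# Run the algorithm under test; plain JPEG at quality 50 stands in here.
buffer = BytesIO()
img.save(buffer, format="JPEG", quality=50)
buffer.seek(0)
compressed = np.asarray(Image.open(buffer).convert("RGB"))

# Because the input is standardized, this number is comparable across labs.
print(f"PSNR after JPEG compression: {psnr(original, compressed):.2f} dB")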

Speaker 1 (11:41):
I do think this is very interesting for a lot
of reasons. But if you have like jelly beans and peppers,
those are things to be consumed. Then when you're thinking
about where they got this image of Lenna, like,
how did this happen? How did this image become this
standard testing thing?

Speaker 2 (12:02):
So this is actually a pretty interesting story.

Speaker 3 (12:04):
The story of how Lenna's Playboy picture became this standard test
image that is everywhere and very ubiquitous, starts with computer
and electrical engineer Alexander Sawchuk. According to the newsletter for
the Institute of Electrical and Electronics Engineers, or the I
triple E, as I have found out it's sometimes called.

Speaker 2 (12:22):
I was talking to somebody about this and I was like, oh,
the I E E E, and they were like, it's
just the I triple E.

Speaker 5 (12:30):
I want to say I E E E.

Speaker 4 (12:34):
Actually, the I triple E, exactly.

Speaker 2 (12:38):
So it's the summer of nineteen seventy three.

Speaker 3 (12:40):
Alexander Sawchuk was an assistant professor of electrical engineering at
the University of Southern California, working alongside a grad student
and the SIPI lab manager. As the
story goes, he's like frantically searching around the lab for
a good image to scan for a colleague's conference paper.
He had just sort of gotten bored with their usual
stock test images, because most of them had come from like

(13:01):
nineteen sixties TV standards and were just a little
bit dull. He wanted something glossy and sort of like
fresh and dynamic, but he also wanted to use a
human face specifically.

Speaker 2 (13:11):
Just then, as the story goes.

Speaker 3 (13:13):
Somebody happens to walk in holding the most recent issue
of Playboy magazine.

Speaker 2 (13:18):
Why this person was bringing Playboy magazine into his workplace,
I cannot tell you.

Speaker 4 (13:22):
Oh, good — did that just come into your workplace, your institute?

Speaker 2 (13:26):
Okay?

Speaker 5 (13:27):
Cool?

Speaker 3 (13:27):
Yeah, Like, I mean, I do think that that sort
of like gives you a sense of like the dynamics
that we're dealing with. Right, that somebody just happens to
walk in with the most recent Playboy under
their arm, right, the engineers tore away the top third
of the centerfold so they could wrap it around the
drum of their Muirhead wirephoto scanner, which they had

(13:48):
outfitted with analog-to-digital converters, one each for the red, green,
and blue channels, and an HP twenty-one hundred minicomputer.
So all of that to say, they effectively
cropped this image so that you can't see the
model's bare body in the image, so it's just a picture
of her from the shoulders up, looking over her shoulder.
It's still like quite a seductive photo, but the full

(14:10):
photo has her like bare booty in it. She's wearing
what looks like a feather boa and like thigh-high
stockings looking over her shoulder. So back in the seventies
and eighties, this image was really sort of like used
in very limited cases. You could only really see it
on dot org domains. It was pretty limited to like engineers.
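A minimal sketch of the crop-and-digitize step described above, assuming the Pillow library: keep only the top portion of a scanned page and read it out as separate red, green, and blue channels, one per analog-to-digital converter. The file name "centerfold_scan.png" and the 512-by-512 output size are illustrative assumptions, not details confirmed here.

# Illustrative sketch of cropping a scan and splitting it into RGB channels.
# File names and the 512x512 size are assumptions for the example.
from PIL import Image

scan = Image.open("centerfold_scan.png").convert("RGB")

# Keep only the top third, analogous to tearing away the top of the centerfold.
width, height = scan.size
top_third = scan.crop((0, 0, width, height // 3))

# Resample to a square test image (an assumed size for illustration).
test_image = top_third.resize((512, 512))

# Split into red, green, and blue channels, one per analog-to-digital converter.
red, green, blue = test_image.split()
test_image.save("test_image.png")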
Then in July of nineteen ninety one, the image was

(14:32):
featured on the cover of Optical Engineering, alongside that other
test image of the different colored peppers. Funny enough, I
took a look at that cover. It's all black and white,
so I'm like, oh, I think they're trying to demonstrate
that like these images had all these different dynamic colors,
but both of them are rendered in black and white,
kind of rendering that meaningless. So this is when Playboy

(14:55):
gets wind of this, and they are not happy because
it's basically copyright infringement. This is not related to
the story, but I always have to add this whenever
it comes up: how litigious Hugh Hefner and Playboy were.
I always think this is very rich because, as y'all
probably know, Hugh Hefner made an entire lucrative industry off

(15:15):
of images of Marilyn Monroe that she did for a
calendar company for which she was only paid fifty dollars.
Many years after that photo shoot, Hugh Hefner bought those
photographs from the calendar company and republished them without Marilyn Monroe's
consent or permission in nineteen fifty three. That was the
first ever issue of Playboy. Hugh Hefner paid five hundred dollars.

(15:36):
She got fifty dollars. Right, So, whenever I read about
how litigious Playboy is, which they're very litigious, I always
have to chuckle at that: oh, like, you don't
want somebody profiting off of your intellectual property, but you had
no problem profiting off of a woman's body without compensating
her fairly or even getting her consent.

Speaker 2 (15:55):
Interesting, this is like par for the course for him.
Oh my goodness, don't even get me started with Hugh Hefner.
We will be here all day.

Speaker 4 (16:06):
The things that came out after he died, which I'm like, Wow,
he had a pretty good, like powerful handle on people
not talking until he died.

Speaker 2 (16:14):
Oh my gosh.

Speaker 3 (16:15):
I was listening to an episode of Celebrity Memoir Book
Club where they read a lot of ex-Playmate and
ex-Playboy Bunny memoirs.

Speaker 2 (16:24):
Some of the things that they write about, I'm like,
oh my god.

Speaker 3 (16:27):
Like even Lenna in an interview, she talks about
how in the seventies, after this photo shoot, she was
invited to go to the Playboy mansion, and the quote
is something like, they made it clear in the invites
that I would have to spend time with Hugh Hefner
while he was in his dressing robe, and I said no, thanks.

Speaker 5 (16:47):
I mean she already knew.

Speaker 6 (16:50):
Was like yeah, yeah.

Speaker 3 (17:04):
So Playboy threatens to sue these engineers, and at this point,
the engineers, it sounds like, had like grown very fond
of using this image so they fought back. Eventually, Playboy
backed down because, as a Playboy vice president put it, quote,
we decided we should exploit this because it is a phenomenon.
So yeah, by his own words like, oh, let's exploit this.

Speaker 6 (17:24):
Yeah.

Speaker 2 (17:25):
No, talk about the fact that this is.

Speaker 3 (17:27):
Two groups of men fighting over who owns this image
of a woman, in one case being used in a
manner that is completely without her consent or control. It
just, already from the beginning, feels to
me like men fighting over how they can use a
woman's representation that I think is so foundational to some
of the conversations we're having about technology like AI right

(17:50):
here in twenty twenty four.

Speaker 1 (17:51):
Absolutely, and she did become pretty foundational, right?

Speaker 3 (17:58):
Oh, absolutely. So this is when the image of Lenna
really becomes super popular. The whole drama about the cover
catapults this image into what you might think of as
like early internet virality or popularity.

Speaker 2 (18:10):
This was in nineteen ninety five.

Speaker 3 (18:12):
The use of the photo in electronic imaging has been
described as clearly one of the most important events in its history.
It is truly hard to overstate how ubiquitous this one
image is in technology. There is this fascinating interactive piece
by Jennifer Ding at The Pudding.

Speaker 2 (18:27):
The piece is so cool.

Speaker 3 (18:28):
It's like one of those interactive pieces that has a timeline.

Speaker 2 (18:31):
Definitely check it out. But in that piece, Ding.

Speaker 3 (18:34):
Actually includes a freeze frame of the show Silicon Valley
on HBO where in the background there is a poster
with the Lenna image on the wall, right? So this
image is also included in scientific journals just all over
the place.

Speaker 2 (18:48):
Ding found that within the dot edu.

Speaker 3 (18:50):
World, so like websites related to education, the Lenna image
continues to appear in homework questions, class slides, and to
be hosted on educational and research sites, ensuring that it
will be passed down to new generations of engineers. So
this became so popular that Lenna herself is often called
the First Lady of the Internet.

Speaker 1 (19:11):
Wow, yeah, I kind of love her taking that picture
having no idea that this is what would happen. Which, yeah,
I mean, I guess that speaks to the next.

Speaker 5 (19:19):
Question, why did this image take off the way that
it did?

Speaker 2 (19:24):
Well, if you ask David C.

Speaker 3 (19:25):
Munson, who is the editor in chief of the I
E or the Tripoli Transactions on Image processing, he said
that the image happened to meet all of these requirements
for a good test image because of its detail, its
flat regions, shading, and texture. But even he will not
leave out the obvious fact that it's also a picture

(19:46):
of like a seductive, sexy young woman. Duh, right, Like
that's definitely part of it. He says, the Lenna
image is a picture of an attractive woman. It is not
surprising to me that the mostly male image processing research
community gravitated toward an image that they found attractive. And
so I do think there's something about these highly male

(20:07):
dominated spaces where it's not just that there's a lot
of men, it's like their worldviews, their interests, their perspectives,
their biases that are really taking up a lot.

Speaker 2 (20:18):
Of space in these spaces.

Speaker 3 (20:21):
I just think that men feel like these spaces are theirs,
and that they are free to decorate those spaces with
the pretty women that they feel like they
should be able to use without their consent or compensation.
And I just think, like you mentioned earlier, Annie,
the other test images are these things that you consume, right,
like peppers or jelly beans.

Speaker 2 (20:41):
There's another famous one of.

Speaker 3 (20:42):
A baboon that, like, has different colors on its face.

Speaker 2 (20:46):
It's interesting to me that it's these.

Speaker 3 (20:47):
Things that are not human, things that are like animal
or or that you consume that like throwing a sexy
young woman into that mix. I don't think maybe seem
like a huge departure for these guys.

Speaker 1 (20:58):
Yeah, and again, when you think about things in the
realm of AI or even I know I've complained about
this many times, but in the worlds of fandom or gaming,
it's like, you can come into our
world on our terms and you wear what we want
you to wear. You are here because we let you
be here in this male dominated space, but you're gonna

(21:19):
do what we want. It's not up to you. And
that's the only way that you can be
in this world. But that being said, there has been
some pushback recently, ish right, Bridget.

Speaker 2 (21:35):
Yeah, So one thing about what you just said.

Speaker 3 (21:38):
When I was researching for this episode, some of the
different engineers who had contributed to this image's popularity, they
were quoted when they actually met the actual, real Lenna
at a conference that she was invited to. They were like,
I can't believe she's a real person. A part of
me was like, you didn't even see her as a
real human. They just saw her as an
image, a picture that they had

(22:01):
been consuming for decades, and they had so removed her
from being a real, breathing human that meeting her in
real life was like they were surprised that she was real.
And I think that really speaks to the sort of
fandom element that you were talking about, this idea that
like you can come in if you are a fantasy and
in some ways not even a real human.

Speaker 2 (22:23):
You know what I'm saying.

Speaker 1 (22:26):
You're like, do I ever, Yeah, Like, don't say anything
that I don't like, like, keep quiet and look.

Speaker 5 (22:32):
The way I like, then you can be here. But
oh you're a real person.

Speaker 2 (22:38):
Oh no, I don't.

Speaker 5 (22:38):
Want you here at all.

Speaker 3 (22:41):
Yeah, So you're exactly right, Annie. All of this happened,
but it was not without pushback. Around like the twenty tens,
people started publicly asking whether or not this image of
a woman from Playboy should be so foundational to technology,
especially in education settings, you know, given conversations about the

(23:01):
need for more women in these spaces and how to
make these spaces more inclusive and more diverse.

Speaker 2 (23:06):
That's really around when you start.

Speaker 3 (23:07):
Hearing like people in public being like, wait a minute,
maybe this isn't so cool. In twenty fifteen, Maddie Zug,
who was then a student at the Thomas Jefferson High
School for Science and Technology right here in DC, where I live —
who I should say now is a product safety engineer
at Apple who focuses on preventing tech-enabled abuse
and stalking and harassment on Apple platforms.

Speaker 2 (23:26):
So like go Maddie.

Speaker 3 (23:27):
Maddie sounds like she was cool in high school and
is cool now. So Maddie wrote this op ed basically
asking the question of like, should I, as.

Speaker 2 (23:35):
A high school student at a STEM.

Speaker 3 (23:38):
High school, be given an image from Playboy as part
of my education in technology and STEM? She writes, I
first saw a picture of Playboy magazine's Miss November nineteen
seventy two a year ago, as a junior at TJ.
My artificial intelligence teacher told our class to search Google
for Lena Söderberg — not the full image though — and use

(23:59):
her pic to test our latest coding assignment. At the time,
I was sixteen and struggling to believe that I belonged
in a male dominated computer science class. I tried to
tune out the boys' sexual comments. Why is an advanced science, technology, engineering,
and mathematics school using a Playboy centerfold in its classrooms?
Her piece ends by saying it's time for TJ to

(24:20):
say hello to inclusive computer science education and say goodbye
to Lena. So Maddie was not the only person who
was like, maybe this image shouldn't be the thing that
all of our education is centered around. In that piece
for Wired I mentioned, they talked to several women in technology
who had very similar stories. This one is actually pretty funny.
Deanna Needell, a math professor at UCLA, had similar

(24:41):
memories from college. So in twenty thirteen, she and a
colleague staged a quiet protest. They acquired the rights to
a headshot of the male model Fabio Lanzoni and used
that for their imaging research. So they kind of like
turned it around, like, oh, you're gonna use a sexy woman,
Well we'll use a sexy man.

Speaker 2 (24:59):
What do you think about that?

Speaker 5 (25:02):
I love it.

Speaker 3 (25:03):
So in that piece they actually track down and speak
to the real Lenna, who also called for her image
to be retired. She says, I retired from modeling a
long time ago. It is time I retired from tech, too.
We can make a simple change today that creates a
lasting change for tomorrow. Let's commit to losing me. And
there's actually some news on that front, because as of

(25:26):
April first of this year, the I triple E officially
retired the use of the Lenna image and announced they
will no longer be using that image in their publications.
Ars Technica points out that this is kind of a
really big deal that will likely have a ripple effect
in the space. Because the journal has been so historically
important for computer imaging development, it'll likely set a precedent

(25:47):
for removing this image from common use. In an email, a
spokesperson for the I Triple E recommended wider sensitivity about
the issue, writing, in order to raise awareness of and
increase author compliance with this new policy, program committee members
and reviewers should look for inclusion of this image, and
if present, should ask authors to replace the Lenna image
with an alternative.

Speaker 1 (26:08):
Yeah, I love that from Lenna herself. Like, let's commit
to losing me — that's such a great line. But it
speaks volumes, as you've been saying,
Bridget, about our attitude towards women on the internet and
towards consent on the internet. And so when we're thinking

(26:29):
about this image, which was foundational — what do you think
about the legacy of this image?

Speaker 3 (26:38):
Yeah, I love that question. You know, when I was
reading about how this image came to be, I was imagining
a very different time, right? It's the seventies. People aren't
necessarily having a lot of public loud conversations about the
power dynamics of who's in the room and who's not
in the room where a lot of this technology is
getting built. And it really made me think of like, wow,

(27:00):
the seventies, that probably was such a different time. But
here in twenty twenty four, we are having those conversations,
loud voices, are publicly having those conversations. There are women
and people of color, and trans folks and queer folks
and all kinds of folks who are building and making
the technology that shapes our world today. And so in
twenty twenty four, it almost feels like we are pretending

(27:23):
that we're still in this nineteen seventies, we-didn't-really-know,
who-could-have-foreseen-it world, when in fact
we're not really in that world. People are asking the
questions, people are raising the alarm, and I guess I
don't think it should be several decades after AI technology
becomes ubiquitous for people to start asking the question about

(27:43):
how traditionally marginalized people like women are being used and
represented and perhaps exploited without their consent in these spaces.
I think it provides a really interesting precedent for what's
going on here in twenty twenty four.

Speaker 2 (27:56):
And Jennifer Ding put it really well.

Speaker 3 (27:59):
She writes: to me, the crux of the Lena story
is how little power we have over our data and
how it is used and abused. That threat seems disproportionately
higher for women, who are overrepresented in Internet content but
underrepresented in Internet company leadership and decision making. Given this reality,
engineering and product decisions will continue to consciously and unconsciously

(28:22):
exclude our needs and concerns. Right? And so I really
agree with that, that this Lenna story really is a
story about power dynamics and who is represented in technology
and who just sort of has their needs
exploited or erased.

Speaker 2 (28:36):
Right Like.

Speaker 3 (28:38):
Men wanting to consume the bodies of women is like
foundational to the Internet. It's like why we have the
Internet the way that we have it, and I think
we know that now it's like an objective fact about
the Internet and technology. I don't think we can keep
making technology that is not honest about that, because if

(28:59):
we're not being honest about that, we can never fix that,
we can never question that, we can never have that
be a dynamic that we stop perpetuating with technology.

Speaker 1 (29:08):
Yeah, And I think it's like going back to the
point about being in a classroom setting and being shown
explicitly, like, this is how women are viewed in this space.
This is what built a lot of what we use today,
and that we're still talking about it is telling in itself,
and especially when we're seeing that perpetuate in all of
these tech spaces where it still feels in a lot

(29:31):
of ways, even though women and marginalized people have built
those spaces that like you're the guest here and you're
only here because we're opening our gates a little bit
to let you in, but otherwise, yes, get out.

Speaker 3 (29:47):
And I just think that's a dynamic we need to
be questioning in twenty twenty four. And I think, like,
something about the use of this image, its ubiquity
in education spaces, I find so telling. But also, even
if you're not studying to be an engineer or something,
I think there is a dynamic that says that if
you are a person who is traditionally marginalized, you're not

(30:08):
a decision maker, you're not a power holder, you're not
doing or making anything that anybody needs to care about.
And the entire dynamic is that we use you. In fact,
Ding actually points this out in her piece. She says,
while social norms are changing toward non consensual data collection
and data exploitation, digital norms seem to be moving in
the opposite direction. Advancements in machine learning algorithms and data

(30:31):
storage capabilities are only making data misuse easier, whether the
outcome is revenge porn, or targeted ads, surveillance, or discriminatory AI.
If we want a world where our data can retire
when it's outlived its time, or when it's directly harming
our lives, we must create the tools and policies that
empower data subjects to have a say in what happens

(30:51):
to their data, including allowing their data to die. And
so I think, even if you're not somebody who is
a techie, this dynamic should concern you — this dynamic that just says,
we consume, we exploit, we make money from you, and
you don't get to have a say about it. That's
the dynamic that I think this Lenna image really did
usher in without really even necessarily meaning to.

Speaker 4 (31:22):
I think there's a big conversation here on like the
power of capitalism within the tech industry and what makes money.
I can't help but think, like with the Lenna image,
the fact that this toxicity was used to make more
profit and more power within this industry. It took forty
to fifty years for there to even be a conversation about,

(31:43):
like let's change it, let's retire it. But the fact
that it had that much pushback because they didn't care
enough and they wanted to build on this toxicity because
they knew it could make money is the most concerning
thing to me. And then the powers that be are
saying that, yeah, yeah, we're definitely going to control this
and then just goes after an app instead of the
root of the problem. It seems like the biggest part

(32:03):
of the conversation because even in the AI world, with
new apps coming through, new programs coming through, and they're
all competing with each other, they don't want to let
go of the toxicity. But that's what's making the money,
which is really really concerning.

Speaker 3 (32:18):
Yeah. And I mean, if there was one "so what"
of why I wanted to have this conversation, Sam,
that is exactly it: it is about money. It
is about capitalism. It is about making money off of
people's own exploitation and selling that exploitation back to them to.

Speaker 2 (32:33):
Make more money.

Speaker 3 (32:34):
And it's just a really toxic dynamic that I believe
is harming us and making the people who have created
that dynamic rich all the while they get to be like, Oh,
it's not a big deal for you.

Speaker 2 (32:48):
Actually, this is going to be really good for you.
This is going to be convenient for you.

Speaker 3 (32:51):
And I don't know, Like I woke up this morning
when I was trying to decide like what I wanted
to talk to you all about today, and one of
the ideas that I had that I scrapped was just
this feeling that being on the internet just doesn't feel
fun anymore. Anytime I go on a website, anytime I
google something just to find out information, it feels like

(33:12):
a scam.

Speaker 2 (33:12):
It feels like exploitation.

Speaker 3 (33:14):
I feel like I am one click away from somebody
getting my social Security number. It feels like AI generated garbage,
and I just think we have hit the wall of
that feeling.

Speaker 2 (33:26):
I can't imagine that I'm alone in this.

Speaker 3 (33:28):
I think the feeling of showing up online today
in twenty twenty four feels exhausting, and I think part
of it is because it feels like we are being
bled dry by people that we have already made rich
from our own exploitation.

Speaker 2 (33:42):
Do you all feel that way?

Speaker 4 (33:44):
Absolutely. I think so, because getting on TikTok, the first
opening video — I'm sure you've seen it — is that content
manager who's like, I'm here for the safety of TikTok.
Have you seen this?

Speaker 2 (33:54):
I have not.

Speaker 4 (33:55):
She's been there. She is for safety in something, like
she has a very specific safety title — Susie someone. She
is very white and she's very redheaded. I was like, okay,
so we've played into the xenophobia. She's like, look,
I'm a white person. I'm gonna help you out here.

Speaker 2 (34:09):
Don't worry. Don't worry.

Speaker 6 (34:10):
I got this.

Speaker 4 (34:12):
But that's the first thing that I'm seeing, So, like,
you know, urging TikTok users to talk to the government
because they voted this in and this is real bad
and all this and whatnot. And I'm just like,
all right, what's gonna go away next? This is now
my attitude, because also I'm very tired. But also, I
just got an email saying that AT&T — yes —
had a breach. They have

(34:34):
your stuff. But good news: since you don't have a
bill with us, you didn't even
lose any pertinent information. But I literally think every month
I have been getting an email that says some
of my information has been breached, and it's nothing
that I have done. It is literally everything from my insurance,
my dental insurance, my healthcare provider, my internet, which I'm like,

(34:58):
what the hell, my phone subscription, my cell phone, which
I'm like, I'm starting to get back to that. I
think I want a landline at this moment, y'all.
With each of those things popping up, I'm like,
I have to use that information in order
for me to have healthcare. So, y'all, they let my healthcare
information go out and they have my Social Security number.
There's nothing I can do about that as many times

(35:19):
as I can change my password. The next email I'm
getting is telling me that I've got a data breach
of my information. So what is the point? Like, at
this point, the only way is to rewrite my identity
and to never get online again, which would be really
hard for my job.

Speaker 3 (35:38):
Yes, Like if you have a phone in your life,
if you vote, if you drive, these things that we
are required to do to participate in public life should
not just be avenues for somebody to make money and
scam us.

Speaker 2 (35:54):
But yet it feels that way.

Speaker 3 (35:56):
And you know what, Sam, I have actually not seen
the TikTok that you're referring to, because I have not opened
my TikTok app in days. Because it's starting to feel
like QVC and I cannot take it anymore. Like whatever
happened to spaces on the Internet that were supposed to
feel like safety or exploration or fun or community or connection.
And I hope that somebody out there listening is like,

(36:18):
Bridget, you're old and unhip.

Speaker 2 (36:19):
We have those spaces, they exist — tell me about them. I want to know about them.
I want to know about them.

Speaker 3 (36:24):
But I think that we really got to
get back to those principles of the Internet,
feeling like something other than being taken for a ride
on which you are the chump, right.

Speaker 4 (36:37):
And I will say a lot of people have felt
like Discord and Reddit have been like a haven,
but we already know Reddit has got its problems.
And then I think there's a new lawsuit with Discord,
with its problems, and its terms of service changing as well.
I'm like, what? It totally has happened. So there's literally no
one protecting the individual. Like, there's no protection

(36:57):
for us at all, but they want us to stay,
they want to take things away from us, which
is like the least of our worries, or they're just
like, sorry, you can't sue us.

Speaker 3 (37:09):
Yeah, I think everybody is feeling that, but I think
it is particularly dangerous for people who are traditionally marginalized
because — yeah — there's just the expectation that, oh, it's
totally fine. If people make apps that non-consensually
undress women using AI, why wouldn't they be able to

(37:29):
advertise on Facebook or Instagram or Twitter.

Speaker 2 (37:31):
They got to make money, that's a business.

Speaker 3 (37:33):
Like how easy it is to erase the human beings
at the heart of this dynamic. Erase their concerns, erase
their needs, erase their harm because men got to make
money off of it.

Speaker 4 (37:45):
I'm sick of it, right? And there's tradition, literally like, yeah,
this image has always been here, we need to teach
it as historical now. It was definitely not exploiting
somebody or taking advantage of somebody or using humiliating content —
because she wasn't humiliated, I don't think. But like the
idea of it being forever and ever and ever,
of like your seductive picture being used by IT people,

(38:10):
which is a whole different conversation in itself.

Speaker 2 (38:12):
Yeah, I mean, so Lenna, the real life Lenna.

Speaker 3 (38:15):
And again there's a really interesting Wired article that has
an interview with her.

Speaker 2 (38:18):
She doesn't feel like she was exploited.

Speaker 3 (38:20):
She's actually really proud of that image, even as she
recognizes that it's like time for it to be retired. However,
she does wish that she had been fairly compensated for
what would go on to be her like non consensual
contributions to tech when she took that image. There's no
way that as a you know, young playboy playmate in
back in nineteen seventy one or whatever, you would

(38:40):
have a sense of like, well, if this goes on
to be to make me the first lady of the Internet,
I better have compensation and protections, no way, right. So
in that Wired piece they say it makes sense that
she would feel this way. Unlike so many women in tech,
Lenna has at least been acknowledged, even feted for her contribution.
She did that work, and the people started using that

(39:00):
photo in this neat new way, and now she has
this kind of immortality woven into the design of the machine.
This is from Marie Hicks, a historian of technology and
the author of Programmed Inequality.

Speaker 2 (39:11):
All of this happened for a reason.

Speaker 3 (39:13):
Hicks writes, if they hadn't used a Playboy centerfold, they
almost certainly would have used another picture of a pretty
white woman. The playboy thing gets our attention, but really
what it's about is this world building that's gone on
in computing from the beginning. It's about building worlds for
certain people and not for others.

Speaker 4 (39:29):
I find it interesting too that they invited her to
the conference. Like, I'm wondering what the purpose was, other
than — because it obviously wasn't to ask
her questions about tech and how she did this thing,
because they did not even consider her human, as we know.
It was just literally to ogle her in real life.

Speaker 2 (39:49):
Yeah, I was thinking about why they did that.

Speaker 3 (39:53):
I don't know. Part of me wonders if it
was like an attempt to be like, oh, we need
to acknowledge the way that this woman's image was so
foundational to our technology, but then like not really doing it,
like still sort of treating her as like a booth
babe or something like.

Speaker 4 (40:09):
I don't know — right. I just find all of that
interesting, on this level of, again, that's not
what she was doing this for. She came in with
like whatever her ambitions were in being this model and whatnot,
and then all of a sudden being told you're being
used as an example for computers, like especially images

(40:31):
for computers, and not only will you see this, but
your grandkids will also — like, if she has children, like
any of those things — and your family members.

Speaker 3 (40:39):
Forever. And like, who would have ever thought
that that's how that image would go on to be
used in history? And I really think like this is
where we are today, and this is like why I
wanted to talk about this is that I think, like
the idea, the concept of images being shared online, the
way we understand that in twenty twenty four, the fact
that this image of Lenna became so foundational to that

(41:02):
concept without her consent, you know, perhaps without like proper
compensation for the way that she actually was foundational to that,
and building out this entire universe around it that is
mostly like controlled and protected and profited off by men,
and nobody stopping to ask about the ramifications of that
until decades later. I just think it really establishes like

(41:25):
a concerning precedent for where we're going right now with
AI in twenty twenty four, And it doesn't have to
We can learn from what we did with that Letta
image if we ask the right questions, if we cent
her the right perspectives and the right voices, and so Yeah,
I don't want to wait until twenty forty to be like, oh,
should we have been talking about the ways that women
and girls and other marginalized people are being exploited and

(41:47):
used to make technology companies money.

Speaker 2 (41:50):
I don't want to ask that question when it's too late.

Speaker 4 (41:53):
Right. And here's like the big conversation: shouldn't we
also recognize that big companies, big tech companies and big
companies that are developing, are purposely leaving out marginalized people
because they like the old ways, and in that it's
only making a certain number of people money.

Speaker 3 (42:09):
Yes — I think, I would argue, that's
exactly what's going on.

Speaker 2 (42:13):
I mean, in twenty twenty four, there are so many.

Speaker 3 (42:16):
Loud, thoughtful voices from women and people of color who
are really talking about AI in some interesting and thoughtful ways.
So they exist, they are out there. This is the
tale as old as time when it comes to technology.
It is not that they are not there. It is
that they are being, whether intentionally or unintentionally, marginalized, sidelined, silenced,

(42:37):
pushed aside to make room for voices who are just
repeating the status quo, who are just saying like, well,
I'm trying to get rich, so who cares how this harms
somebody, who cares about whether or not this goes on
to exploit.

Speaker 2 (42:52):
And I think that's really, it's really.

Speaker 3 (42:53):
Like a little bit of a complicated cultural
dynamic and cultural shift that I think we really
got to break.

Speaker 1 (43:00):
Yeah, Yeah, And it's really sad, going back to your point,
Bridget of like the internet not being a place of
joy anymore, because so many times it was marginalized people
who made those spaces because they couldn't.

Speaker 5 (43:13):
Find them anywhere else.

Speaker 1 (43:16):
And then these companies come in and are like, Okay,
well we can make money, and then it doesn't become
a joyous space anymore. It becomes a very
toxic place. And so like hearing this story and seeing
how so much of what we use still is based

(43:38):
on something that was.

Speaker 5 (43:40):
A guy walked in with the Playboy.

Speaker 1 (43:42):
Magazine — like, it's bad that that doesn't feel
so out of place in what we're talking about in
our current time.

Speaker 3 (43:53):
Yeah, and again, I mean, I opened up our conversation
with this, but I guess I'll close
with that too. I believe it. When I say this,
people think I sound alarmist or extreme, but I mean
it the way that I mean it. I think that
these things are features, not bugs. I think we've got
to be honest about the ways that things like misogyny
and exploitation, particularly when it comes to marginalized people, have

(44:15):
been foundational to technology and the Internet.

Speaker 2 (44:18):
From the very beginning. I love the Internet. I love technology.
It is why I do the work that I do.

Speaker 3 (44:24):
But I think that until we are honest about that,
that these things are features and not bugs, we will
never get anywhere. And so I think that it really
starts with having honest conversations about where we started so
that we can get to a place that
actually feels a little bit better for everybody.

Speaker 1 (44:40):
Yes, yes, well, thank you so much. As always, Bridget,
every time you come on, I'm like, we could talk
for hours about this and this and this.

Speaker 3 (44:51):
Invite me back for an episode just dragging Hugh Hefner.

Speaker 2 (44:54):
Yeah, I'll be here for it.

Speaker 4 (44:56):
I think we need to do this, because I've been thinking
about this for a minute — about going back into the magazine world
and jumping into like all of that.

Speaker 3 (45:04):
Don't even get me started. I mean, like, spoiler alert:
I, like, totally had this wrong. For so long in
my life, I was like, oh, Hugh Hefner was a
champion for free speech.

Speaker 2 (45:13):
And the riots and blah blah blah.

Speaker 3 (45:15):
Then I grew up and learned and I was like, actually,
he wasn't such a good guy.

Speaker 4 (45:19):
Right, I Mean, we really fed into the but read
the articles so so good.

Speaker 2 (45:24):
Oh my god, the corbell.

Speaker 1 (45:27):
Yes, yes, oh yes, please come back for that Bridget.
In the meantime, where can the good listeners find you?

Speaker 3 (45:37):
Well, you can listen to my podcast, There Are No
Girls on the Internet. You can follow me. I'm not
really on social media that much anymore, but you can
try to find me there. I'm on Instagram at Bridget
Marie in DC. I am on Bluesky at Bridget
Todd, on Threads at Bridget Marie in DC, sometimes on TikTok.
I'm easy to find. Google me. You'll
find me. Yes, google me.

Speaker 4 (45:58):
That's a flex.

Speaker 5 (46:00):
It's true though. Our listeners are smart. They can find
you. And listeners,

Speaker 1 (46:05):
If you would like to contact us, you can.
Our email is Stuff Media Mom Stuff at iHeartMedia dot com.
You can find us on Twitter at Mom Stuff Podcast,
or on TikTok and Instagram at Stuff I

Speaker 5 (46:13):
Never Told You.

Speaker 1 (46:14):
We're also on YouTube. We have a TeePublic store
and a book you can get wherever you get your books.
Thanks as always to our super producer Christina, our executive producer
Maya, and our contributor Joey.

Speaker 4 (46:22):
Thank you, and.

Speaker 1 (46:23):
Thanks to you for listening. Stuff Mom Never Told You
is a production of iHeartRadio. For more podcasts from iHeartRadio,
you can check out the iHeartRadio app, Apple Podcasts, or
wherever you listen to your favorite shows.
