Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Hello, Bridget here, and it has been a minute. I
have missed talking about the intersection of technology and identity
and democracy with you all and the cool Zone team
so much, and popping back in to share an episode
of the podcast Be the Way Forward, an exciting new
show from anitab dot org, a global organization for women
and non-binary people in technology. I was the inaugural
(00:25):
guest on the podcast, where I had a great conversation
with host Brenda Darden Wilkerson, President and CEO of anitab
dot org. So take a listen and subscribe to be
the Way Forward if you like it, Thanks so much,
talk to you soon.
Speaker 2 (00:44):
The fact that a very narrow band of the population
designs most of the tech that we use and see
each day is kind of a crisis. I'm Brenda Darden Wilkerson,
CEO and President of anitab dot org.
Speaker 3 (00:57):
Our mission is to ensure.
Speaker 2 (00:58):
That the faces of those who create tech mirror those
of us who use it. That's where Be the Way Forward,
a new podcast from anitab dot org and me comes in.
I'm talking to the change makers to deliver the powerful
and empowering conversations we all need to hear. Join me
and let's be the way forward together. My name is
(01:25):
Brenda Darden Wilkerson. I'm the president and CEO of anitab
dot org. We are a very large organization that's dedicated
to empowering women, non binary people, and other underincluded people
in the industry that has the greatest impact on how
we see our lives, which is technology. You know, we're
known for the Grace Hopper Celebration, which is literally the
(01:47):
largest women in tech gathering. Women come from all over
the world annually to get together.
Speaker 4 (01:53):
And what we.
Speaker 2 (01:54):
See happen there is when people get together with ideas,
they find their people, they find their space, and amazing
things happen. New businesses are launched, people find new opportunities,
those relationships are built, and out of it comes solutions
and most importantly, hope. What we hope from this podcast
(02:15):
is instead of that happening annually, it's going to happen
all the time. So we're going to talk about the
stuff that's problematic. We're going to talk about the things
that are making us little nuts, but we're also going
to talk about the ways tech really was envisioned to be.
Speaker 4 (02:31):
It was envisioned to be a space.
Speaker 2 (02:32):
Where the dreamers and the doers and the makers could
imagine making our lives better, making the world different, finding
ways to solve problems. So these conversations are about putting
us back in that place where we dream,
because when tech works, it has to work for everybody,
(02:54):
and that means that everybody needs to be at the table.
Speaker 3 (02:56):
So as we kick off.
Speaker 2 (02:57):
The series, I can't think of a better voice to
join us on the first episode than Bridget Todd. She
is amazing. She is the founder of Unbossed Creative, a
studio championing social change through digital content. She's also the
host of season six of Mozilla's podcast IRL, All About
(03:18):
AI and one of my favorites. She's also the host
of her own critically acclaimed podcast, There Are No Girls
on the Internet. She's been in the trenches doing the
important work. She's got some important things to share. So,
without further ado, I bring you my conversation with the
insightful and ever inspiring Bridget Todd. So, Bridget Todd, I
(03:42):
get to speak with you today. I'm so excited and
I would love for you to tell our listeners who
is Bridget Todd and.
Speaker 3 (03:51):
How did you become you?
Speaker 4 (03:53):
Oh? What a question.
Speaker 1 (03:55):
First of all, I'm really grateful to be here,
really excited and honored. Yeah, Bridget Todd is a big
weirdo who became herself thanks to the power and
openness of the Internet. You know, I always say that
the reason why I care so much about technology and
social media and the health and safety and state of
(04:17):
our Internet is because if it wasn't for those
things when I was growing up, I certainly would not
be who I am.
Speaker 4 (04:23):
You know, I was this.
Speaker 1 (04:24):
Little weirdo kid in the middle of Midlothian, Virginia,
which is a small town you probably haven't heard of. It's
a little bit more developed now, but it's right outside
of Richmond, Virginia, and I grew up feeling like a
little alien. And if it was not for the Internet
and the power of expression, the power of using the
Internet to safely connect with others, I don't think I
(04:45):
would be who I am today. And so I want
to make sure that the folks who come up behind
me also get that freedom of expression and self exploration
that I had to become who they're going to be.
But I know we can't get there with an Internet
that is full of garbage.
Speaker 4 (05:01):
How I came to be who I am.
Speaker 3 (05:03):
I love it. I love it. Okay, so two weirdos here.
I feel the same.
Speaker 2 (05:08):
Way, right. And you know, I'm going to count that
as one good thing that tech is doing, right, if
it could free a soul like you and give you
the power.
Speaker 3 (05:19):
To have the impact that you have.
Speaker 2 (05:21):
Okay, all right, one on the good side
for the Internet right now. So you've got this
podcast, it's called There Are No Girls on the Internet.
Speaker 3 (05:33):
First, why did you call it that?
Speaker 4 (05:35):
Oh?
Speaker 1 (05:36):
I called it. I called my podcast There Are No
Girls on the Internet because I could not conceptualize a
future where I would be saying this ten times a day.
And it's a real mouthful.
Speaker 4 (05:47):
No, no, I mean, it is a mouthful, but no.
Speaker 1 (05:50):
I called it that because when I first started getting
really into the Internet, you know, on message boards and
things like Reddit, I came across what they call one
of these like informal rules of the Internet. I think
it's rule number forty seven, which says there are no
girls on the Internet. And so there are sort of
two ways of thinking about that rule. One is the
literal one: there are no girls on the Internet.
(06:12):
If you encounter somebody who says they are a woman online,
it's actually not a woman. That person is lying to
mess with you. That's one kind of literal interpretation. The
other is this idea that our identity does not matter
online and we get online, whatever identity markers we have,
we leave them at the door. And the thing is
(06:32):
both of those things, either iteration of it, are incorrect.
Marginalized people, women, queer folks, folks of color, black folks.
We have been on the Internet and at the earliest
foundational stages of technology since the very beginning. It would
not exist without us. And so I kind of realized, wow,
I have let myself believe this lie for so long
(06:54):
that people who look like you and I are outliers online,
when in fact it is our rightful domain. And if
we don't hear our voices, or see our stories and
perspectives presented, that's either because we were overlooked or
intentionally erased. And so the name of the podcast is
sort of reclaiming this idea. When people say there are
(07:15):
no girls on the Internet, I want to have
an archive, that's like, oh, here we are expressing our
perspectives and talking about our work and really taking up
space when it comes to conversations about technology. So it
really kind of started as like an inside joke with myself.
Speaker 2 (07:30):
Yeah, yeah, yeah. And it is really critical
that we tell people over and over again we are here,
We've always been here. We are part of the magic
that is the Internet. But it's like it has to
be said. It's like a marketing piece. We have to
say it over and over again because there's this other
(07:51):
thing out there that says to the contrary.
Speaker 3 (07:54):
And we know, both you and I know.
Speaker 2 (07:55):
I mean, I won't talk about how old I am,
but I've been around a long time and was a part
of all of the beginnings of this stuff. So that's amazing.
So what kind of things, though, do you talk about
on your podcast?
Speaker 1 (08:08):
You know, I wanted to make sure the podcast was
not just a monument to all of the crappy stuff
that we had to deal with on the Internet, things
like marginalization and erasure, harassment, disinformation campaigns, all of that.
That's certainly part of our story online, but it is
not the only story. And so I wanted to capture
both the ups and the downs, like those days on
(08:30):
early Twitter where we're all cracking jokes late at night.
That's part of the magic of what it means to
be a marginalized person online, just as much as the
rough stuff as well. And so I really wanted to
create a space where we could talk about our experiences,
our contributions, be they horrible, like the kinds of things
we should not be facing, the ways that we're fighting
(08:51):
back against those things, and just the ways that we're
using the Internet to connect and build art and spread
joy and have engagements and build movements. You know, the
story of the way that we show up
online is such a multifaceted, layered, nuanced one. And it's
those nuanced, nitty gritty points that I find so
fascinating: how the same tool, the same space, can
(09:16):
both foster some of the lowest moments that I've ever had,
some of the scariest moments I've ever had, but also
some of the highest moments I've ever had,
some real moments of genuine connection. And I'm so fascinated
with how this one domain is able to provide both
those experiences for marginalized folks.
Speaker 3 (09:33):
I totally agree. I totally agree.
Speaker 2 (09:36):
I like that when you're talking about like the early
days of Twitter. And even, you know, I want to
say Twitter now, there's still such a big presence for
some of us out there. It's like, I don't know,
I'm being rebellious. I don't want to give up my
space just because right because it was it was a
space that we used before the advent of that person
(09:58):
we don't want to name. Yeah, And so you know,
just for a moment, let's talk about some of that
negative I just kind of want to get your feel
for it. So one of your guests that was on
the show talked about the impact of how things are tested,
especially using the Internet, and the quote was mass infantilization
(10:22):
of entire populations.
Speaker 3 (10:25):
And what I read there was that there.
Speaker 2 (10:27):
Was this taking away of the freedom to choose, and
the taking away of the freedom to really have the
knowledge that is necessary to be able to choose,
and even having a choice, it's like either you use
it or you don't.
Speaker 3 (10:42):
And one of the things that I loved about the
conversation was bringing in the point of view that AI
and the Internet and tech in general is not just
about the tech. It's about people. It's about human questions,
it's about human rights questions, and so I would love
to hear you speak to that whole mess there.
Speaker 1 (11:05):
Well, I think you put it so correctly, right.
We are told time and time again by tech billionaires,
most of whom, let's be real, are like white cis men.
cis men. We are told time and time again that
it is about tech, when you and I know that
it is about people, relationships, values, how we live our life,
(11:25):
our day to day. When I wake up, the first
thing that I reach for is a
piece of technology. When I go to order my groceries,
I'm looking at the phone to be like, oh, should
I order it on Uber Eats or should.
Speaker 4 (11:36):
I go to the store.
Speaker 1 (11:37):
Right, when I vote, I am getting information about how
to do that online. Technology has so much impact on
how we live our day to day, from how we
just get our food in the morning, to how we
get our.
Speaker 4 (11:49):
Information to everything.
Speaker 1 (11:50):
So telling me that it is just about tech, we
can feel that that's not true, just based
on how we live our lives. And so we are
being told that tech leaders who don't have to listen
to us, don't feel accountable to us, know better than us,
and they can tell me, oh, well, we're going to
start testing out this new technology that hasn't really been
(12:11):
proven to be safe, like Tesla's self driving technology, right,
Like anybody who owns a Tesla can just test that
out despite still being.
Speaker 4 (12:20):
In beta mode on a public street.
Speaker 1 (12:22):
Well, I never signed a form saying that I was
going to be a guinea pig to make Tesla safer.
I never signed a form that I was willing to
put my safety on the line to, you know, get
us to a future where this technology is more safe.
I never signed that, yet it's my reality every day.
And so I think that we need to get to
a place where tech leaders understand that even if I'm
(12:45):
not an engineer, even if I'm not like a techie,
I still am an expert in my experience. I use
this technology every day, and this technology uses me every day.
Speaker 4 (12:54):
So therefore, these.
Speaker 1 (12:55):
Tech leaders who are making so much money off of us,
they need to be accountable to us as well. And so what
I'm really hoping to have is this cultural shift that says, no,
Sam Altman and Mark Zuckerberg and Elon Musk don't know
better what's best for me than I do. I am
the expert of my experience and I am allowed to
take up space and say that these people, they have
(13:16):
tricked us into thinking that they are smarter than us,
that because we didn't go to Harvard or are not engineers,
we couldn't possibly understand these conversations, that we should just
stay out of it and trust them, when these conversations
do so much harm and have so much impact in
our lived experiences. And it really reminds me of this
Zora Neale Hurston quote: if you're silent about your pain,
they'll kill you and say you enjoyed it. And I
(13:38):
don't think we should be silent anymore.
Speaker 2 (13:40):
I don't think we should either. And you know, there
are issues where people are feeling threatened about sharing their opinion.
And when you talk about AI and the surveillance society
and the devices that are listening and the types of
things that are being gathered, there is this concept of
(14:01):
you know, what if I say something, how is it
going to impact me? So I think that's a really
important thing for us to think about and to investigate.
In line with all of this, I would love for
you to give our audience a thirty-thousand-foot point
of view of tech. You know, who was it
built by? And who was it built for? And what
(14:23):
does this mean? I think you've started talking about it.
What does it mean to all of us, especially those
of us who are in marginalized communities.
Speaker 1 (14:33):
Yeah, I love this question. From the very beginning,
tech was built by us. It was built by people
who are traditionally marginalized, people that you don't think of
as being integral to tech. You know, women, women of color,
especially black women, queer folks, trans folks. We were at
the foundational levels of every stage of the personal computer
(14:54):
becoming ubiquitous. Back in the day, computing was, you know, women;
human women were computers, right, and so it was
so closely associated with, quote, women's work. It
was almost sort of secretarial, right. And so back
then, it was not unusual for
people who we think of as being marginalized in
(15:15):
tech now to be doing the work.
Speaker 4 (15:17):
And it was a lot.
Speaker 1 (15:18):
Of the dark arts of marketing in the eighties that
made folks think like, oh, well, computers are for men.
Computers are going to replace your annoying, nagging
secretary or whatever; you can replace her with a computer
or a fax machine. And on top of that, really
hostile workplace policies at tech companies that did push marginalized
(15:39):
people out. And so you know that was intentional work
done by humans that continued to further marginalize us even
when we were taking up space in these domains. And
so tech was built by us. Humans made decisions that
were hostile to us and either sidelined us, marginalized us,
or erased us, or pushed us further to the margins,
(16:00):
and I think today we are still having to beat
that drum of like, no, if you are a marginalized
person in tech, you are right where you're supposed to be.
If you're taking up space in tech,
you are, you know, right where you're
supposed to be. These are our spaces. We're not like
pushing our way into a boys club. We were pushed
(16:21):
out and we deserve to take up space here. And
so it's really telling having this conversation today when there's
so much shifting happening on social platforms
like Twitter, where, when Elon Musk took over, it seems
clear that the thing that he cared about was not
things like community building or you know who's fostering actual
(16:41):
engagement on platforms, because that's marginalized people.
Speaker 4 (16:44):
Like, full stop, end of sentence.
Speaker 1 (16:46):
These platforms would be nothing if not for the engagement
of things like Black Twitter and other, you know, marginalized
people taking up space there. But he has done nothing
but, like, pooh-pooh that; to him, the only thing that matters
is engineering. Isn't it interesting that yesterday his new CEO
had a whole thread about how the important thing about
Twitter is the community, you all built Twitter. And it's like,
(17:06):
oh, interesting how you can spend six months denying that
when it doesn't serve you, and then
when your back's against the wall, you're like, oh wait,
it's community. So to me, that means that we do
have power. The folks, the folks in power know that
these platforms would be nothing without us, and it's time
they acted like it.
Speaker 4 (17:24):
Oh.
Speaker 3 (17:24):
I totally agree.
Speaker 2 (17:25):
I mean, and I love this because it's information that
obviously the people who are running Twitter had; they wanted
to subvert it in some way and go in a
different direction.
Speaker 3 (17:37):
And so when you think about.
Speaker 2 (17:39):
The Internet and how it has become so problematic, I
think you spoke to some of the reasons why, you know,
how do marginalized people, how do women and non-binary
folk really start to take their place there?
Speaker 3 (17:55):
I mean, I love what you said.
Speaker 2 (17:56):
One of the things that I always say to people
when they're talking about Ooh can I get into tech
or should I do it?
Speaker 3 (18:01):
And I was like, wait, you're what's missing.
Speaker 2 (18:04):
So if you come at it from that perspective, you're
going to participate in a different way than if you
think that you are joining something that is, you know,
not for you. You're what's missing, You're what's needed. So
there's sort of like this double sided piece, right, So
marginalized people help build this tech, marginalized people help the
bottom line of tech, but then they're still impacted. The
(18:28):
communities are still impacted. What's the future? How do we
how do we turn this around with all that's going on?
Speaker 1 (18:36):
I mean, you really said it, Brenda. I think
the first step, and it sounds a little hippie,
is an internal shift in your thinking right there.
Speaker 4 (18:46):
I mean, and I feel that too.
Speaker 1 (18:47):
I've worked in tech companies and for the longest time
I would be working nine to five at tech companies
and still saying, oh, I don't work in tech, and
it's like, wait, I do actually work in tech, and
even if I didn't have this job, I would still
have a perspective that is, you know, worth sharing and valuable.
So the first thing is having an internal shift that says,
I belong in these spaces. I belong in these conversations
(19:09):
even if I don't necessarily have a technical background or
technical expertise. I am the expert of my experience and
I use tech every day. Therefore, that perspective is valuable.
These platforms would be nothing without
people like us. And so just like owning that power
and starting the conversation from there, as you said, don't
start the conversation from, like, I'm breaking into the
(19:31):
spot where I kind of don't have a right to be,
or like this is me like breaking into a boys club.
Speaker 4 (19:37):
I think that that perspective.
Speaker 1 (19:39):
Was useful for a while, but I think it's it's
not actually true because it says that the people who
do make it in are outliers, and that when you
do get in there, it
will be normal to be like, oh, there's only a
few black women or a few queer folks, or,
you know, a few people who look like us, right.
And that's actually not correct. If you get into
these spaces and that's how it is, that is
(20:00):
a failure of the people in power, not an
indication of those spaces and whether or not you belong there.
So I think the first is really an internal shift.
And then just like not letting these tech leaders off
the hook, right, And so I think that they're really
counting on particularly people who are not usually centered in
(20:21):
these conversations, just not paying attention to what's happening in tech.
It's one of the reasons why I do my podcast,
because they have an interest in making these conversations inaccessible. So, like,
if you've ever heard some tech reporting or an interview
with a tech leader like Sam Altman does these interviews
where I'm like, I think that you are purposely making
AI sound more complicated than it is because that serves you,
(20:44):
Because it serves you when people are like, oh, I
don't understand, Like I'm just going to tune that out.
They're doing that on purpose to marginalize us. And so
I think sort of turning off whatever inside of you
is like, I'm not going to understand these conversations. You
absolutely will. Sam Altman and Elon Musk are not smarter
than you, Like, you can absolutely understand these conversations. You
can absolutely have a perspective on them, make arguments about them,
(21:06):
advocate for yourself and your communities up against their interests
because they're counting on you being like, oh, I'm not
going to do that. So yeah, absolutely. Okay, so, Bridget,
you are obviously.
Speaker 3 (21:18):
Passionate and I love it.
Speaker 4 (21:21):
I love it. That's one way to call it. I
mean, and I love it.
Speaker 2 (21:25):
I mean, and you know, and I feel like it's
necessary for us to do this work. But can you
tell our listeners, you know,
Speaker 3 (21:33):
Where did your passion come from for this work?
Speaker 4 (21:36):
You know?
Speaker 3 (21:37):
Why do you do it? You started talking about
it before, but tell us more.
Speaker 1 (21:42):
Yeah, I mean, it really just takes me back to
being an eleven year old kid and feeling so out
of place. You know, I had a perfectly fine childhood,
but I lived kind of an isolated life. And I
remember when my dad brought home our first computer. It
was like a boxy monstrosity that took up the entire desk
(22:04):
and he set it up in what would become like
our computer room. Remember how you would have like a
room in your house where the computer was, and like
if you brought a soda in there, my dad would
lose his mind.
Speaker 4 (22:16):
But yeah, it was like he had brought me a
pair of wings.
Speaker 1 (22:19):
It was the first time that I was like, oh
my god, there's people outside of my little community in
the South who have completely different perspectives, completely different values,
who live lives that I've never even thought about. And
so it was the first time that I really started
having the gears turning in my head of like,
wait a minute, I don't have to live in this
(22:40):
town forever. I don't have to, you know, go down
this path that I see replicated all around me, you
know, by the people in my community. I can do whatever
I want. There's all different kinds of people living all
different kinds of lives out there. And so that was
such a huge springboard to me coming into myself and
learning about my own identity, who I am and who
(23:01):
I want to be in the world and my place
in it. And I really really care about that. I
think that's really important. I think that when I think
about the state of the Internet today, I genuinely worry
that the Internet landscape that we are leaving behind
is not as safe or healthy, or robust or free.
Speaker 4 (23:20):
Or open as it was when I was a kid.
Speaker 1 (23:22):
And so I wonder the eleven year old little kid
that is like me, are they able to have the
same level of freedom and expression that I had when
I was a kid. And I truly want to go
back to the time where the Internet did feel like
expression and freedom and not obligation and you know, burden.
Speaker 4 (23:41):
I really, I truly want that.
Speaker 2 (23:43):
And pressure to conform, right, because what you're saying is
now there's all these pressures on young people to consume
and not.
Speaker 1 (23:53):
Create. Exactly that, and I think, like, they're pressures,
but they're corporate pressures, right, And so somebody is making
money off of a young person feeling that pressure to
consume and not create. Somebody is making money off of
young people, you know, getting into really tough mental health
situations because they can't stop comparing themselves to their peers.
(24:16):
Somebody is making money off of that. And so I don't
want to make it seem like the Internet that I
had didn't have its problems, because lord knows, I was probably
doing stuff on it that I shouldn't have been doing,
but it didn't feel like a marketplace for my pain
as a young person, and that's what I'm trying to
get away from. Yes, young people are going to have
negative experiences online. Somebody shouldn't be making money off of that.
(24:36):
That's the dichotomy that I think we really got to
step back and question.
Speaker 2 (24:39):
Wow, that's amazing and such an important point that people
need to think about as they are consuming. Because you know,
I've always been of the mind I was raised to
vote with my dollar, right, and there's different currency that
you vote with, and the attention that we give to
these platforms that are doing these things and to young people,
(25:01):
we need to think about our participation in that as well. So,
such an important point. So let's talk about the Internet
Health Report. You hosted a season of their IRL podcast
on AI. It's such an incredible tool and it's a
wealth of information. So I'd love for you to talk
more about it.
Speaker 3 (25:19):
What is it saying right now? Where are we?
Speaker 4 (25:21):
Yeah?
Speaker 1 (25:21):
So, the Internet Health Report was a project of the
Mozilla Foundation, the makers of Firefox. It's meant to sort
of be a report card grading the health and well
being of our Internet.
Speaker 4 (25:31):
I am sorry to say that our Internet needs some work. You know,
it's not all doom and gloom.
Speaker 1 (25:35):
It's not all bad, but we really got to be
thinking about some of the ways that biases are being
baked into our Internet experience and how the rapid progression
around things like AI is really.
Speaker 4 (25:48):
Making that worse.
Speaker 1 (25:49):
And so that was probably my biggest takeaway from
the Internet Health Report is that we have a lot
of biases in our society.
Speaker 4 (25:56):
Things like AI.
Speaker 1 (25:58):
will almost certainly make those biases worse and further
entrench them because this tech is so powerful. And so
when people are trying to have the conversation about AI
and are like, oh no, just accept that it's
everywhere now, and you know, don't.
Speaker 4 (26:12):
Even ask questions. You got it. If you want to keep
your job, you better learn it.
Speaker 1 (26:15):
It is imperative that we pump the brakes and ask
who benefits, who benefits from
this hype cycle that says we just need to
be quickly accepting it into our lives, and in what
way is it furthering biases, because our society is biased, right?
Like the AI has the same biases and hang ups
(26:36):
that society has, so if the people who are making
that technology have those biases, they're also going to be
replicated and further entrenched.
Speaker 3 (26:42):
That's right. And when I think about the impact on.
Speaker 2 (26:46):
Marginalized populations, even naming like older people or people who
are less literate, or maybe people who are, you know,
new to a language that a particular application is speaking
in, and they need to use it for something for
their legal lifestyle, for their health lifestyle. You know, we've
(27:07):
talked a lot about AI and you know, the false
interpretation of faces, but we need to talk about voices.
Speaker 3 (27:16):
You know, I won't share what my phone is, but
it still doesn't understand.
Speaker 2 (27:20):
Me, right, And there rarely are people also on the
other end to say, oh, let me intervene.
Speaker 3 (27:27):
This isn't working out.
Speaker 4 (27:29):
You know.
Speaker 2 (27:29):
So if your grandmother, my grandmother, is trying to
access a system that can't understand her voice, she's going
to miss out on services. Potentially that could
impact her very life. So I would love for
you know, for you to talk more about, you know,
how do we get engaged and speak out when we
(27:50):
run up against those types of things.
Speaker 1 (27:53):
Well, first of all, I'm really glad that you use
that example. When I say that I make content that
centers traditionally marginalized people, people always jump to, like, race
and gender. That is absolutely part of it, but it's also,
as you said, age, it's also where you live. It's
also you know, are you a native speaker of English
or is it your second language or a third language?
Speaker 2 (28:14):
You know?
Speaker 1 (28:14):
It is also military status. It's also do you live
on the coasts?
Speaker 4 (28:18):
Right?
Speaker 1 (28:19):
Did you go to college? Did your parents go to college?
There are so many different identities beyond just race and gender,
although those are very important, that can marginalize people in
conversations around tech. And AI is a perfect example of how
I think that we have all kind of been a
little bit I won't say misled, but pushed into accepting
(28:41):
what I think might be a hype cycle around just
this technology. I think that AI is going to be
tremendously powerful, but AI is not new, right, and so
I think that we are being told like AI can
do any job, from lawyer stuff to essay writing to everything, right.
And so I think that the reason why that is
(29:02):
being pushed on us right now is to get us
to accept it. And so even if that means, oh,
I'm afraid of AI and the way that it's going
to change the world, that actually just feeds into this
idea that like AI can do x y z job,
and it will do x y z job. And so
I think that when we accept that without a lot
(29:22):
of criticism, the scenario that you just described of your
grandmother, AI technology not being able to understand her
voice or understand what she's saying to connect her to
things that she needs in her life, that's what I
think that we need to be working to prevent, right,
this idea of like full scale, quick adoption of this
technology when we know it has so many biases. And
(29:43):
so for me, it's just like rejecting the hype cycle
around certain technologies, right, noticing when I'm seeing something
everywhere, everywhere, everywhere. Like, if you scroll the iTunes tech
podcast charts, half of them are about AI now;
that wasn't the case two years ago, right? It's like,
where is this all coming from? And so I think
(30:05):
part of it starts with being a little bit critical
about the information that we're receiving about technology, particularly when
it seems like every voice is saying, oh, accept this
as the new way of life without really thinking about
what that means, right, you know the.
Speaker 2 (30:22):
Way that whole those careers have progressed over time. You know,
when I was doing tech, privacy was sacrosanct. I mean,
you worked really hard to make sure that people's privacy
was maintained. That's out the window now. And
there was also this assumption that something was still in
beta test, if not even in alpha test. But there was this
(30:45)
baseline assumption: Okay, I'm not gonna be the first one
in line to get this. Let everybody else buy it,
you know, test it out, get the bugs out of it.
I wonder if if the technologists of today are still
thinking that critically or are they part of this hype
cycle that you're talking about.
Speaker 3 (31:01):
You know that I worry about that.
Speaker 1 (31:03):
That's such a good question. I don't have the answer.
I will say this, I have seen the same thing
that you have in just the last couple of years,
really conversations around privacy that at one point it would
have been unheard of to have a tech company selling
ads about something that you said in a therapy session.
(31:24):
Right like if I told you fifteen years ago that
what you said to your mental health counselor or your
therapist or your doctor, a tech billionaire was going to
be listening to that and be selling you items based
on what came up, and that you would say never.
But now we've all just accepted that that is okay.
(31:44):
And I think, in the last few years, I have
really seen conversations that at one point were sacrosanct,
like you would never think that this would be something
that would be at risk of not being private, like
the most intimate kinds of relationships we have. Now we
have just accepted if there's going to be a tech
company in between you and that the other end of
(32:04):
that relationship essentially surveilling you, and we're being told that
it's a good thing. We're being told like, oh, it's
going to give you a better experience when you scroll
your social media. Don't you want information that is tailored
to your experience? No, I want to go back to
where some things are private. You know, I wonder
where this ends, right,
(32:25)
because I really think that when we accepted conversations between
medical professionals and us as, like, up for the taking,
what's next? Like, when you go
to confession, is that going to
be, you know...?
Speaker 4 (32:41):
Like what's next? Like where does it end?
Speaker 1 (32:43):
In terms of the lack of privacy that we have
being sold back to us as a good thing.
Speaker 2 (32:49):
Right. And that also speaks to choice. I don't
want someone choosing what I see on my feed. There
are so many things that I have serendipitously found in
my life because I'm a nerd and I like
to study and I'm curious, right. You know, in those days
you would go to the library, or you'd go out
on an experience, you'd meet a person. That wasn't scripted
(33:11):
for you. And so the more we sit and it
started really I mean it really was cemented during COVID
because we were like sitting in our houses and so
this became really like this life source and getting all
of our information here, we're talking to people through it.
That's a lack of freedom if it's scripted. So I
think that that whole concept of freedom is up for
(33:33):
grabs in a way that most of us don't recognize.
And I think those of us who have a voice
have a responsibility to talk about it.
Speaker 3 (33:41):
And that's what I love about what you do yeah.
Speaker 4 (33:43):
I mean you put that so well.
Speaker 1 (33:45):
Algorithms give you the illusion of choice, but it's not
really choice. You're not really choosing what you see if
an algorithm is surfacing it for you.
Speaker 3 (33:54):
Yeah, yeah, it's not. It's not.
Speaker 2 (33:56):
And like you said, people have come to think that
that's normal, and I think that's something we might need
to work on.
Speaker 4 (34:03):
You know, you can. I want people to.
Speaker 2 (34:05):
Think what they want, but I want them to understand
where those thoughts are coming from and that, you know,
when their choice has been.
Speaker 3 (34:12):
Taken away. So, you know.
Speaker 2 (34:15):
We've talked about some things that sound kind of negative, right,
and there's there's some challenges and there's work to be done.
Speaker 3 (34:22):
Is there hope?
Speaker 4 (34:23):
There's always hope. I am an optimist.
Speaker 1 (34:26):
I would not be in this work if I didn't
have a lot of hope and optimism and joy about
where we're headed next. I think that the hope is
in that I think that more and more people are
checked in and engaged for these conversations. I think more
and more people are like, wait a minute, you know,
if Facebook and Meta have done all of
(34:47)
these, like, not-so-great things with my privacy, should
I just download a new app from them without thinking
about it? Maybe I should read into it. I think
we're seeing more and more people checked in for these conversations,
ready to hold tech leaders accountable and think about their
role in the world that we want to have, and
I think that's awesome. So, like, I am really hopeful.
(35:11):
The thing that always gives me hope are people. I
really believe in the power of people, the power of
passionate nerds and weirdos like ourselves to create things, and
that'll never go away. You know, nobody can stop that.
Oh that's really positive and hopeful. I hope so too.
I hope so too, because that's.
Speaker 3 (35:29):
Really what we need.
Speaker 2 (35:30):
We need people who are motivated, who want to be
a part of it, to have those barriers removed. And
that's what makes these conversations like we're having today so critical.
You know, speaking of social media, give me your take
on you know, where should we be going now with
social media? You know, you talked about making those choices,
(35:51):
you know, should we get involved with those those companies,
those those applications that maybe have been dubious in the past.
You know, I think it was your podcast that even
talked about the Ring people and their ability to just
share your information with anybody who asks for it, the police,
without your knowledge or consent.
Speaker 3 (36:12):
You know, how do we approach such a.
Speaker 2 (36:16):
Vast array of opportunity to get involved on the social
media in a way that is intelligent and resourceful.
Speaker 4 (36:26):
Yeah, it's an exciting time for platforms.
Speaker 1 (36:29):
I would say, think deeply about what it is you
want from your social media platform. Are you someone who
likes the idea of a social media platform really trying
to feel like a public square where everybody is listening
to what you say, or are you looking for something
more like a campfire with more intimacy. Are you looking
for a platform that you know has moderation as a
(36:54):
a thing that everybody is aware of, like
here's how we moderate, or a platform that is sort
of like building that as they go. I would say,
really spend some time getting clear on what it is
that you're looking for, as opposed to being told
like this is a new platform, we're all meeting here.
Really think about what it is that you want, what
will fit your needs and your values and what you're
(37:15):
looking for, and there's a platform that might suit those needs, right.
Speaker 4 (37:19):
Like, if you want something that's a little bit more intimate.
Speaker 1 (37:21):
Try Somewhere Good, an audio platform that I'm really excited about.
If you want something that's like big broadcast to everybody,
maybe Bluesky is for you.
Speaker 4 (37:29):
Right.
Speaker 1 (37:29):
If you're looking for something a little more niche, maybe
it's Mastodon. I cannot tell you what would be
a good fit, but like you can tell yourself, Like,
you know what you're looking for from your Internet experience,
and that should be dictating your choice, not some tech leader
filling the void with a new platform that you just
sign up for without even really looking at it. So
basically you're saying, let's think about this first, let's think
(37:51):
about it. Exactly.
Speaker 3 (37:53):
I love it. I love it.
Speaker 4 (37:54):
Okay.
Speaker 2 (37:55):
So I'm sure in all of your work in conversation,
you run across companies that you really think, Okay, wow,
this company is doing great things. I really love what
they're doing. You want to share with everybody about this company.
Can you name one company that you'd like to share
with our listeners today that falls into that category?
Speaker 4 (38:18):
Oh?
Speaker 1 (38:18):
Is it corny to say Mozilla? Like, I swear that
this is not like an ad for Mozilla. They have
not paid me. I mean I did host their podcast,
but like so like I was like a Mozilla fangirl
before I ever worked there. I actually, like ten years
ago now, I submitted an application to work there for
a job that I was like super not qualified for.
Didn't get the job, which was a good thing. But yeah,
(38:40):
I just think that tech companies can do things differently.
And I think Mozilla, you know, being a nonprofit making
a platform an Internet browser that is like about privacy.
I think it's a good thing, and I think that
it's a good reminder to other tech companies that you
can have a different model.
Speaker 4 (38:59):
Another one is Signal. You asked for one and I'm
giving you two, is that okay? Yeah.
Speaker 1 (39:04):
Signal is another company that has a nonprofit and for
profit model that makes a tool that everybody uses now
or most people. If you're not using it, you should be.
It makes encrypted communications easy. And I think it really
adds to that culture change of, you know,
thinking that digital security and privacy isn't just for, you know,
(39:24):
activists or people who think of themselves as involved in
sensitive work. Privacy is for everybody. We all deserve privacy,
and so these companies that are finding a way to
make it easy and accessible and fun and more ubiquitous.
Speaker 4 (39:35):
I think are doing great. So that's two for you.
Mozilla and Signal. Oh I love it.
Speaker 2 (39:39):
I love it, and that's great. A great
perspective when you think about even how they are arranged,
the nonprofit and the for profit piece of Mozilla. I
always thought that that was amazing how they thought to
do that.
Speaker 3 (39:56):
Well, okay, I would.
Speaker 2 (39:57):
Love for you to share with with our listeners anything
that you think that they really should hear.
Speaker 3 (40:03):
What didn't we talk about? What would you like for
them to know?
Speaker 2 (40:07):
One of my purposes in life, I believe, is to
bring people the information that they need so that.
Speaker 3 (40:13):
They can get active. They can be activated, they can
get up and do something.
Speaker 2 (40:18):
We've talked about some things that I think are pretty
critical to just the way we live that is impacted
by AI and the internet. What is even just
one thing that you would say, you know, we need
to be thinking about this more, we need.
Speaker 3 (40:35):
To do this. What could we do?
Speaker 4 (40:39):
God, that's a good question, I would say.
Speaker 1 (40:43):
You know, on the podcast, we're doing this series called
Present Future where we're kind of exploring the idea that
the sort of faraway tech future that we talk about
is like happening today and so what's happening presently? And
in that exploration, I have really found that the people who
have kind of been tasked with architecting what our tech
(41:05)
enabled future looks like, if you ask some of those
people what a good and fulfilled and meaningful life looks like,
I think that most people would not agree with their
interpretation of what a good and fulfilled, meaningful, full life
looks like. I think that tech leaders, for them, a
meaningful and fulfilling life looks like more interactions with a screen,
(41:28):
more technology, having a screen strapped to your face while
you're in your apartment.
Speaker 4 (41:32):
Right. They want us to be more plugged in. I
love technology.
Speaker 1 (41:36):
I will be a techie till I die, but that
is not what I consider to be a good and
full life. A good and full life is a life
that is filled with my community, my family, time offline,
time away from screens. Screens make my life better, but
they are not my life. I think that the tech
leaders who are architecting our future believe that screens should
(41:56):
be our lives.
Speaker 4 (41:57):
And so I would.
Speaker 1 (41:58):
Say, as we go forward in this tech enabled present
future of ours, really keeping that in mind, what do
you consider a good life?
Speaker 4 (42:07):
What makes you feel full? Is
it screens?
Speaker 1 (42:11):
Is it more and more technology encroaching on more and
more aspects of your relationships and your lives? And if
the answer is no, which I suspect it is for
most people, really live like it. Really cherish the time
that you're not on a screen. Really cherish the time
with your community, with the people that you love, if
you're able to see them in person. Really cherish that.
Really understand that no tech leader can replace that and
(42:34):
replicate that. Like, really hold tight to the things that
you think make your life good, and don't let a
tech leader tell you that you don't know what those
things are, because you do.
Speaker 2 (42:43):
I know that.
Speaker 3 (42:43):
Sounds like freedom. You're talking about freedom.
Speaker 2 (42:46):
You take back your freedom, take back, take it away,
take it back, you know, be different.
Speaker 3 (42:52):
And I feel like, you know, maybe this is a
little biased on my part. I feel like those of us.
Speaker 2 (42:57):
Who knew what it was before and see what it
is now have that comparison to be able.
Speaker 3 (43:03):
To do that.
Speaker 2 (43:04):
Do you feel that same way for people who are
were more or less born into the matrix?
Speaker 1 (43:09):
I guess you would call it. Yeah. Well, I don't know. I
would have said so earlier. I would have said no,
like young people. I mean not to sound like my mom,
but like young people love their dang phones. But like
I was actually reading about how younger folks are I
think just like getting a little fed up with this
like totally tech enabled present that we have. I was reading.
(43:33):
I want to say it was Pew, but don't quote
me on that. Someone did a study on whether or
not different generations would like to return to a time
before smartphones and tech, and most people would. The only
generation that didn't want to return to like pre smartphone
era were boomers. But even younger folks, like Gen Z,
they didn't even get to have an
(43:55)
era without those things, and they're interested in what that
would be like.
Speaker 4 (43:58):
And so I read.
Speaker 1 (43:59):
Profiles of young people who don't have smartphones and love it.
I think that we are really going to see a
shift where people are turning back to the things that
make them feel good and turning away from things that
make them feel bad, even though tech leaders are trying
to convince us that they make us feel good, but
we know they don't.
Speaker 2 (44:16):
Well, I'm encouraged by that. That's very encouraging to me.
I think that's great because that means that they understand
that they have a choice. And so again we want
to activate people to get engaged. I want people to
leave this podcast and go and do something. What would
that be.
Speaker 4 (44:33):
Be choosy.
Speaker 1 (44:34):
It's your life, it's your experience, it's your ecosystem. Be
a little choosy about what you let into it. And
don't let these tech leaders tell you you need anything, because,
if anything, they need you.
Speaker 3 (44:43):
Oh a man, I love that part.
Speaker 4 (44:45):
They need you.
Speaker 2 (44:46):
They need everybody listening to this podcast to be successful,
to have a product in the first place.
Speaker 3 (44:54):
So I feel like that is an opportunity.
Speaker 2 (44:56):
For us to take back some power and with that
turn the tide. I want to thank you for being
part of this podcast.
Speaker 4 (45:05):
Oh my gosh, well this was a dream come true.
Speaker 1 (45:07):
You're such a good interviewer. I could talk to you
all day. Thank you for sharing this platform with me.
Just so, the three things I think that everybody listening
should take away from this conversation are one, shift your thinking.
The voice that tells you that you don't belong in
tech or that you don't have a perspective that is valuable.
Shift your thinking because you use technology every day, whether
(45:29):
you are an engineer or went to Harvard or whatever
or not, so shift the thinking around your role in tech.
You belong there. Two, really get clear on what it
is you want. Right you are the expert of your experience.
It is up to you to say what your digital
ecosystem looks like and what technology you want to have
around you and your relationship to it. So really get
clear on what that looks like for you. And lastly,
(45:51):
don't let anyone talk you out of what you know
and what you feel and what you experience. Elon Musk
is not smarter than you just because he's a billionaire.
I like that, and I'm going to use that. You
are the expert of your experience if I can adopt that.
Speaker 3 (46:04):
Please, please do.
Speaker 4 (46:05):
I'm sure I stole it from somebody.
Speaker 3 (46:07):
Okay, thank you for giving us your time.
Speaker 4 (46:10):
It's so fun.
Speaker 1 (46:11):
Thank you all so much. I really appreciate the time.
Speaker 2 (46:15):
Huge thanks to Bridget Todd for joining us today.
Now, if today's talk resonated with you or it piqued your curiosity,
then hit that follow button. New episodes of Be the
Way Forward drop every two weeks, and if you're hungry
for more reasons to find hope and ways to bring
change, check out anitab dot org.
Speaker 3 (46:35):
We're working hard, day in and day out to.
Speaker 2 (46:38):
Create a vibrant, inclusive, and diverse tech world. Remember the
future of tech, it's in our hands. Let's shape it together.
Until next time, keep pushing boundaries and being the way forward.
Speaker 4 (46:51):
See you soon.