
May 30, 2023 63 mins

Paris Marx, host of the hit podcast Tech Won’t Save Us, says criticism of technology, and of the people and culture who build it, is imperative to securing a better tech future.

 

Listen to Tech Won’t Save Us: https://techwontsave.us/

 

SUPPORT THE SHOW BY SUBSCRIBING TO OUR PATREON! PATREON.COM/TANGOTI

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
If you want to support There Are No Girls on
the Internet, please check out our Patreon. There you can
get ad-free bonus content. Just go to patreon dot
com slash tangoti, and thanks so much.

Speaker 2 (00:13):
We have these people who are very socially isolated and
have difficulty relating to other people, then trying to design,
or trying to dream up, what society is going to
look like for everybody else.

Speaker 1 (00:31):
There Are No Girls on the Internet is a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd, and this
is There Are No Girls on the Internet. Back in
twenty ten, technology felt exciting. It was when we first
got the old-school, non-Facebook-affiliated Instagram app. Remember

(00:54):
the Sierra filter?

Speaker 3 (00:56):
Oh, I loved it.

Speaker 1 (00:58):
We were talking about commercial space. The very first iPad debuted.
The vibe seemed to be that Silicon Valley geniuses were
bestowing gifts on us in the form of new technology,
and our job was to be in awe of these
gifts and to say thank you. So there was a
lot of optimism around technology, and it was easy to
think about it as being an unquestionable force for good.

(01:20):
And maybe that's why when you look back at that time,
there weren't a lot of critical questions about technology and
the people who make it and whether or not that
technology was actually contributing to a better future, one that
we'd actually want. That's a dynamic that Paris Marx says
has led to tech leaders getting really used to being
able to do whatever they want, and Paris is trying

(01:40):
to course correct.

Speaker 2 (01:41):
I'm Paris Marx, I host the Tech Won't Save Us podcast.

Speaker 1 (01:44):
Tech Won't Save Us starts from the premise that we
all need to be questioning technology and the Silicon Valley
tech billionaires about what kind of futures they're actually contributing to.
And when it comes to conversations around things like AI
and its potential to change the world, Paris says we're
at risk of kind of mirroring the same dynamic that
happened in the twenty tens, letting the people who build

(02:07):
the technology set the terms.

Speaker 2 (02:10):
Into the early twenty tens and stuff like that, Like
it was very positive around tech and what tech was doing,
and like all these startups that were forming and how
these big companies were like growing, and this was going
to be the way that you know, we were going
to have prosperity again. Because this tech industry was going
to like drive a ton of job creation and blah

(02:31):
blah blah, and so there was a real desire not
to kind of look at the potential consequences, to not
look at the downsides of this industry and these business models.
And now, you know,
we have seen a more critical view in the past
number of years, especially since twenty eighteen, and I feel
like part of that is not just kind of like

(02:52):
you know, waking up to what is actually happening, but
also reckoning with the consequences of not you know, looking
at the potential downsides of the companies through those years,
through the early twenty tens period, you know. And now
we're kind of trying to catch up and trying to
be like, whoa, what have we unleashed unto the world.
And now there's a real kind of desire to you know,

(03:12):
not just criticize the Elon Musks and Jeff Bezoses
of the world, but to say, like, we really bought
into a lot of this stuff that Silicon Valley was
selling us around how all we need is kind of
new innovative technologies created by Google and Facebook and whatever,
and that's going to make the world a better place,
and we can see that that didn't work out, and
so now we need to actually think a lot more
critically about what these tech companies are telling us, but

(03:34):
also how we address these really serious problems that do
exist in society and aren't going to be solved from
a shiny new technology.

Speaker 1 (03:42):
What do you think some of the consequences of not
asking those questions about technology during that time
have been?

Speaker 2 (03:50):
Oh, there are a ton, right. Like, the easy ones
to point to, that the mainstream would know about,
are like, you know, the social media platforms and like
all they've done, and like we love to kind of
hate Facebook, and I'm totally on board for hating Facebook, right, Like,
don't get me wrong. But I think that it's much
bigger than that, right. Like, sure, we had the Cambridge
Analytica leaks and the scandal around that, and like,

(04:12):
you know, I think that that was kind of misconstrued
a bit as saying, like, it was because Facebook
manipulated the populations of the UK and the US
and all this stuff that we had this really terrible
turn in our politics. And I don't think that that's
actually accurate. I think it's a misreading of what actually happened.
And like, I think that there are just things happening

(04:33):
in our politics for reasons that are not technological, you know,
that are kind of affecting what people are doing, which
is really unfortunate. But I think that like much more fundamentally,
and I think the piece that we tend to ignore
a lot more is what this actually meant for workers
as well. Right, Like, obviously, we had this massive boom
in tech workers and people working in the tech industry

(04:55):
that were making good salaries and like doing these startups
and all this kind of stuff, and the media was
very interested in that story, right, in how there were
these workers who were making all this money, they had
great benefits when they went to their jobs, like free
lunch and gyms and I don't know all the other
stuff that tech workers get. But then, the tech
industry was also very clearly involved in affecting

(05:20):
and attacking like the rights of workers in many different industries,
not just in the United States or in other Western countries,
but like around the world as well, Right, And I
feel like we have not really kind of grappled with
that as much through the gig economy and of course
the Ubers of the world and things like that, you know,
explicitly attacking the regulations on the taxi industry and ensuring

(05:42):
that this profession gets turned into this kind of unregulated
contractor model where these people are like completely subjected to
the rules of like Uber and the rules of these
gig companies. But then also like much more than that,
you know, you think about Amazon rolling out algorithmic
management across its warehouses and turning warehouse work from like

(06:04):
a unionized industry where people could make good money to
like basically your equivalent of a minimum wage job, or
just a little bit above minimum wage, right? That's kind
of like the standard if you're in a community now, right?
Working at the Amazon warehouse is the new kind of
Walmart, and this really changes things. Then, like, one of the things that
I think is pernicious here, actually, is that, through

(06:27):
the mid twenty tens, we had this narrative that automation
and AI were going to wipe out all these jobs, right,
and we're kind of seeing repeats of that right now
with the whole generative AI boom. But in that moment,
we had all these narratives around it and we were like,
oh my god, the robots are going to take our jobs.
But actually what happens is the tech companies like deploy
all these technologies to like decimate the labor rights of
people across the economy, and we just kind of ignore

(06:49):
that picture of it. But I think that's much more
consequential and I think we should be paying more attention
to it.

Speaker 1 (06:54):
You have a great piece about this called Tech Giants
Are Building a Dystopia of Desperate Workers and Social Isolation,
where you write tech companies like Amazon and Uber are
creating a society divided between the served and their servants,
where the friction of in person interaction is eliminated. That
friction is the stuff of social connection. A world without
it is nightmarish. And I've talked about this a lot

(07:16):
on the podcast, and it's not something that I feel
great about talking about. Which is that during the pandemic,
which was so hard for all of us, I definitely
got, I mean, I've used

Speaker 3 (07:27):
The word addiction.

Speaker 1 (07:28):
I got addicted to buying quick crap on Amazon that
I didn't need to give myself a quick serotonin boost
because I was miserable, just like everybody else.

Speaker 3 (07:38):
I got addicted to not.

Speaker 1 (07:40):
Cooking, and ordering Uber Eats. And when I had to
take a step back and look at what I was
contributing to, personally, me, Bridget, it was really hard. It
was hard to see not only that I had been
so easily conditioned into not thinking about the person who
was bringing me my Uber Eats, the person who was
leaving whatever dumb thing I just

(08:01):
bought on Amazon during a pandemic at my apartment, but
also that I'd been conditioned to think, when I want food, I
certainly don't have to, you know, cook it myself. I
don't even have to call my local pizza place down the
street and have a conversation with the person who owns it.

Speaker 3 (08:18):
That's I don't have to do that.

Speaker 1 (08:19):
And I wonder, how did technology trick us into thinking
that all of these things that are part of the
fabric of life in a society are things you don't need,
and in fact, your life would be better without them? Like,
I enjoy cooking dinner. Why is it that I now
am like, oh, like, Uber Eats is better, when this
used to be something that, like, brought
me a little bit of joy and calm in my life?

Speaker 2 (08:39):
Know, absolutely, And you know, I think that there are
so many people who are like in that position, and
you know, like I think that on one hand, there
is kind of like the individual responsibility piece of it,
of like should we be using this and should we
be kind of contributing to this, But then on the
other hand, you also have to think about how like
these are ultimately like much larger kind of structural problems,

(09:01):
and whether we you know, participate in them or not,
it's not really going to necessarily change like the bigger picture,
like if one of us opts out, and you know,
personally I do still opt out, Like I don't use Uber,
I don't use any of the food delivery services anything
like that, very rarely use Amazon unless I have to.
But you know, I know that ultimately, like that's not

(09:24):
going to take down Jeff Bezos and Amazon, right? But
yeah, I think, to your bigger point, you know,
what you say is very true, right? The tech companies
have essentially tried to convince us, and in many cases
have successfully convinced us, that like things
that are like regular, average aspects of life are actually

(09:46):
inconvenient and we need to like get rid of them, right,
And the goal of that, from their perspective, is that
if they can kind of insert themselves into more
of our interactions, more of the
transactions that we make in our lives, that is
better for them and their business models, right? If
Amazon or Uber can like stick itself in the

(10:06):
middle of more of the transactions that we do, if
they ensure that like instead of going to the shop,
instead of talking to someone, that we use one of
their apps, then that is great for their kind of
business model because they get to have like a piece
of that transaction, right, And so this is like the
whole incentive behind it. And I remember, like all the
way back in I think it was twenty fourteen, maybe

(10:27):
it was twenty fifteen, Lauren Smiley, you know, who writes
for Wired now, but she wrote this piece called The
Shut-In Economy, where she was like outside of
Oakland for a while, and then she moved back in,
or maybe she moved into San Francisco, it was,
you know, somewhere in the Bay Area, and
she saw that, like, there are all these people now

(10:47):
that all of a sudden were like, you know, home
a lot or at work a lot, and they were
getting a lot of things delivered and like contracting a
lot of services, because the gig economy was just kind
of booming in that period, right? And she
was like, what is going on here, Like this is
really weird, this is not really good. You can see
the distinction between the served and the servants basically very
clearly in this model. And sure, it allows a few

(11:08):
more people to become the served, but that means you
need a whole bigger kind of population of servants to
serve them effectively. And what we saw during the pandemic
was really like an expansion of this model where many
more people became the people who were kind of like
relying on gig services and e commerce websites and all
this stuff to get everything delivered to them so they

(11:29):
could stay home and kind of isolate. But then you
also needed this whole population that for a while we
were calling essential workers, you know, to deliver all this
stuff and do all this work for the people who
were staying home and working from home and all this stuff,
and like we've just kind of like moved on from
that moment and not really thought about the kind of
larger implications of that. And like, I do think that

(11:52):
there were consequences and there were some shifts that happened
during the pandemic with those whole labor models. But I
think that part of what we see right now with
the whole big generative AI push is again, like,
this kind of return of this narrative that oh
my god, the AIs are going to take away
all the jobs, when actually, like, what we're much more
likely to see is like the continuation of tech companies

(12:13):
like kind of using technology against workers to further kind
of you know, create more precarious employment and things like that.
And I think that is like what a lot of
the media narratives don't suggest, but I think that's much
more likely to be what comes out of it.
And if we're not paying attention to it now, then
it's going to be much more difficult to like stop
that happening because we're too focused on like you know,

(12:34):
AGI and digital minds and all this, like, sci-fi bullshit. Oh my.

Speaker 1 (12:38):
God. Like, I actually heard this on your podcast,
I forget who the guest was, but I do think
that there's this Overton window thing happening. Like,
there was a time where, if I told you that,
in between you and your mental health care professional would
be a skeezy tech company, you would be like, what

(12:59):
the fuck? Of course. And now that is commonplace, with
services like BetterHelp, right? And so I think that you're
so right that tech leaders want to be the wedge
between us and, you know, the business of being a human,
and we're going to become more and more okay with
the different things that they're doing. Like, I don't want
Jeff Bezos in the middle of my healthcare, right? I

(13:21):
use One Medical, and that is exactly what the fuck
is happening right now, right? Like, I'm not really comfortable
with these very intimate things about my body and
my health being subject to a Jeff Bezos interference. And I
think that, sort of without us really paying attention and
really thinking about it, in more and more intimate spaces it
has become commonplace to think that a tech leader would

(13:42):
be in the middle of your ability.

Speaker 3 (13:43):
To like talk to your healthcare professional or whatever. Does
that make sense?

Speaker 2 (13:48):
Yeah. And I think that you really clearly see
that during the pandemic, right? Because, you know, lockdown orders happen,
there's an expectation that we lock down, or
at least that we physically distance from one another,
we don't see as many people, so we're spending more
time at home, we're using our devices more, we're watching
more things on streaming services, as you say, we're using
more of these services to get food and get other

(14:10):
things delivered to us. And what we see over
the course of the first year or two of
the pandemic is that the profits and the revenues
of all these companies absolutely soar, right? And so it
shows you that the more that we kind of use
their services, the more that we look at our screens,
the better that is for them. And they're incentivized to
ensure that we spend as much time as possible, like, engaging

(14:32):
with their services kind of in front of our screens,
all these sorts of things. And I would say, like
that's exactly what the metaverse push was, right, like, how
can we get you in front of a screen much
more like the smart glasses all that stuff. The idea
is just like you need to look at screens more,
and it's like I don't actually want to do that.
But I think your point about healthcare is interesting
as well, because I've been thinking about it a lot
lately because I'm in Canada, right, and so our healthcare

(14:55):
system is a little bit different, and I have like
even bigger concerns, where, like, you know, at least in
the States, the healthcare system is already private,
and so, to a certain degree, you're shifting from
like one private provider to another. But like, I think
it's really kind of worrying to see the potential like
encroachment and the further encroachment of like private companies on
public healthcare systems, and especially like how technology and digital

(15:19):
technology provides a means for that to happen, because the
argument is like, oh, this is like new technology. It
can't be done by the public sector, so you need
to contract it from the private sector and bring it
into your public healthcare system. And then like you know,
we could say that that's not going to make any difference,
but really, like it's kind of what I call like
a sly privatization that's happening there, and it does start

(15:40):
to change the logics of like what is actually happening
in the system, And it's not just a Canadian problem,
Like you know, we see these things happening in the
UK where there's a big push to privatize, and other
countries where I've talked to people, So yeah, you know
a bit of a different perspective there. But those are
things that I worry about too, even just beyond, like,
the general encroachment into healthcare that these companies are already doing.

Speaker 4 (16:00):
Let's take a quick break. And we're back.

Speaker 1 (16:18):
What do you picture when you think of having a
full and fulfilled life. It probably looks something like engaging
with your friends and family and having fulfilling life experiences.
But if you ask some of the most influential tech leaders,
the people trying to shape what our futures will look like,
their vision of a full and fulfilled life might look
very different from whatever it is you're picturing. It might

(16:41):
look like more time spent on screens and more technology
encroaching on more and more aspects of our lives, from
the food we eat to how we get our medical care.

Speaker 4 (16:52):
All those times you went late at night and ate
things you shouldn't have eaten.

Speaker 2 (16:56):
That was a secret moment you had with your refrigerator.

Speaker 1 (17:01):
That's popular podcaster, engineer, and AI researcher Lex Fridman, who
has described wanting the ability to share late night meaningful
moments with his smart refrigerator. But what if I want
a future where I'm sharing meaningful moments with my partner
or my best friend, not a robot screen in my refrigerator.
What if the future that they want and the future

(17:23):
that I want for myself are two very different things.
I don't necessarily think that it's good to be encouraged
to spend more of my time in front of a
screen, looking at a screen. Like, I want to be
in meatspace. I want to be having actual
connections with humans IRL. Obviously, it's a place of privilege,
like, not everybody can do that easily. I'm, like,

(17:45):
very lucky that that is my experience in the world.
But sometimes when I would probably have said that most
people who have the ability to get out into the world,
they want that they see that as a good thing,
as a part of life. That I hear people like
Lex Friedman or something elon Musk talk, and I don't
think we're all on.

Speaker 3 (18:02):
The same page.

Speaker 1 (18:03):
I think that some of these people who are hugely
influential in technology, for them, how can I put this,
the idea of spending more and more time in front
of screens, with technology,
sort of immersing yourself in technology as a human, is
good, and thus they are leading us into

(18:25):
this future where that is more and more commonplace. And,
I think partly because of the way that the tech
press sometimes covers these people, we don't stop and
think, like, well, is this the person
don't stop and think, like, well, is this the person
that we want designing the future? This person who has
like describing a very weird relationship with their robot, vacuum
or whatever, that there's not a relationship that I want

(18:45):
to have. But this is the person who is put
in charge of designing the future of how I will
relate with technology, do you know what I mean?

Speaker 2 (18:52):
Absolutely, absolutely. And, like, I think there's so
many things to explore there, basically. But I think,
on a larger level, I would say that I
wonder to what degree, like, the views
and perspectives of these people at the top of the
tech industry are actually kind of like you know, reflective

(19:12):
of wider desires among like the average public, and I
would say it's actually quite limited. Like I think that
people are definitely like excited by the idea of humans
doing more in space, and people generally like the idea
of like electric cars, and sure there's some conservatives who
don't like that, but like you know, in general, like
I think these broad ideas are things that people find appealing.

(19:32):
But I think that if you dig into like the
more kind of niche, the more kind of specific viewpoints
of these influential tech leaders, I think that you'll quickly
find that if people knew more about them, they would
think that it was really weird and like really separate
from what most people think like a good life actually
looks like. And I think, like, you know, maybe this

(19:54):
is a bit stereotypical, but I think a bit of
that is related to how a lot of these people
who are successful in this particular
industry do seem to be socially stunted in many ways.
If you look at the Zuckerbergs, the Elon Musks, and,
you know, Lex Fridman, who of course you mentioned, a
lot of these people do seem to have difficulty with
social relationships, and I'm not going to, like, diagnose

(20:17):
them on this podcast, but I think that also shapes
the way that they think society should be set up
and like what the future should look like. And so
we have these people who are like very kind of
socially isolated and have difficulty relating to other people then
kind of trying to design or trying to dream up,
like what society is going to look like for everybody else,

(20:39):
and that naturally shapes how they think that that society
should look and how we should all interact with one
another in the future, when most of us don't have
that issue and would like to spend more time kind
of getting to know other people. And of course we
know right now that you know, there's plenty of talk
about it, there's kind of like a loneliness crisis in

(21:00):
you know, I would say, the Western world, probably, broadly,
and I think that that is linked, one, to
our greater reliance on technology. But I would go
back much further and say that it's also a result
of the way that we've designed our communities to be like,
you know, very suburban in many cases, so you're really
distanced from a lot of the people that you might

(21:20):
care about, and it becomes more difficult to actually reach
them because of transportation all these sorts of things. So like,
I think that's rooted in like, you know, much more
kind of like physical, geographical problems. And then the tech
industry, because this is how it works, responds to
that not by saying, maybe we should rethink the way
that our communities are set up so that we can
encourage, you know, people to live closer together, or for

(21:42):
us to fund public social spaces where people can
come together and, you know, don't need to
pay to go somewhere to do something. Instead of proposing
those things that might actually address the issue, they say,
you know, what we actually need to make people more social,
or make people interact with one another more, is actually
a new technology that will enable them to do that,

(22:05):
whether it's like Facebook and how that is kind
of imagined to connect the world, or like the metaverse,
because now we don't need to go spend time with
people in person. We can just put on a VR
headset and we can put on these other technologies on
our hands and on our bodies that allow us to
feel a physical presence in a virtual environment. And, like,
great for all the tech companies because they're constantly tracking us,

(22:26):
they're getting all the data on us, they're showing us
ads all the time. We're in this virtual world that
they completely control. But I think it's like a terrible
idea of what a future would look like if we
actually like moved in that direction instead of just saying, hey,
maybe we should make it easier for you to see
the people that you care about and get away from
the technology that these companies want you to be looking
at all the time.

Speaker 1 (22:45):
And I think there is a fundamental fuckery happening between
us as regular people and the people who make technology
and make money from us regular people. I think there
is a fundamental fucked up relationship that we really need
to question, like and I think part of it is
that tech leaders have been able to convince us, the

(23:06):
regular people, quote unquote, that we're not smart enough to
understand it. These people are geniuses. They went to Harvard,
honey. Like, they know how to code. You'll never even be able
to figure it out. So this technology that is already
impacting so much of your life, shaping your day to
day, that is, like, enmeshed with your daily experiences,

(23:26):
You don't even need to ask questions about it, right?
Like, you don't even get to. Who are you to

Speaker 3 (23:31):
question

Speaker 1 (23:32):
These tech leaders who are so smart, you don't even
understand how. What do you think needs to be done?
Really, I believe we need to shift that relationship,
like, massively.

Speaker 3 (23:40):
Do you think that's possible?

Speaker 1 (23:41):
Because none of this technology would exist without us,
yet you wouldn't know that from the way that, I
feel like, tech leaders continue to expand upon this, like,
fucked up relationship with quote regular people.

Speaker 2 (23:54):
Yeah, absolutely. It's like they are kind of, you
know, gifted with this superior intelligence,
and they're looking down at us and saying, we're gifting
you with this knowledge, you know, just use it properly, right?
And I think that you also see that reflected in
like the discourse around how government understands technology and whether
government can regulate technology, And every single time that there's

(24:16):
like a new technology or we're discussing tech policy or
something like that, like one of the narratives that we
constantly hear is, you know, the political leaders don't
get it, they're too old, like, whatever it is, so
they can't respond to this, they can't do anything about it.
And inherent within that narrative is, you know, that we
just need to trust the tech leaders to regulate themselves,

(24:38):
to, like, be ethical. If we just put a bit
more pressure on them, then they'll be better, right? You know,
don't be evil, the old Google slogan, when we know
that they're very evil.

Speaker 3 (24:47):
Actually killing technology.

Speaker 2 (24:50):
Yeah, like that, don't talk about that, you.

Speaker 3 (24:53):
Know, we don't talk about the death tech.

Speaker 2 (24:56):
Yeah. Yeah, they're just making nice search engines that talk
to us now. It's very nice, it's very beautiful. I
think, ultimately, shifting our perspective on these things is essential, right?
And I think that we're in this moment where, you know,
as I was saying earlier, during the early part
of the twenty tens, the mid twenty tens, like, you know,

(25:16):
we as a general public were kind of, like,
infatuated with the tech industry, right?
they could do no wrong. All the reporting was like
these great companies who are doing so many great things
for us. Then near the end of the twenty tens
and kind of through the early twenty twenties, we have
this shift where all of a sudden, you know, we're
recognizing that maybe we shouldn't have been so positive, so
kind of uncritical of what they were doing, and there's

(25:38):
a broader kind of recognition that these companies do need
more kind of, you know, critical analysis, critical insight;
we need to actually be looking at what
they're doing, right? And I think that that's important because
I think that, on one hand, like it does kind
of bring the public along, and it does kind of
tell the public, like, you know, maybe all that we

(25:59):
were saying before wasn't all that it was cracked up
to be, and actually these tech companies deserve more scrutiny.
But I think what you also see in that moment
is that the people in
charge of the tech industry were used to being treated
like these kind of godlike figures, like, these kind of
figures who were, you know, giving us all this intelligence,

(26:19):
who we kind of worshiped, you know, like if you
think of like how Steve Jobs used to be seen
and he was handing us down our iPods and our
things like that, right? Like, he was worshiped, right? It
was basically like a religion. And
I find it interesting: if you look at the
auditorium that Apple has created on their Apple campus now,
to me, if you look at it, it

(26:39):
has the vibes of a church, kind of, with, like,
pews and all this sort of thing. This is how
I feel when I look at it, right. But anyway,
this is not the point. But if you
look at what happened as the press became more critical,
and as the public became more critical, and as the
government became more critical of the tech industry, all of
a sudden, you have these tech leaders who are used

(27:00):
to being worshiped start to shift how they respond, right?
And all of a sudden, they are feeling victimized.
You know, they are kind of all-powerful, they're billionaires,
they're some of the richest people in the world, but
they are treating themselves like victims because, all of a sudden,
we're not worshiping everything that they do and just thinking that,
you know, everything that they do is positive.

Speaker 1 (27:20):
Back in twenty twenty, Silicon Valley billionaire and venture capitalist Marc Andreessen published an essay called It's Time to Build,
encouraging Silicon Valley tech leaders to be more involved in
building and shaping America. We've already seen more and more
tech leaders getting involved in politics and government. But when
these people who have the power to exert such control

(27:40):
over our futures embrace ideologies that are harmful, from longtermism to pronatalism, an ideology embraced by Elon Musk
that posits that all the really wealthy tech people should
be having lots and lots of kids to pass down
their superior genes and repopulate the earth with their super babies.
It's kind of hard to imagine that it'll be a

(28:02):
future that's better for all of us.

Speaker 2 (28:04):
We've seen that kind of slowly become clearer in recent years,
where you have people like Marc Andreessen kind of pushing back on this, you know, releasing in twenty twenty that It's Time to Build essay where he's really saying like,
Silicon Valley needs to be more forceful and needs to
be involved in like much more of society, kind of
you know, ensuring that they are shaping society in the

(28:26):
way that they think it should operate. But then you
also see, like you know, people like David Sacks getting
more involved in politics, you know, donating more. Obviously you
have Peter Thiel, but he's been at it much longer
than that. He has a kind of a larger project.
And obviously more recently you see Elon Musk becoming much
more open about his politics. But this is indicative of
a broader shift that we're seeing in the Valley away

(28:49):
from kind of, you know, these people being seen as
like liberal Democrats or whatever, toward the recognition that you know,
these are powerful capitalists at the head of an industry,
and now they are being challenged and their position is being challenged, and they want to
ensure that everyone understands that they actually deserve the position
that they're in. They didn't just get it because they

(29:10):
happened to be in the right place at the right
time when this massive industry was taking off and there
was a dot com boom, and their startup was one of the ones that rode the wave, and that kind of allowed them to like luck in
and use their already privileged positions to then kind of
take off to this new level. Right, And so I
think that is why we see, especially in the past

(29:30):
few years, them embracing these particular ideologies that are all
around kind of ensuring that the public believes that they
deserve the position that they're in, that they deserve the
wealth that they're in. And you know, when I say that,
I mean things like effective altruism, which I say is
basically the notion that like, you know, there's nothing wrong with philanthropy, there's nothing

(29:52):
wrong with rich people. And how we actually solve our
problems is not by challenging rich people and taxing them more,
but ensuring they spend their billions of dollars more effectively
to address problems, or like long termism that says, you know, actually, yeah,
sure we face these problems right now in society, but
you know, we need to be thinking on a much
longer time horizon. And you regular people, the you know,

(30:14):
the mes and yous and the listeners of this podcast in the world, we are too kind of caught up, thinking too much about the day to day and how we're going to pay our bills and all these sorts of things. So we can't think
on these time horizons. So we need these wealthy people
like the Elon Musks of the world and the Jeff
bezos Is of the world to do that work for us,
So they need to be up there and thinking about

(30:35):
space colonization and all this bullshit, right. And then of
course the other piece of this, I would say, is
like the pronatalism that we're seeing as well, which is like,
you know, really kind of bringing the eugenics thinking back
into this eugenics thinking that has always been around in
tech circles and in Silicon Valley. You know, Stanford University
was kind of a hotbed of eugenics thinking and kind

(30:57):
of, you know, reviving this, putting a fresh gloss on it, so that there is kind of a new revival and a new justification for thinking about, you know, genetics, for thinking about genes. Like, Elon Musk really thinks that because
he is incredibly wealthy, that means as well that he's

(31:17):
incredibly smart, right, and that the smarts that he has
need to be passed on to future generations, to his kids,
because then, you know, if not, the world kind of
risks having all these low IQ people going around like
it's just wild thinking. But it shows you how, you know, in many different ways, they are invested in ensuring that

(31:38):
they keep their power, that they maintain their position, and
that we regular people are not challenging them, and kind
of the wealth and the power that they've accumulated over
the past few decades.

Speaker 1 (31:49):
It's terrifying. And also, I mean, it's completely anti-science, right? Like, it doesn't work that way. If you are wealthy and, like, you think of yourself as a smart person, you're not necessarily going to pass that down genetically to your kids, so you better have a million kids. It's completely anti-science.

Speaker 3 (32:09):
That's what's funny to me.

Speaker 1 (32:10):
It's such a fallacy because it's both like, I am so brilliant and, you know, so good at business, blah blah blah, so I'm going to utilize this completely nonsensical, anti-science way of spreading that.

Speaker 3 (32:25):
It's like, well, whoa, whoa. They both can't be true.

Speaker 1 (32:27):
Either you're really smart or you're like falling for something that completely flies in the face

Speaker 3 (32:33):
Of how genetics work at a very basic level.

Speaker 2 (32:37):
Oh no, absolutely, And like you very clearly see it
reflected in like everything that they're talking about right. And
I do think it's funny because like, on the one hand,
they present themselves as these like brilliant engineers who like
understand everything, and like, you know, the whole thing about
these Silicon Valley founders is that, like you know, going
back to like the PayPal mafia days and stuff like that,
is like they really believed that they had this kind

(33:00):
of inherent knowledge, this knowledge that allowed them not just
to understand technology, but to move into any industry that
they thought was ripe for kind of technological disruption and
disrupt it basically. Right, you know, Elon Musk can go
into banking and cars and space and Twitter and whatever,
and because he is just the smartest person in the world,

(33:23):
he will know exactly what needs to happen for all
of these sectors to be transformed and made better by
the tech industry. And I think what we've actually seen,
like if we actually look at the impact of the
tech industry is that in some cases, yes, they have
made things more efficient if they have kind of organized
things properly. Like if you look at, as much as I hate Amazon, like its supply chain and logistics system

(33:44):
does seem to be like, you know, quite an impressive
feat that it has put together. Right, there's a whole
load of exploitation that it's built on, and like all
that kind of stuff, Like let's not forget that, but
you know, it has kind of effectively put this system
together that like I would love to see transferred over
to the post office or something like that.

Speaker 1 (34:02):
It's amazing what you could accomplish when you're like grinding the bones of workers. Exactly, exactly.

Speaker 2 (34:07):
But then on the other hand, is like you'll have
a company like Uber, which promotes itself as like this
efficient transportation company because it's using all this technology to
like design the routes and the patterns and whatever of
how all these vehicles go. But you know, if you
look at listen to people who actually understand transportation, like
Hubert Horn, a transportation consultant who's been kind of writing

(34:29):
critically about Uber for years, you understand that actually their
model is like less efficient than the model that existed
before because they have lost like all of these you know,
ways of organizing the business that are actually more efficient
in that they don't have a fleet of cars. Everyone
has to own their own car and do their own
maintenance and buy their own insurance, and that's much less

(34:49):
efficient than what existed before. And because you have this
kind of global company that has these massive, expensive headquarters
and these like well paid executives and these high paid
tech workers and all this kind of stuff like working
on all these things, actually the cost of running that
business is higher than like a regular taxi company that
just like had a small office in a city and

(35:11):
like a few dispatchers and stuff like that, and like
you know, they kind of managed it and worked it out,
so, you know, and I think that there are
actually many other examples like that, where like we think
that because of the narrative around the tech industry that
oh my god, they've revolutionized all these things, they've made
everything better, But actually they've just put like the gloss
of digital tech over something that already existed and tell

(35:32):
us that it's better. And we kind of believe it
because this is the narrative that we have about technology.
But that isn't necessarily the case.

Speaker 1 (35:39):
Yeah, And I'm so glad that folks like you exist
who are out there really trying to shift that
narrative and like pump the brakes and say wait a minute,
like is this actually better? Is this actually more efficient
or are we just so used to these confident men,
you know, telling us that it's better and us just
sort of believing it. Like, I think a big part
of it is also just good old-fashioned hubris that

(36:01):
if I'm really good at building rockets, of course I
would be good at figuring out, you know, content moderation,
which is this incredibly complex thing that like people debate
all the time about how to do it right. I built a rocket, right? Like, I'm a pretty good podcaster.
That doesn't mean that I would be a good president
or that I could solve, you know, like.

Speaker 3 (36:23):
Homelessness or something like that. It's incredible, it's hubris.

Speaker 1 (36:27):
But it's also so, like, narrow minded that you
think that because you maybe are smart in one sphere,
that that is always going to translate over because of
something particular and innate to who you are.

Speaker 3 (36:39):
It's so... I just really hate it.

Speaker 2 (36:42):
Yeah, oh yeah, And I think it tells you a
ton about like who these people are and how they
think about themselves, that they do think that just because
they're successful in one arena, that they can naturally kind
of move that success into many others. Just because they
are like, you know, they've been graced with like an inherent intelligence. And you know, the tech industry, for whatever reason, like, can just do everything and whatever,

(37:05):
and you know, they also kind of like apply this
kind of software mindset to everything, to the physical world, even when it doesn't work out. And so, you know,
I think that one of the things that's been refreshing
over the past few years is to see kind of
a greater awareness around you know, these people and their
flaws and how they're not kind of like messiahs and

(37:26):
great people who are like making the world better, but
actually like you know, exploiters who you know, are making
massive profits off of a ton of workers, workers who
they're trying to like hide behind, you know, the kind
of veil of technology, but actually you know, they are
creating these systems that are built on exploitation. And like

(37:47):
generative AI of course is another version of this. You know,
even though it's presented as just being like all the technology and computers like doing all this magical stuff,
there's always kind of workers below it that like
we like to ignore. And so I think one of
the things that's been positive is seeing people wake up
a bit more, and certainly not everyone has, certainly, like

(38:07):
you know, I would like more people to have done so,
But I still think it's positive to see us moving
in this direction. To see that like as soon as
new technologies are announced and kind of unveiled and rolled
out now that you know, very quickly, there's a discussion
around you know, does this make sense? Should we be
regulating it? Like what are we doing about this? Like
we see whitchat GBT, or we saw about the metaverse,

(38:29):
like people just quickly ridiculing it as soon as Mark
Zuckerberg rolled it out, And like, I think that's a
really positive like change, because I wonder if we would
have seen that like a decade or so ago. And also,
like you know, how Crypto was. Sure you had a
lot of people who were like really invested in it
and really believe that crypto is the future and whatever,
but you also had this really strong community who was saying, like,

(38:50):
hold up, this does not make any sense. This is
just spammy, exploitative Ponzi schemes, and we can't allow this
to go anywhere. And I think that we've seen like
the, you know, the steam kind of has come out of that bubble, right, like it has collapsed. Sure, they're not all gone, but I think that these things
are like positive and they give me some hope that
like we're at least sort of going in the right direction,

(39:13):
and we just need to keep kind of you know,
keeping up the good fight to ensure that as you know,
the tech industry keeps kind of pushing these new waves
of hype that you know, there are critics who are
around who are looking at what's happening and saying, like,
hold up just a minute. What these people are saying, like, makes absolutely no sense, and we need to
be looking not at like the pr and the marketing

(39:35):
lines that these companies and you know, the executives like
Sam Altman want us to be thinking about and looking
at and talking about, but to actually look at
the real impacts, to like be informed by history and
how these things have worked out in the past, because
the tech industry loves to ignore history and to actually
try to like get some lessons about what might happen

(39:56):
so that we can respond like proactively, instead of being
a few years down the line and realizing, like, man,
we've let this go a bit too far. It might
be impossible to roll it back at this point, right,
So yeah, I take some hope from those things.

Speaker 3 (40:12):
More.

Speaker 1 (40:12):
After a quick break, let's get right back into it.
When Zuckerberg started hyping up the metaverse, everybody pretty much
just made fun of him until he dropped it. And

(40:33):
before the crypto crash, it was crypto critics who wouldn't
stop calling it out as a scam. And now we
need to be ready to call out AI hype, especially
when it's hype that just ends up hurting workers. I
am almost embarrassed to admit this, but I am very confused.

Speaker 3 (40:52):
About it. I get asked about this a lot.

Speaker 1 (40:54):
I'm always like, oh, I don't know the role that
generative AI will play in our future. And I think
part of it is I think that I shouldn't feel
too bad because I think this is by design. There
is so much just PR speak that I feel like
I don't have a good sense of what is actually
going on. And I listened to the episode that you
did with Julia Black and she made such a good
point that when she asked Sam Altman, like straight up, okay,

(41:16):
so, like, imagine who you think of as like a typical American. How will this change, how will AI change her life in ten years? And that he
not only didn't have an answer, that it was like
he had not thought about the question. And I was like,
oh God, that's not good. And so I think that
we hear so much PR speak. We hear so much

(41:36):
like AI is gonna change everything, It's gonna take all
of our jobs, and then people saying like, no, that's
what that's what people who make money off of AI
want you to be thinking about and want you to
be repeating and want you to be, like, making that happen. Like, what are your thoughts when it comes to generative AI and how it is going to actually impact the lives of

(41:57):
someone like, you know, a mother of three who, you know, makes a middle income? What do you think? How is it going to impact our lives?

Speaker 2 (42:07):
Yeah, so I would say I'm definitely skeptical of yeah,
of like a lot of the narratives that we're hearing, right,
because yeah, and again like that's just informed by seeing
how things have worked out in the past, right, by
seeing that this is an industry that really depends on

(42:29):
constantly kicking up new hype cycles. In order to keep
investment flowing, in order to keep people making money, in
order to keep interest on kind of their industry and
their products. And what you very clearly saw was that,
you know, after the crypto collapse and after you know,
the kind of ridiculing of the metaverse, and after the

(42:51):
raising of interest rates in particular, the tech industry really
needed something, right. It really needed you know, a next
big thing to kind of show to everybody to make
sure we were all talking about, to make sure it
stayed relevant, and again to keep money flowing, right, to
keep money moving through the system. And so generative AI
has emerged as that thing. And so I look back

(43:15):
at the mid twenty tens, as we were talking about,
when there were all of these narratives around how robots
and AI were going to take all of our jobs, right,
and millions of people were going to be out of work,
Like there were studies saying like forty fifty percent of
people were going to lose their jobs in the coming years,
all this kind of stuff. There were going to be
no more truckers because self driving cars were going to
replace them all. There were going to be no more

(43:36):
taxi drivers or Uber drivers because self driving cars were going to be everywhere in the next few years, right,
still waiting for those self driving cars, you know, I
know there are a few on the streets in San
Francisco and stuff, but like they're really not a mass
transportation system. And what we saw during the pandemic was
actually we needed some more truck drivers because the whole
kind of supply chain was a mess. And part of

(43:59):
the problem there is because truck drivers have had their
working conditions and their wages so pushed down over recent
decades that people don't want to join the profession. And
so it's not because like robots are getting rid of them.
It's because the working conditions suck, and we actually
need more of them. But anyway, that's kind of you know,
getting away from the point. But so I saw all

(44:21):
of that, and I saw like all of the narratives
around that moment and how there was a lot of
excitement and a lot of belief that like robots were
going to replace baristas and they were going to replace
elder care workers and all this was just a few
years away and never happened. Right, What we actually had
was technology being deployed in you know, the ways that
we've been talking about, where it ensured that workers were

(44:42):
reclassified from employee to contractor, where it ensured that their pay was lessened, where it ensured that they were more precarious, where it ensured that there was algorithmic management and
more surveillance of them. These were the actual kind of
legacies of that period, not destroying jobs and all
that kind of stuff, and not making things a whole
lot more productive. We didn't actually see a whole lot

(45:04):
of investment in automation through that decade. The economic historian Aaron Benanav found that when he went and actually looked through the data for that period, what
we actually saw was, you know, just all of these
kind of surveillance tech and all this kind of stuff
used against workers. And so when I see what's going
on right now, I see the hype cycle I was
talking about. I see kind of the return of some

(45:26):
of these narratives from the mid twenty tens. You know,
OpenAI released a paper talking about all the jobs that were at risk because of ChatGPT and all this kind of stuff. But that's not
to say that there aren't real threats that are being
posed by this technology, where again, it can further disempower labor,
it can further ensure that they're more precarious, it can

(45:47):
further push down wages. We're seeing, like you know, the
front end of that. The most obvious piece of that
is like the writers and the screen actors who are
undergoing contract negotiations right now. The writers are even on
strike and one of the things that they're concerned about
is how AI can be used in their industries to
kind of reshape their performances, to change the words to

(46:08):
kind of generate you know, scripts or generate writing that
they would have to rewrite, and it would mean that
there would be less work to do for them, and
it would also mean that the quality would decline. Like
I think that those are more of the risks of
generative AI if we're looking at the potential impacts, and
I would say that right now, these technologies are also

(46:29):
being sold to us in such a way where like
they're going to become ubiquitous in our lives. We're going
to be using generative AI and chatbots all the time.
They're going to be like rolled out into everything. This
is just the new way of the world. I think
a lot of people are being too quick to like
accept that as inevitable when I don't think it is.

(46:49):
Like think about the voice assistants that were rolled out
in the past, like six or eight years. How many
people actually use those regularly anymore? I certainly don't, And
I read a story recently that suggests actually a lot
of people don't. This is not just anecdotal evidence. So
you know, I think that there's a lot of things
that the tech industry rolls out that we think are

(47:10):
like the next big paradigm, that we think are going
to revolutionize everything, and it doesn't actually play out as planned.
And I think the final point I would make on
this is that when we look at generative AI, what
we actually see is that it increases, like the cost structure,
it increases the cost of doing these things. So if
you're switching over a search engine from the way it
works right now to like generative AI and chatbots, the

(47:32):
cost per search goes up, you know, quite significantly, actually,
because it uses more computer power, because it's trained on
these massive models, because it relies on all this centralized
kind of computation right in these massive data centers that
these companies own and I think that that's another piece
of this that maybe there's not enough talk about, because
certainly you have the Sam Altmans of the world going

(47:52):
out and saying this is going to empower every individual,
and we're going to have chatbots that are like personalized
for you and all this kind of stuff. Actually, what
this technology does is, you know, we've been talking about
like anti trust and breaking up companies and the issues
with like monopolization in the tech industry for a few
years now. What this actually does is like kind of
cement that power, because you can't really have generative AI

(48:14):
without scraping all of this data from the web, without
having this centralized kind of computing infrastructure that only a
small number of companies globally can actually control. And so
if we do move in this direction, we're kind of
building in the power of these large companies in a
way that we were trying to challenge a few years ago.
And that suggests to me that they're also kind of

(48:36):
responding to that fact. You know, there was a story
just a few months ago about how the tech companies
had quite successfully kind of challenged and kind of
blunted the push for anti trust, and now that the
Republicans have taken over one of the houses in the
States that, sorry, I can't remember which, the Senate or the House or whatever. Anyway, they took over one of

(48:56):
those and so now like it's pretty much going to
be impossible to get any like anti trust legislation through
for the next number of years, I would say. So
the tech industry is very concerned about these things as well,
very kind of alert to this. So, yeah, those are
all my thoughts on generative AI.

Speaker 3 (49:13):
That's really helpful.

Speaker 1 (49:15):
And I think it does, like, you know, I'm on TikTok a lot, and I do think that it really is reflective of, like, it's a little bit of a hustle.

Speaker 3 (49:26):
And the people who are selling you the hustle have money.

Speaker 1 (49:29):
There's money to be made, right, And so I see
people who are like, oh AI is going to change everything.
You're going to need to know how to integrate AI
into your work, So better sign up for my course,
and I'll tell you. Like, it's, yeah.

Speaker 3 (49:41):
Just, it just feels like a little bit of a... little bit.

Speaker 2 (49:44):
Of a hustle, and that's the way it's always been right,
Like every single time there's a new shift like that.
Like I remember when it was popular, like,
when Kindle and like self publishing and all that stuff
was taking off on Amazon. There were a ton of courses.
There were a ton of people trying to sell you, like,
you know, the way that you were going to be
a successful self published author and all this stuff. And

(50:06):
like you know, there's all these courses around SEO, search engine optimization, to ensure that your website
is going to like get to the top of the
search results, and like this is something that happens every
single time, and I think it's very indicative of this
happening again that, like, you know, beyond the kind of PR lines from the Sam Altmans

(50:28):
and like the real AI boosters that you see the
people saying that like you know, it's a massive threat
to humanity, using these kind of longtermist arguments
against it, saying that you know, they're on the cusp
of digital minds. All this stuff is complete bullshit by
the way, you know. But the other like big narrative
that you see on Twitter and TikTok and like all

(50:49):
these other social media platforms is like the hustle bros
and the hustle people like going really hard, like the
LinkedIn folks as well, like this is all
the stuff you need to know about AI, this is
how it's going to improve your workflow, help you write emails, like all this blah blah blah. Yeah, it's a load of bullshit.

Speaker 1 (51:04):
They have taken over Twitter. Like, it's just so fucking boring, and like I almost miss, like, I don't know, it's just such empty, boring garbage that you can't even like dunk on it.

Speaker 3 (51:18):
You can't even like all it is.

Speaker 1 (51:20):
It just like clogs up your feed, where your feed is already like worse than it was five years ago anyway, you know, it's just, it's so.

Speaker 2 (51:26):
Boring, totally. And the whole website is like going downhill, and it sucks because, like, you know, there was such a community there. Like it was fun to be on Twitter, but like I certainly use it
less than I used to. I'm not like off it
all together, but especially when you see it like slowly

(51:46):
moving in this kind of direction, that Elon Musk is
taking it where like I think he's explicitly making decisions
to kind of reshape it as more of like a
right wing platform, right? Like, Parler failed, Gab failed, Truth Social's a joke. Like, you know, they still want
to control, like, you know, the kind of way
that we communicate with one another. They still want to

(52:07):
control the media. That's long been part of the right's, you know, kind of desire, what they want to do.
You know, they have Fox News, but they also kind
of strongly influence like the liberal media quote unquote as
well to kind of pull the narratives in that direction.
So obviously they want to do that on social media
as well, and have been effectively doing so for many years.

(52:28):
We can go back to the reporting on Facebook and
like how Zuckerberg was like worried about kind of pissing
off the conservatives or making it look like he was
taking actions that would go against them, even as their
pages were like getting the most kind of views and
growth and all that stuff. And so now you see
like Tucker Carlson starting a Twitter show, like how does
that make any sense? We see DeSantis, Like as we're talking,

(52:48):
DeSantis is going to like announce his presidential bid this evening,
you know, and obviously all these kind of right wing
folks just get so much additional traction, or they don't
seem to think so, I guess, but you know, they
get promoted a lot on Twitter now because of how
Musk has changed things around and let all the Nazis
back on and all that stuff, like, yeah, it's just

(53:09):
a, it's a real hell site. Like we used to joke it was a hell site, but like now it really is one.

Speaker 1 (53:14):
Yeah, I mean, something I've noticed is that people are
starting to come around on the fact that Musk has
been parroting pretty extremist, far right talking points and it's
not a new thing.

Speaker 3 (53:25):
He's been doing it for kind of a while.

Speaker 1 (53:28):
Why do you think that is something like even people
that I respect in tech press didn't really engage with that, honestly,
I feel or like weren't able to get it. There
was always this vibe of like, maybe it's just like
an edgy thing that he's doing, or it's a like
I once read someone saying like, oh, he's trying to
save the environment by making the electric cars that he

(53:51):
makes look more appealing to people on the right to
help solve climate injustice. Like, what a take. Why do
you think it was so hard for people to just
like take him at his word and like listen to
the things that he says and believe them when he
says them.

Speaker 2 (54:07):
Yeah, Like what you're describing is like how far of
a stretch some people go to, to like justify exactly
what he's doing and what he's saying.

Speaker 3 (54:15):
Right.

Speaker 2 (54:16):
I think that the key thing here is that for
such a long time, Elon Musk was like, you know,
he was kind of the tech god. He was the
guy who was kind of he was the figure who
was kind of held up from the tech industry, especially
after the death of jobs, that was like doing all

(54:37):
of this wonderful stuff for us. He was investing in
the rockets and was going to take us to Mars,
and he was building the electric car and was going
to save us from climate change and like all this
kind of stuff, right, And the media did hold him
up as this figure who could really do no wrong
and who we really had to like worship as not
just like this incredible tech entrepreneur, but as the builder

(54:59):
of the future, right, Like he had a vision that
nobody else had that you couldn't get from the political
system or anything like that. Like he was the future
and he was someone that we had to trust. And
I think that, even though that has started to shift in recent years, first
of all, it took a while for the media to

(55:19):
really wake up. I think like even as you did
have them publishing more kind of critical stories on Elon Musk,
they would still publish the puff pieces as well. Right,
you know, even, what was it, last year, he was
still named Time's Person of the Year or whatever. Like,
even as he was increasingly being engulfed in controversy, there
were a ton of stories about the racism and sexism
in his factories. You know, there was the story I

(55:42):
believe about him like sexually assaulting the flight attendant. I
believe that was just before that happened. Like there was
all this stuff that was kind of coming out about
Elon Musk, and he still got this title, right? And so I think that, like, unfortunately, there's a lot of people who really believe in the cult of Elon Musk, and who are not just kind of personally invested, but also, let's be real, they've bought

(56:05):
a Tesla or they bought Tesla stock and they're like
financially invested in Elon Musk. Because one of the things
that is like somewhat unique about Tesla is there's a
lot of quote unquote retail investors, people that are not
institutional investors, who own Tesla stock, which is a bit
less common, right? It's because Elon Musk has all these
fans and they've bought in and they kind of believe

(56:26):
what he's doing and where he's going, and they want
to follow him. Of course, some people have woken up.
But then the other piece of this is there were
a lot of you know, kind of people who reported
on Elon Musk who got to know Elon Musk because
they're access journalists and I won't name names, but maybe
you can think about who I'm talking about for a
number of years who really helped him to kind of

(56:49):
build up the reputation that he had and wanted to
ensure that they maintain that connection to him. And I
think also, some of it is in the way that they presented Elon Musk to the public and didn't want to recognize that not only is he kind of a bad person who is, you know, ensuring

(57:10):
that transphobic talking points, that kind of white nationalist talking points, like all of this kind of really terrible political stuff is being furthered, and that he's actively sharing it and promoting it and believing it.
But also that how they presented him for so
long was wrong and was not an accurate reflection

(57:33):
of who he was and what he was doing and
his impact on the world because they created this narrative
around him. They created this character of Elon Musk, and he was, you know, this valuable property that the media had created and kind of sold to the world. It was great for him; it allowed him to get investment.
It was also great for them because any time they
publish anything on Elon Musk, it gets a ton of views. Right,

(57:57):
people love him, people hate him, whatever people want to
read about him, And so that was great for the
media as well. And that's part of the reason that
they kind of created this narrative around him, that they created the character of Elon Musk. But if you go back to the very early days, when he was starting Zip2, when he was starting X dot com,
you can see the stories about how he is a
terrible manager who is an asshole to the people who

(58:19):
work for him, right? This is not unknown; this is
very clear at that time. And that continues as he
starts other businesses as well. And like we see the
stories time and again through his history of like who
he actually is. But I think that some of these
people who worked in the media, who kind of were
incentivized to promote him in this way, ensured that that

(58:39):
aspect of his personality and who he was was always downplayed,
was always kind of hidden from the public, was not
part of the persona that was sold to us. And
so now there's kind of a reckoning, not just with
who he is, because some of these people will say, oh,
he's just changed in the past like six months. No,
he's always been this person and now he's just being
more open about it. And so there's a reckoning not just

(59:00):
with that, but also like with what we've been told
about him for a long time, and some people really
don't want to engage with that and don't want to
believe that he was always who he is now and
is just being more open about it, and want to
pretend that this is kind of a break from what
they knew before or again is some kind of like
you know, three dimensional chess sort of a thing that's

(59:21):
going on where he's actually trying to get conservatives on
board with climate policy. Like it's just total bullshit, right,
But yeah, so that's kind of how I think about that.

Speaker 1 (59:29):
So I always end all of my interviews by asking: when it comes to the future, are you hopeful? Like, is there anything that brings you hope?

Speaker 2 (59:38):
I would say yes, And I think it's difficult to
be hopeful sometimes, like seeing the state of the world,
seeing the climate getting worse, seeing like the continued influence
of the tech industry and like the worst people in
that industry. But I feel like if I didn't have hope,
like I would just be like totally depressed and stuff.
So even despite the fact that things look bad, I

(59:58):
feel like I have to be hopeful. And that's not
to say that like there aren't things that do give
me hope. As I was saying, like, I feel
hopeful seeing kind of people push back against the tech
industry and seeing like the broader realization that we have
around these people in the industry, around like what they
are doing, and the recognition that like you know, they're
not who they have been sold to us to be.

(01:00:22):
But then like on top of that, it's also quite
inspiring to see, especially in the past number of years,
a lot more organizing, not just in the tech industry,
but like in the society more broadly, people kind of
pushing back against you know, actions of the tech industry,
people unionizing, like all these kinds of things I think

(01:00:42):
are really hopeful because they suggest that, you know, maybe
we're moving in a direction where people are getting really
pissed off with the way that things are and are
trying to you know, change those things, are trying to
ensure that we don't keep moving in that direction. And
of course, you know, I guess the flip side of that
is that we also see that organizing happening on the

(01:01:03):
other side among the fascists and the white nationalists who
are trying to make the world worse. But I think
that it's almost like inevitable that in the stage of
capitalism that we're in, we're just going to see this
kind of polarization. And you know, the goal is to
ensure that we have as much energy and power as
we can build, like you know, on the left and

(01:01:23):
among people who actually care about like regular people and
not just like, you know, all this kind of fascist,
racist garbage that we're increasingly seeing to kind of build
a better world. So I feel like that's a bit mixed, but I would say, on the whole,
I am hopeful that you know, even though things are

(01:01:43):
going bad in some areas, I think that there's a
reason to believe that people are willing to fight for
something better, and you know, hopefully that continues.

Speaker 1 (01:01:52):
Well, your work is a beacon of light in what sometimes feels like an ocean of garbage. Where can folks listen to your podcast and all the cool
stuff that you're doing.

Speaker 2 (01:02:04):
Yeah, thanks so much for inviting me. It was really
great to chat. I would say, you know, the podcast is Tech Won't Save Us. It's on all the podcast platforms; wherever you listen, it should be there. I also have a newsletter called Disconnect, which is over on Substack.

Speaker 4 (01:02:19):
You know.

Speaker 2 (01:02:20):
You know people have mixed opinions on that platform. I
certainly do as well, but if you want to read it,
it's over there. You know, you can follow me on Twitter or Mastodon or Bluesky. You know, I post a bit on those platforms still. Yeah, I would say,
those are the key things.

Speaker 1 (01:02:36):
This has been amazing. Is there anything that I did
not ask that you want to make sure gets included?

Speaker 2 (01:02:42):
I don't think so. This has been great.

Speaker 1 (01:02:48):
Got a story about an interesting thing in tech, or
just want to say hi? You can reach us at hello at tangoti dot com. You can also find transcripts for today's episode at tangoti dot com. There Are No Girls on the Internet was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative. Jonathan Strickland
is our executive producer. Tari Harrison is our producer and
sound engineer. Michael Almato is our contributing producer. I'm your host,

(01:03:10):
Bridget Todd. If you want to help us grow, rate
and review.

Speaker 4 (01:03:13):
Us on Apple Podcasts.

Speaker 1 (01:03:15):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.