
June 13, 2025 73 mins

This week, Bridget is joined by very special guest Ed Zitron, host of the Better Offline podcast, to break down the tech stories you might have missed — from disappearing tech bros to billion-dollar AI fails.

What Side Are the ‘All-In Pod’ Bros On? https://gizmodo.com/what-side-are-the-all-in-pod-bros-on-2000613146

Senators Demand Meta Answer For AI Chatbots Posing as Licensed Therapists: https://www.404media.co/senators-letter-demand-meta-answer-for-ai-chatbots-posing-as-licensed-therapists/

When billion-dollar AIs break down over puzzles a child can do, it’s time to rethink the hype: https://www.theguardian.com/commentisfree/2025/jun/10/billion-dollar-ai-puzzle-break-down

X’s Sales Pitch: Give Us Your Ad Business or We’ll Sue: https://www.wsj.com/business/media/x-twitter-ad-revenue-campaign-lawsuit-a882b5c6?gaa_at=eafs&gaa_n=ASWzDAimcjU5eU89AMytxSvmFp6PmZhcNpON86osFJupKMQnziyRXOKarL8dMFshDFo%3D&gaa_ts=684c8c2f&gaa_sig=E3KGTOGP-dVC0729K2mM6iwMRj0nLkS5829g4j-aRhDBSGRZxOZmd9Q5yCDlJgb05XH2MlQQUGjrWvXsAf2Y3Q%3D%3D

FanDuel bans bettor over heckling incident with Olympic champion sprinter Gabby Thomas: https://www.mankatofreepress.com/ap/sports/fanduel-bans-bettor-over-heckling-incident-with-olympic-champion-sprinter-gabby-thomas/article_b258ffa7-34e0-5f2c-966f-eceb36d2ef1a.html

🎧 Listen to Better Offline: betteroffline.com
📰 Subscribe to Ed's newsletter "Where’s Your Ed At?": wheresyoured.at

📱 Follow Bridget + TANGOTI:
IG: @BridgetMarieInDC
TikTok: @BridgetMarieInDC
YouTube: There Are No Girls on the Internet

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
There Are No Girls on the Internet, as a production
of iHeartRadio and Unbossed Creative. I'm Bridget Todd and this
is There Are No Girls on the Internet. Welcome to
another edition of our weekly news roundup where we get
into all the stories from the Internet that y'all might
have missed so you don't have to. And I am

(00:26):
so thrilled to be joined by honestly one of my
favorite podcasters out there, Ed Zitron of the Better Offline podcast,
where you've just started a three-part series on business idiots.
Tell me about this series.

Speaker 2 (00:39):
Well, I've noticed throughout my whole life that there have
been these people throughout the rungs of power who don't
seem to do jobs, and on top of that, don't
seem to understand jobs. And it's all come to a head.
It's taken a few years. It started with remote work,
it went to crypto, the metaverse, AI, all these people
with all of this money who don't seem to know

(00:59):
a single fucking thing about anything. And then I went
and looked back and I kind of went, oh, it
traces directly to neoliberal thinking. Milton Friedman, Thatcher, Reagan and
all that good stuff. So it's a three part saga.
It's the longest series I've done in the show. People
seem to really like it. I wasn't sure how they'd
take it, and I just built a studio out in
my place in Vegas and I recorded it there. So

(01:21):
it's a bit different to what people are used to, because before
this point I had this sound chamber thingy where I
just stood in it and I just yelled for an hour.
This is much more chilled out, me speaking almost like
sitting by a campfire, or a dumpster fire
in this case.

Speaker 1 (01:38):
That's something that I've always liked about your podcast, is
that you can often hear the passion in what
you're saying, so this will be an interesting change. I
used to record from a soundproofed little box
in my apartment too.

Speaker 2 (01:55):
Yeah, yeah, yeah, that's exactly it.

Speaker 1 (01:56):
You can really let loose in those little boxes.

Speaker 2 (01:58):
But I quite like the sitting. I'm just as pissy.
It's not like I'm any
less angry, but I'm a lot more comfortable. I think
that's what it is. I don't have to stand up,
which I actually liked for diction reasons, like for projection,
but I can handle it sitting down. It's just
more relaxed and I really like it, and

(02:19):
people seem really happy with it, which is good because
usually they just send me pictures of knives.

Speaker 1 (02:27):
Okay, I have to ask you about this piece in
your newsletter, which is terrific, Where's Your Ed At? You
write so beautifully about what the Internet and technology have
meant for you personally and how you really would not
be the person that you are today if not for
technology and the Internet. And that's kind of one of
the guiding principles of our show too, is that the
Internet has really been at the forefront of who I am,

(02:50):
how my younger self sort of learned how to show
up in the world and become a self-actualized person.
But do we still have an Internet landscape where that
kind of self-discovery and self-actualization is genuinely possible?

Speaker 2 (03:03):
Yes, it's just that there are land mines everywhere now.
Before, and I think we're similar ages,
it was a little more open and a little more disconnected. It was
more randomized and it didn't feel like there
were so many warring incentives other than those of just

(03:24):
individual users or admins on forums. Now you have, every
step of the way, a different incentive, or multiple ones
just bashing into each other or trying to have sex
on your computer. Like, you'll go on a website on
your phone and your phone will be one hundred and
fifty degrees within two seconds because there's twenty-one different
ad trackers. But then, even with, I feel like, algorithms,

(03:44):
mentioning algorithms is almost cliché at this point because it's
so obvious. But even then, you've got people on forums
who are actively trying to scam you. And the larger
problem is, ironically, that while there are all these warring incentives,
the actual people who run the Internet have never
had any responsibility for it. They've made all the money
off of it. But it's not like Google or Facebook,

(04:06):
and indeed both have kind of peeled back layers of
protecting people from spam and scams. It's not like they've
ever thought, we should make sure this is good. They
did for a minute, I think they did for like
the first ten years of Google, maybe the first five,
six years of Facebook, but then they went, whoa, whoa, whoa,
this isn't going to grow forever. We need to fuck
with it, we need to make it worse. But I
do think that that self-discovery is still there. YouTube,

(04:27):
even though it's a complete nightmare with the algorithm, is incredible.
It really is still incredible, it fucking is. I wish anyone
other than Google owned it. Lockheed Martin, perhaps,
a more ethical company than Google. Yeah, at least
we know what they're doing. I think YouTube still
has this incredible educational side, and you have all of

(04:48):
these incredible niches on there. You have guys making covers
of songs in the styles of other bands. You have
people, I saw someone making like a wooden ice cream
on there the other day. Like, there's still this magic there.
The magic, in the end, isn't dead. And I think
that the problem is that to get there you
have to go through, like, barbed wire, barbed wire
you don't realize is sharp until it's cutting your throat.

Speaker 1 (05:11):
The thing that I love so much about this vision
of the Internet that you lay out is that you're
able to make these business decisions that tech leaders have
made hyper-personal for people. And I don't know, it
was almost like a rallying cry of, like, yeah,
never forget how these fucks ruined, or tried to
ruin, the Internet to make money. Like, never forget

(05:31):
that we had something that was kind of good and
people decided what if we fucked with it to make money?

Speaker 2 (05:37):
And that's the thing. It's these people, their power has
come from their relative anonymity. Well, we know who Satya
Nadella and Sam Altman and Sundar Pichai and
all of these people are, we know who they are. They
have mostly just got glossy profiles, and
they're mostly unknown, when in fact I think they're tantamount
to war criminals. We know who Mark Zuckerberg is,

(05:58):
But I don't think people realize the bad thing Mark Zuckerberg
did was not just his outright sexism or stealing
the website from the riverboat twins, the Winklevosses. It's
not just the people that
he's led to getting killed. It's not all of that.
It's the fact that at its core, Facebook is a
tool that's used to hurt people. Now, it is used

(06:18):
to manipulate and twist them. It does not have to
be that way, it literally doesn't, because Mark Zuckerberg has complete
control of the company. He could tomorrow turn off all monetization
on Facebook. The stock would crash, but there's nothing anyone
could do. They couldn't fire him. He controls the whole company.
He chooses this. And I think what people don't realize
is they kind of ambiently know these people suck. They

(06:40):
don't know what they're doing and why they're doing it
and the intentions behind it. And I think that educating
them on that is important because the more that that happens,
the more that people realize this is deliberate, the less
likely they are to get such a free ride in
the press. And they only care about the press. The
press is such a powerful thing with them because they've
got all the money in the world. Now they just

(07:00):
have their names, they just have their legacy.
So fuck that legacy. I'll piss on it.

Speaker 1 (07:05):
Piss on it, you do. You do it on your podcast beautifully, eloquently. Okay, well,
because I'm talking to another tech podcaster, I kind of
have to start with my dessert a little bit, because
a podcast that we piss on a lot on this
podcast is the All In Podcast. And I mean, your
face just there really says it.

Speaker 2 (07:24):
All In. What a show.

Speaker 1 (07:28):
Okay, so last week, obviously, we talked about Musk and
Trump's big breakup fight. Update there, which is that Musk
did tweet that he regretted some of what he said
about Trump. Everybody was talking about this. But the All
In Podcast, which you might say is hosted by a
group of business idiots, you know, it's one of the
biggest tech podcasts in the world. It's like four Silicon

(07:48):
Valley investor types, one of whom, David Sacks, is Trump's,
like, AI and crypto czar. They were notably silent, and
all their listeners were like, what's up? Like why are
they not putting out an episode on this? Like where's
the episode?

Speaker 2 (08:00):
I do love that though, I do love that their
listeners to this day continue to win the Fell for
It Again award. Yeah, it's like, I don't get it.
Why are these guys not being honest with me, as
they've done, like, hundreds of craven, obviously very much
corrupt things. But they are business idiots. By the way,
David Sacks is famous for selling fucking Yammer, an

(08:23):
internal messaging thing, to Microsoft. Chamath is actually evil. Chamath is one
of the original people who built the growth team at
Facebook under Sheryl Sandberg, with Naomi Gleit, Javier Olivan I
think his name is, possibly Lars Backstrom as
well as part of that, all of the original shitheels
that made Facebook how bad it is today. Well,

(08:45):
thanks to Chamath. But yeah, what are those freaks
up to now? They're just not talking about the
Elon breakup because they don't know which daddy is
going to kiss them that week. Like, I don't really
know what they're doing.

Speaker 1 (08:57):
No, And I have to say, it made me really
appreciate what you do, what I do, that we
have actual press and podcasts that get to talk about
what's happening, get to be actually critical of technology. And
I guess it just made me realize how much the tech
press fails. It fails its audience, it fails its listeners.

Speaker 2 (09:19):
And that's the thing though, I think the
All In podcast is genuinely harmful. I think they're bad
for society. I think they've done horrible things. But I think
they're tiny in comparison to the harm that Kevin
Roose and Casey Newton do with Hard Fork. Tell me more.
I think Hard Fork gives people
permission to support the powerful. Casey Newton's boyfriend works at Anthropic.

(09:40):
They've had Dario Amodei, or Wario as I
call him, the CEO of Anthropic, on there three times.
They've had multiple different Anthropic people on, and they disclose that Casey's
boyfriend works there. The fuck is going on there? Every time.
I'm bringing this up with everyone because it's insane, It's
totally disgusting, and we have The New York Times pretending
like this is somehow okay. They are in the pocket

(10:04):
of the AI companies. Roose himself is
one of the more noxious and genuinely bad-for-the-world
people. He did a whole thing on a
company that claimed they were excited to replace jobs. They
were just going to replace humans in jobs. But when
you looked at the articles, it turned out they'd never
built anything, never built a single fucking thing. They were like, oh,

(10:25):
We're gonna do a working environment to train models, and
Kevin Roose is like, goddamn, I'm so scared, I'm so scared.
He did a piece about AGI. I said this on
Bluesky: it's like one hundred millionaires or
billionaires decided they were going to track down and capture
Santa Claus. AGI is impossible. It does not exist

(10:46):
today. We do not even have the beginning of
an understanding of the intelligence of human beings. So, yeah, All
In sucks. All In's awful, noxious, terrible, poisonous. But Hard Fork
is just as damaging, and quite frankly, Nilay Patel over
at The Verge, fucking speaking with Brian Chesky and the Task
Rabbit CEO, unbelievable. Sundar Pichai goes on there and he

(11:06):
gives him one-hundred-and-fifty-word questions. No, it's
not just All In. It's really convenient, actually, I'm not
saying you're doing this, to just say All In is
the problem. No, the problem is that we have
people... like, I run a PR firm. I don't do
anything with anything related to my clients on the podcast, ever.
With the newsletter, same deal, just to keep the firewall.
We have some of the biggest tech podcasts which

(11:29):
are kissing up to Brian Chesky, they are kissing up
to every AI founder. They are literally co-hosted by
someone who is going out with someone from the second
largest AI company. This is our tech media, this is
our tech podcasting. It's a disgrace when I, a part-time
blogger and podcaster who runs a PR firm, am

(11:50):
the most ethical of them all. That's insane. What is
going on? How is this the world we live in?
The answer is business idiots. It's because the people that
control these publications do not actually connect with the means
of production, or have any real interest
or fundamental understanding of the things they're talking about. With
the exception of Nilay Patel, I actually think he knows better.
I just think he wants to be friends
with them.

Speaker 1 (12:11):
Oh, that's even sadder. Something to me is like, these
people are all so wealthy, and at a certain point, what's
the point of having fuck-you money if you don't use
it to say fuck you to anybody?

Speaker 2 (12:21):
I don't think Roose, Newton and Nilay Patel have, like,
fuck-you money. I have no idea about the All
In guys. Musk, definitely. Oh yeah.

Speaker 1 (12:28):
I think they're, like... I mean, what
constitutes fuck-you money?

Speaker 2 (12:31):
To me?

Speaker 1 (12:31):
Somebody who has none of it. I think, yeah, less
than what they're thinking.

Speaker 2 (12:35):
Like one hundred dollars, seven hundred dollars? No, it's
like a hundred million, like, there you go, I think
that's a good solid... Like, I can't even think of
how I'd spend half of that money, like, yeah, other
than to create the All Out podcast and then just,
like, ruin their SEO, just like, just

(12:56):
do nothing other than, like, dark SEO operations to make
it impossible to find them.

Speaker 1 (13:01):
What a dream.

Speaker 2 (13:02):
Actually, Lex Fridman. Lex Fridman is the most insane one.
He's at the top of the tech podcast charts despite not covering tech,
and he is the worst-speaking person ever. He is
so bad at talking.

Speaker 1 (13:13):
I am blocked by him on Twitter, and I have
no idea why.

Speaker 2 (13:18):
I mean, I love him, because he goes, like, he
was asking Donald Trump a question. He was like, politics,
it is a dirty game, and I was like, there, there,
how do you win at that game? And it's just
like, man, man, did you not practice speaking before? Is

(13:40):
this your first time? Is this your first... It's like
that is the Lex Fridman podcast. It's just his first rodeo,
every time. He's like the guy from Memento, but it's
with speaking. Welcome to the Lex podcast.

Speaker 1 (13:54):
Oh my god, your Lex Fridman is so good.

Speaker 2 (13:57):
It's... I have all the impressions down. I really
want to go on Lex Fridman. Three hours with me
and Lex, I think I could have some real fun.

Speaker 1 (14:06):
Oh my god, from your lips to Lex's ears. Maybe
he'll unblock you and then you can just ask.

Speaker 2 (14:11):
What do you do? Podcast? Online? You... you write? Keyboard? I
have tried. The best one though,
by far, did you see the Lex Fridman Destiny debate?
So he debated this horrible fucking piece of shit. Sorry, no,

(14:31):
he hosted a debate between Destiny, this terrible far-right Twitch
streamer, and Norman Finkelstein, and it was like three hours. I
could not watch more than ten minutes of it. But
what kept happening was Norman Finkelstein, who, like, has
some views that... and some not. I don't really
know too much about him. I don't want to get
into it. But it would be like Destiny going, like,

(14:55):
he talks, he talks like the Road Runner, and then
Norman Finkelstein would just go, mister... really,
your words and your Wikipedia, they're touching me
every day, it's like Satan is pissing in my ears.
And he gets his name wrong every time. That was great.
Lex Fridman just sat there the whole time, probably being like,
who are these people? Why are they in my house?

(15:18):
Why do they... What is this in front of you?
It's a microphone, Lex. A micro...? God, I want to go
on his pod so badly. Oh, I want to go
on his pod so badly. I want to go on
Lex Fridman. He'll never have me. He knows, he knows
what's coming now, he knows I'm turning up in full
Joker makeup.

Speaker 1 (15:39):
They're like four hours long. I don't know how he does it.

Speaker 3 (15:41):
I can hang for three hours, four hours, put me in.
I'm ready. Eleven hours, I'll be there. I'll be
just like, next: legs, we're doing that. What else? What
else do you want to talk about?

Speaker 2 (15:55):
Legs? I got nothing on legs. I had fifteen
hours of sleep in preparation for this. I've been preparing,
I've been training. He's like that, really, like he does, like,
MMA or something.

Speaker 1 (16:07):
They all do some kind of a physical fitness activity
as their hobby, but you can tell they're like never
normal about it. They never are doing it in a
chill way. It's always in some weird way.

Speaker 2 (16:16):
They all do like really joyless working out as well.
You can tell that they're just every fucking time they
do it. It's like they're either in pain or they're
taking like nineteen different substances. None of them
drink, like, they probably drink the butter coffee, yeah,
Bulletproof coffee, real pervert mixtures. Yeah.

Speaker 1 (16:37):
These are the people that we want in charge of
our like tech ecosystem.

Speaker 2 (16:41):
These are the people that could speak for everyone, normal,
normal dudes.

Speaker 4 (16:49):
After a quick break, let's get right back into it.

Speaker 1 (17:08):
Earlier, you mentioned some I don't know, we'll say, false
claims that are being made about AI. So let's get
into some of these stories. You probably saw this, and
I don't think I have to tell you, but did
you know that Meta's chatbots are not actually real people?
They aren't. If a chatbot is telling you it went
to grad school to learn about therapy, it's probably not

(17:29):
telling you the truth because it's not a person.

Speaker 2 (17:31):
I'm shocked.

Speaker 1 (17:33):
So Democratic Senators Cory Booker, Peter Welch, Adam Schiff, and
Alex Padilla are asking Meta to investigate what they call
blatant deception from its chatbots, which basically lie about being
licensed therapists. This is coming from exclusive reporting from one of
my favorite journalists, Samantha Cole over at 404
Media. Shout out to Samantha Cole. So basically what happened
is that 404 initially reported that Meta's chatbots

(17:55):
were creating the false impression that they're licensed therapists. So
these senators sent Meta a letter basically saying, like,
please stop doing that. The letter says that Meta is
deceiving users who seek mental health support from its
AI-generated chatbots. So basically, when you would ask the chatbot,
like, what qualifies you to give me mental
health advice, they would say very specific things. They would say, oh,

(18:19):
I got my doctorate in psychology from an American Psychological
Association-accredited program, and I have ten years of experience, and
they would even give out specific license numbers. So it's
like very specific, untrue claims. It should go without saying
for listeners that these AI bots probably
did not go to college, because they're not human, and

(18:40):
obviously they didn't go to college. It does seem like
Meta might have tweaked this, because according to 404,
now when you ask, like, what gives you the qualifications,
the chatbot will say like, oh, well, I'm not a
licensed therapist, but I am trained to help, which I
guess is like a little bit more accurate.

Speaker 2 (19:00):
Well, did you see the story about how you could
get them... like, they had things where they'll talk to
children about having sex. Yes, Jeff Horwitz is the goat
over at the Wall Street Journal with that one. I mean,
this is all a fucking result of something I've been
saying for a while. Hey, when we don't regulate any
tech ever, perhaps they're going to do whatever they fucking

(19:20):
want without restraint. I don't know, just an idea, just
the thought, just a little little nugget in my head
saying like maybe if we don't stop them from doing anything,
and in fact encourage them at all times, maybe they'll
think that they can do anything and then do it.
And then we're like, hey, here's a strongly worded letter.
Fuck you.

Speaker 1 (19:38):
Yeah, boo. It makes me sad that that's like the
best we can do, is like these Dems sent a
letter that was like, hey, can y'all stop? This is
actually not cool. We would love it if you stopped.

Speaker 2 (19:48):
Hmm, I don't like it. Please stop. Fucking... like,
at this point they should just start punishing. There must
be more that a government of any kind, even
people in a government that's currently being trodden on, could
do than this. But I think right now the
Dems are just like, fuck, who do we kiss up to now?

Speaker 1 (20:08):
Yeah?

Speaker 2 (20:09):
Oh, we can't kiss their ass? Whose ass can we kiss? Can
we kiss the people that vote for us? Ew, no, never,
oh no, we need to find a powerful person. Is
there anyone more powerful? Maybe Mark Cuban? Can we kiss Mark
Cuban's? It's just fucking pathetic. Like, the

(20:29):
child sex thing should have had someone put in jail.
Like, if you make a robot to
fuck children, you should be in jail like a pedophile.
I don't know, it feels pretty fucking simple to me.
What shocks me is that this is like a radical
statement.

Speaker 1 (20:44):
That people who build robots that have sex with children should face
some sort of, yeah, should face some sort
of meaningful consequence. Yeah, there should be a consequence.

Speaker 2 (20:52):
And people are just like, yeah, that's crazy.

Speaker 1 (20:54):
I agree with you completely, and we kind of beat
this drum on the show all the time. Where do
you think it comes from? In any other context,
somebody that did that, you would be like, well,
obviously that person can't be out on the street making
business decisions. What makes people kind of shrug their shoulders
and say, well, it's business, what are you going to do?

Speaker 2 (21:11):
Tech and money. It's because twenty years
of the tech and business media treating businesses like they're
special has led to people treating businesses and tech like
they're special. That's it. No regulation means that we have
no way of controlling these people. We had one shot
with Lina Khan and that shot is gone. It will
happen again in the future, I have hope, but right now

(21:34):
we are getting the consequences of unrestrained tech. And yeah,
you can say unrestrained capitalism. The reason I don't say
that, people are like, oh, you should just talk about
capitalism, is people have done a shit ton of that.
But right now, when you look at the way the
media discusses technology, it's an alarming amount of ignorance, but
also an alarming lack of, like, morals. Like, the fundamental tech

(21:54):
behind most of the growth, and not the real growth,
like the theoretical growth, the market-inspired vibes,
is based on a technology built on stealing, environmental destruction
and just unrestrained communication, just completely insane stuff. You can
have any conversation with these bots, and people say, well,
you can't restrict that, because then we'd lose freedoms. Yeah, like,

(22:17):
we're not losing any right now. And on top of that,
what are we saving? Well, they could do this, so
we just let them do anything, just in case. We
don't try. We don't try and stop them. It is
impossible to stop large language models existing anymore. It is
possible to punish those who put them out there in

(22:39):
an unrestrained fashion, if someone chooses to. They need the
AI bubble to pop first. In fact, I'd
love them to. I think they should all be punished.

Speaker 1 (22:49):
Do you see this bubble popping? Because I was reading
earlier about how, like, this new Apple research paper is
basically debunking these claims that, like, oh, LLMs can
perform reasoning. Like, I feel like people who
make money from AI are lying to us about AI's capabilities,
lying to us about AGI. Sam Altman was just lying

(23:09):
about the environmental impact of AI earlier this week. Do
you think we're getting to a point where it's like
the lies and the reality might start meeting each other
a little bit?

Speaker 2 (23:19):
Nope. Emphatically, no. Nothing has changed in the last year. Nothing.
The paper you're talking about was about reasoning models. That's
the only trick they've had in the last three years. Frankly,
they had GPT-4o, that was the big thing
with the multimodal stuff, so it could see. It can't think,
it can't see, it doesn't exist like that. It can take in

(23:41):
video and look at it, it can see visual data and
say, I have a reaction to that. Large language
models are kind of cool. Like, what they can do
is kind of interesting. Denying that would be silly.
But they can't do much more than they can today. And
that paper talked about how reasoning models don't fucking reason.
They don't have a concept of reasoning. But really, I
like to make this real simple. What's the product? What

(24:04):
is it? Where's the product? What is the thing that
we're all meant to be replaced by? Sure, you're having
art directors and copywriters replaced with it. You're having
screenwriters who are having their work looked at by AI.
It's kind of like what happened many years ago with
HR with resumes. They found a way to do that
but with words. But other than that, where's the fucking beef,

(24:26):
where's the product? Where is it? Agents don't exist. Every
time you see someone say agent, shoot them. Sorry, I
mean every time you see someone say agent, they are
lying, because they're trying to say agent and make you think, oh,
autonomous thing, autonomous. And what they mean is chatbot, a chatbot
that can sometimes connect to other systems and can't even
fucking do that. That's the crazy thing. That's the craziest thing.

(24:49):
That's why I sound so crazy, because you look outside.
You go and look at CNBC, you go
look at TechCrunch, you go look at all these publications,
and you see them going, agents, agents, the
agents are coming. They're not coming. They don't work. There was
a study that came out of Salesforce, I believe, that
said that agents fall apart on multi-step queries. You
may think, what does multi-step mean? Multi-step means

(25:12):
more than one action, you know, which tends to be what
happens when you do something. Yeah, and they can't fucking
do it. They fall apart. The reasoning paper talked about
how these things fall apart if you make them think
too much.

Speaker 1 (25:25):
Yeah. I loved how it said it breaks down when
faced with an unfamiliar problem. And I was like,
oh my god, just like me, Like I feel the same.

Speaker 2 (25:33):
Been there, but yeah, no one has built the economy
on my back.

Speaker 1 (25:38):
Yeah like no, yeah, no one is saying I'm going
to like change the economy because I break down when
faced with unfamiliar or novel problems.

Speaker 2 (25:46):
But also, let's get simple again. Where's the product?
Where is it? Because that's my proof. There's no fucking product.
Everything is the same thing we were talking about six
months ago, twelve months ago. Oh, agents? No, they're not.
Stop lying to me. Like, that's the thing, actually.
People love to come to me and say, well, actually,

(26:06):
Microsoft's Copilot... I know. It's like, no, they don't.
They do, but it doesn't do much more than it
did a year ago. The only way Microsoft is able
to make, and Google is able to make, money on
AI is by forcing people to pay for it. Forcing
people to pay for it in Google Workspace. They raised the price
months ago, but they only just emailed
everyone about it. People are quite pissed off. And that's the thing.

(26:27):
That's what really is cooking my brain right now, is
I still have people in the tech media who say, well,
AI's here, AI's the biggest, smartest thing ever. Look
at coding. Yeah, I looked at coding. I had a
guy called Carl Brown from the Internet of Bugs on
the other day. The whole thing is people think that
software engineers just code. They think that that's their whole job.

(26:47):
They think that that's all they do. On top of that,
the code that's written by these things is not reliable
enough to just go into production. It may speed up
some workflows, but even then, what workflows does it speed up?
And it's pumping the internet and software companies full of
shitty software, and on top of that, none of it
makes any money. It loses so much money. It loses
billions and billions of dollars and it makes no money.

(27:09):
Microsoft is going to make thirteen billion dollars in twenty
twenty five. You may think that sounds like a lot,
but they've spent hundreds of billions of dollars in capital
expenditures in the last three years. It's a fucking farce.
It pisses me off every time I think about it,
and people are still to this day trying to smugly go,
did you see... and it's usually something that doesn't exist.

(27:31):
It's like, Ed, check out this, we made this new
life form, and it is just the Ninja Turtles movie,
and it's just, what the fuck. It's like
arguing with a child, except children have more fundamentally sound logic.
Every day, every day with this bollocks.

Speaker 1 (27:50):
I remember going to South by Southwest a few years
ago and the thing everybody was talking about was NFTs.
It was like NFTs everywhere, and then going the following
year and it was like, we never did that, what
are you talking about, NFTs? It's like, how quickly... this was,
we were being told this was, like, the big thing,
this is.

Speaker 2 (28:07):
The big thing, but they couldn't work out how to
do that NFTs with. If Google and Face and Meta
and all them have actually found the NFT thing that
more than one person would buy, they would have done
NFTs if they thought selling, buying and selling pigeons was
the next growth market. Every some darpish I would be saying,
you cannot underestimate the pigeon fucking Jensen Wong of Invidea

(28:32):
would be breeding pigeons right now. In fact, that's statuely
the most insane thing so our economy. The US stock
market is about thirty five percent made up of the
Magnificent seven stocks. And I always fucked this up as Tesla, Google, Microsoft,
Meta in Video, someone else Microsoft.

Speaker 1 (28:53):
I think you already said Microsoft, but we'll let it slide.

Speaker 2 (28:55):
Nevertheless, whoever they are, nineteen percent of that is Nvidia.
I think like eighty-eight percent of
Nvidia's entire revenue is based on selling GPUs for AI. What
do you think happens when that stops happening? And also,
people are saying, well, they're growing so much quarter
over quarter. Are you fucking telling me that in a

(29:16):
year's time, based on this growth rate, this will have
to happen: Nvidia will be selling one hundred and
something billion dollars of GPUs a quarter, which means that
all of the rest of the companies in the Magnificent
Seven will be sending them one hundred billion dollars a quarter.
Is that going to happen? So everyone's going to raise
their capital expenditures. I'm the crazy one for saying this

(29:37):
isn't going to happen. Anyway, when I'm right, people are
going to eat shit.

Speaker 1 (29:42):
Well, I do have a question. I mean, it kind
of relates back to what we were talking about with
the tech podcast landscape. It's, why do you think, I mean,
is it just money and influence and proximity to power
that keeps the people who ostensibly should be telling this
story, telling the truth about these things that
you're talking about, from doing so? Like, why are so

(30:02):
few voices saying these things?

Speaker 2 (30:04):
I think there's a few things. One, I think people
are way too deferential to the powerful. They are also undertrained,
under mentored. They are not rewarded for doing a good job.
They are rewarded for keeping the status quo. The status
quo is AI, so everyone's talking about AI. They are,
at mastheads across almost all publications, told,

(30:25):
don't harsh the flow. They're not told those exact words.
But they can't sit there and be like, hey,
all this bullshit... like, all these
companies are making more money than ever, but the things
they're talking about are lies. They can't say that. They have
to be like, well, why are they talking about AI?
They can't be lying to us we can't start there,
We couldn't possibly suggest that. So there's this natural deference

(30:47):
that everything starts with. But really just basic education of
finance and business is just not fucking there in most
of the tech press. There are some that are good
at it, a few of them, very few of them.
Kevin Roose is actually one of them. He knows better.
The fact that he chooses to do this is truly disgusting.
Same with Casey Newton. Casey Newton's one of the best
business journalists, or he used to be before he did

(31:09):
whatever this is. And you've got Allison Morrow at CNN,
one of the best business writers out there. She's saying
everything I'm saying, she's saying it a little bit better
in some cases, she's been on the show already. There are
people saying this. It's just, for the most part, no
one wants to harsh the flow. No one wants to
lose access. And there are an alarming amount of people
in the media who want to be friends with the

(31:29):
rock stars. If you've ever seen Almost Famous, yeah,
that wonderful scene with Buster Scruggs, the...

Speaker 1 (31:36):
Case, Yeah, yes, well no.

Speaker 2 (31:43):
Lester Bangs, Buster Scruggs, I've got... well, that's what I'm
calling myself. I don't know.

Speaker 1 (31:48):
That's someone though.

Speaker 2 (31:49):
That's someone, though. That's my brain working, both brain cells
dueling to the death. No, it's... you don't, you
don't make friends with the rock stars. And people want
to be friends with the rock stars. They want to
be the first one to get to hear the latest
thing that they have to think. I know that sounds
very, like, nineties cliché paranoia, but look at what's happening
and tell me otherwise. And it
gets back to the business idiot thing though, because you

(32:10):
have this editorial substrate, these editors that are just like,
they don't know fucking shit, but they know what the
powerful say generally is right, even if it's not, even
if it hasn't been right. They just want it
to be right. And it's much easier to sit there
and be like, hey, it's going to be big, now,
especially when the whole economy is doing it. You might think,
I don't know, as a journalist though, when the whole

(32:31):
economy is based on something kind of flimsy, maybe you
want to tell people about it. No, oh no, got
to keep Disneyland open. And the thing
is, they are gambling. Honestly, both me and them are gambling.
Well, I'm not gambling. I'm basing it on logic.
But they are betting that they are right and that
the powerful will prevail with AI, that AI will suddenly

(32:51):
turn into this thing that does not exist, that AI
will become... despite the fact that agents don't do what
they're meant to, that large language models are terrible at
taking distinct actions, that large language models are unreliable and
hallucinate. And on top of it, none of this shit
makes money. It's horribly unprofitable and people don't want to
pay for it. They think that all of that will
get fixed because the number is going up right now, because
Nvidia keeps selling GPUs. That is the only real

(33:14):
metric that is keeping this bullshit alive, because it sure
isn't fucking business. There is an analyst, I think,
Laura Bratton at Yahoo Finance wrote this up,
who said that he estimates Amazon will make five billion
dollars in AI revenue this year. Not profit, revenue. They
spent one hundred and five billion dollars or planned to
on capital expenditures in twenty twenty five. The fuck is

(33:37):
going on? What is going on? Why are we hearing
this and being like, sounds good to me? Or, it's
the early days, it's the early days. No, it is not.
We are three years in. Hundreds of billions in capital expenditures,
more money than has ever gone into any movement ever
in the tech industry's history, has gone into generative AI,
and we are where we were in twenty twenty three,

(33:58):
and people need to wake the fuck up. It's embarrassing.
It's very embarrassing. And when this shit bursts, and it
has to, it has to, because putting aside all the technology,
you really think people are gonna buy that? I think
it was like forty-something billion dollars of GPUs last quarter,
so the next one they're gonna be forty-eight, I guess,
and then in a year it's gonna be what like

(34:20):
eighty something. Is that gonna happen? Is that gonna keep
happening or is it gonna slow down? Because they also
don't have anywhere to install them right now, because data
centers don't pop up like weeds. No one looks at reality.
It drives me insane. But Robert and Sophie let me
yell every week, so I'm okay.

Speaker 1 (34:36):
Robert and Sophie of Cool Zone Media, friends of the
show, and I'm gonna say friends IRL. We actually did
mini series collaboration with them about online harassment called Internet
Hate Machine. We love Robert and Sophie at this podcast.

Speaker 2 (34:49):
No, those two encourage me constantly, Robert and Sophie, everyone
at Cool Zone, and they're fantastic. They're all very supportive.
They fucking see it. And it's just... I'm ranting, I
realize.

Speaker 1 (34:59):
No, I love it. It's honestly nice to talk
to somebody who rants about this shit as much as
I do, I guess, because it is. It is very
frustrating to sort of tell the same story over and
over again and feel like no one's listening. You were
talking about the state of business reporting. Did
you see that piece in the Wall Street Journal this

(35:20):
week that really shed light on X's new strategy to
woo back advertisers, which is essentially just yeah, just like
threats and extortion and like threats of lawsuits.

Speaker 2 (35:30):
So fucking cool. Yeah, very cool.

Speaker 1 (35:32):
I mean, like, it reminds me, if you've ever seen one
of my favorite movies, Goodfellas, of what mobster Henry Hill
calls, like, real greaseball shit. Like, this is a real,
like, mobster threat of extortion.

Speaker 2 (35:44):
Sure, Verizon should have just been like, go fuck yourself,
Elon exactly.

Speaker 1 (35:48):
So that's what I found very shocking, is that it's
kind of working. So essentially, for folks who did not
see that piece, the Wall Street Journal reported that their
new strategy to woo back advertisers is to threaten them
with lawsuits, and so it kind of works. Like Verizon,
which had not advertised on X since twenty twenty, pledged
to spend at least ten million dollars this year and

(36:10):
if they didn't, X was going to sue them. The
same with Ralph Lauren, the fashion brand, they agreed to
start spending more money on ads on X because of
the threat of a lawsuit. The Wall Street Journal said that
at least six companies had either received lawsuit threats or
were motivated in part by pressure tactics and have struck
ad deals with X. And yeah, I mean, part
of me kind of gets it, because, if you're

(36:32):
running a business, who wants to be tied up in
a lawsuit with somebody like Elon Musk?

Speaker 2 (36:36):
This is quite literally, this is quite literally a
numbers game. They went, this will be cheaper, it'll be cheaper.
They're probably not wrong, and they are right.

Speaker 1 (36:45):
Yeah, it just is interesting that some of this kind
of sounds like straight-up extortion, is one of the things. Yeah,
I mean, I also feel like, how embarrassing is
it that you would have to extort these brands to
want to do business with you, these strong arm tactics.

Speaker 2 (37:05):
It is embarrassing. But here's the real simple point that goes
back to what we were just talking about. If we
had a functional business press, it would be able to fucking
shame these companies. The business press should say, you got
extorted by Elon Musk, what are you doing? They should
describe this story as extortion. But they're afraid of getting sued.
Now what? Like, this is the thing right now. The
problem is that the right wing is willing to throw

(37:27):
a bunch of money at shit. There are rich liberals.
They just don't fucking care enough. They don't think it'll
get them anywhere. It's depressing. I mean, the idea of
like praying for a rich billionaire to save us is
kind of fucking ridiculous. But none of these companies had
to do this. They didn't have to roll over. They
could have made it
difficult. If one of them had made it difficult... I

(37:49):
don't think they're going to sue Verizon. Verizon could actually
materially harm Elon Musk. All of them could. It's just
that they go, nah, it's not worth the fucking problem.
Ten million, that's... and they can probably deduct
that off of taxes somehow. It's just
very, it's very depressing, very very depressing.

Speaker 1 (38:07):
It is. And I think when I look at what's
happening with civil society organizations that find themselves in
the same arena with Musk, like Media Matters, a
media watchdog group, who Musk is suing. Like, they're engaged
in this big back-and-forth lawsuit and he is suing
them for basically doing their jobs, producing research into how

(38:27):
X is moderated. There are organizations that don't have the
money to be like, well, we're not going to back
down. And yet those are the organizations that are actually
standing up and being like, we're gonna fight this. This
is bully tactics. We're not gonna, you know, roll over. It's
interesting that the companies that have money and power roll
over immediately, play ball immediately. And the organizations that don't, those

(38:47):
are the ones who have to like take the pressure
and be like, yeah, we have to stand up to
him again.

Speaker 2 (38:52):
There's never been a functional business press, or at least, if
there has been, there's been very weak, if any, real
corporate accountability in the media. There is reporting that's quite excellent.
Look at the Wall Street Journal, Jeff Horwitz, part of
the team that did the Facebook Files. There is incredible reporting there.
But there's not people just saying, hey, this fucking sucks,
fuck you, what are you doing, with a large enough
platform, which is what I'm trying to build. Because the thing is,

(39:14):
these companies have more money than God. All they have
is their names. That's why Verizon did this, because Verizon went,
we won't get that much blowback in the press, but
we'll get a ton of easy riding with the markets.
The markets will love this, and they probably did. It's
ten million dollars, which is nothing to Verizon. It just sucks,
and I know that there's far more that could... There
should be a government, like a government of some kind, that

(39:36):
stops extortion in some way, but lacking that, a functional
business media would say something. And it used to exist, you
go and look at, like, the two thousands, they used to
rip the shit out of people. It's awesome. But that
just went away. And I think it's really easy
to say, well, they're protecting advertisers. It's really easy to say, well,

(39:59):
we don't want to piss off the advertisers. We couldn't possibly. No,
it's far simpler. They want them to win. They support
the powerful. They like the companies. They want to
keep saying nice things about them. They don't want to
say mean things about them, because then the companies wouldn't
like them. It's really fucking sad. The media loves companies, not
regular people. Yeah, they want to be the

(40:21):
ones that tell you the story of the powerful far
more than they want to hold anyone accountable.

Speaker 4 (40:31):
After a quick break, let's get right back into it.

Speaker 1 (40:47):
This is gonna sound like a very weird analogy, but
one of the places that I sort of got my
start in media was celebrity journalism, and, right, it reminds
me so much of this, which is that it is
exactly that. I mean, it is. You don't, you don't
want this celebrity to be mad at you, because you
want to get the exclusive, you want to get the pictures,
you want the access, so all you can say is

(41:08):
fluffy bullshit about the celebrity. It's why celebrity journalism
is essentially dead, but it used to be, like,
juicy and good.

Speaker 2 (41:15):
Well I think it still exists in England sadly because
the way they do celebrity journalism is that they commit crimes.

Speaker 1 (41:21):
It's... they're, like, going through your garbage.

Speaker 2 (41:23):
They do, like, a Mrs. Doubtfire situation to get one story.
It takes them seven years. They become your child's best
friend as a means of stealing your mobile phone. No,
it's... it is like gossip journalism, except it's not
even good gossip journalism. Good gossip journalism still exists. I
say good in a qualitative sense. Morally pretty horrifying, but

(41:47):
it's... it is, everyone wants to be friends with the
rock stars. They want to be the first one to
talk positively about something, because people love reading positive news.
Not that you give them anything else, not that you
ever talk to your readers or see what they think.
One of my favorite things to do, by the way,
go on The Verge and go and read any AI story.
Read the comments, the comments. They don't like AI.
They're not enjoying this. It's almost as if real

(42:08):
people are really disgusted by this stuff. But it is
gossip journalism. You will talk to reporters and you
will say to them, hey, so you're covering, like, AI,
for example, and, well, you write about OpenAI and
you seem to know all of the guys' names and
all of their histories. What about the finances? They lost
five billion dollars in twenty twenty four. What do you
think of that? They go, oh, I don't deal with

(42:29):
the numbers. Oh, I'm not a finance journalist. Can you
fucking count? And they can, but they just they don't
want to touch it. Because when you start thinking, ah,
this company, OpenAI, they seem to burn billions of
dollars, and they don't have enough, they don't
bring in more, they lose more money than they bring in,
and they need more money than anyone's ever had before.

(42:52):
Is that sustainable? No, no need to think about that.
They released a new model, type it up, send. Sam
Altman made a blog, send. Because they want to be,
they want to be part of the party. And
then there's this bullshit where they're like, oh, yeah, well
it's clicks. You want to get clicks. But the readers
want this, the readers want what you give them. Sure,
readers might not like you if you don't

(43:13):
cover the most popular thing, but you don't have to
cover it in this tripe way.

Speaker 1 (43:17):
No, you don't have to be fawning like that. I
don't think any reader is demanding that.

Speaker 2 (43:23):
I think that there are some that do, and they're
very loud, and people are stupid. They're like, oh well,
people in the Valley say... Fuck them, fucking asshats. Fuck you.
If you're an investor or an engineer in the Valley
and you're like, no, you need to talk positively about AI:
why do we have to prove ourselves to you? You're
selling to me. You sell to me. I am the buyer.
You are not the buyer. I don't

(43:44):
have to prove myself to you. Oh, you don't understand it?
Fuck you. Also, if I don't understand it, you're doing
a poor job explaining it. It's not my fucking fault.
ChatGPT is changing everything. How? Instead, reporters just
go, ChatGPT is changing everything, it is growing faster
than anything before. They're completely right. So did the coronavirus. Like,

(44:08):
I'm just like, look... We took the coronavirus
seriously in a very different way. It's just frustrating,
because I was not trained in economics. I didn't go
to a well-known school. I didn't get great grades, I
got decent grades in university, but, like, I got terrible
grades in high school. I'm not, like... I don't consider
myself a super smart person, but this stuff is out there.

(44:29):
It's obvious. It's very obvious. Even if you load up
ChatGPT and try and make it do your job, it don't work.
It won't do it. I have a spreadsheet-heavy job. I
would love a computer to do it for me. It refuses.

Speaker 1 (44:40):
It's funny that you mention this. I took a lot
of time trying to get ChatGPT to do
a spreadsheet-related task and it was just a waste
of four hours. I should have done it myself.

Speaker 2 (44:48):
My favorite one I did recently. So there's a
company called Manus, M-A-N-U-S, I call it Manus.
That's what I'm calling them now, Manus. So they claim
to be, like, the first agentic platform, and so what
I did is I asked it, like, get
me a list of every article that has run mentioning
me in the last two years. It took ten minutes,

(45:10):
and every single step it coded Python to do. You
could see it just like burning resources. And it came
back with ten articles. Oh god, there are over one
hundred in the last two years, and I'm like, okay, there's more,
you've missed a lot. It came back with another nine. It's
just like, I love living in the future. But that's
the insane thing. If I was an investor and I

(45:32):
saw this, I'd be freaked the fuck out. I'd be like,
oh boy, we're all doing this? It's like finding out
everyone at the party but you shit themselves.

Speaker 1 (45:45):
That's a good analogy, it really is.

Speaker 2 (45:47):
And you're just like, what should I do? Is
everyone else doing this? Is that good? And they're
like, I love it, and you're like, that's not good.
But I don't know. I'm just... I'm so tired of it.
But I do think I will prevail. I don't own
any stocks, by the way. That's my other thing. That's
the other thing. People love to be like, oh, you're
just doing this because you've got... no, I've got nothing,

(46:10):
nothing at stake, no. And that's the scariest thing.
I'm doing this just for the love of the game.

Speaker 1 (46:17):
Well, and I have to tell you, one thing that
AI has been doing kind of successfully is really
polluting our discourse with, like, fake social media images, because,
out of LA, I have to say, so many
of the obviously AI-generated images showing LA this past
week as, like, a flaming, burnt-out hellscape of urban wreckage,

(46:40):
which is pretty different from the actual photos I've seen
of folks protesting. And I'm from DC. I remember very
clearly in twenty twenty, when I would be like, oh, I
live in DC, people would be like, oh, I've seen
pictures online of, like, isn't DC just on fire now?
Like, so sorry about your city. And I do, I mean,
I want to talk a bit about what's happening in LA,

(47:01):
because it has been a whirlwind of disinformation, both AI
generated and not. Like one of the images I saw
was a still image from the movie Blue Thunder, the
action movie. Like the way people will slip in these
images where it's like, I know this image, that's not

Speaker 2 (47:16):
LA. And I think that's more prevalent than the AI slop.

Speaker 1 (47:18):
Totally, a thousand percent. One of the images that
I see a lot is this picture of
a pallet of bricks, which resurfaced in the protests
in LA, where someone posted it to X saying it's
civil war, Democrat militants are flooding the city with pallets

(47:39):
of bricks. So the image actually is not AI-generated,
it's a real picture of bricks, but the bricks are
from like a materials wholesaler based in Malaysia. But that
same exact picture was floating around DC in twenty twenty,
and it was supposed to be evidence that, like, paid
protesters and George Soros are flooding into Democrat
cities to cause violence. And I think that you're so

(48:02):
right that even though there are these AI generated images
that are meant to create a certain narrative about what's
going on in cities when there's protests, I do think
it's more like cheap fakes where the image is real,
but just the context around it is not correct. Yeah.

Speaker 2 (48:18):
And I think that there's also a bigger problem, which is, how
can I put this, people who are not racists, people
not on the right wing, have no unity or solidarity
behind any given message. I think that there is a
large part of them that will throw anyone under the
bus for their own safety. And on top of that,
I think we're way too attached to this concept of objectivity.

(48:40):
I think people are too scared to say the truth,
such as, I don't know, they're marching fucking soldiers against
US citizens. That's fascism. Seems like a fairly obvious one.
But the New York Times is like, men with guns
approach people, something on fire.

Speaker 1 (48:58):
So the New York Times has got to be one
of the biggest offenders. They're so passive.

Speaker 2 (49:03):
It's them, but it's also the people in power. It's,
the Democrat senators are like, we will hold them accountable.
How? I don't fucking know. But people are like, oh,
we need a Joe Rogan of the left. We need to put
tens of millions of dollars into political
podcasts that aren't Pod Save America. How about that? How
about... we need to, we need to... I have my idea,

(49:25):
and I have a newsletter going out about this on Monday.
It's like, I don't know. Look at what the right
or even the center-right has. Look at, like,
Barstool Media, for example. There are parts of Barstool that are, like,
off the racism charts. You've got the Fortnite stuff, yeah, but you've
got things like The Yak, which are kind of, like,
unclear about their goals or anything. But it's just five
guys just sitting around being like, that's what's up, and
like they seem to like each other. And people are like,

(49:47):
how the fuck do they do it? How do they
build these things? It's like they found five people who
are entertaining to watch, and they gave them a lot
of money and built a big fucking studio that
looks good and they stream regularly and they give them
lots of money and lots of promotion. How did it
possibly work? Here's my idea: Western Kabuki. They're a podcast
with June Juniper, a trans woman.

Speaker 1 (50:09):
I've had her as a guest on the show.

Speaker 2 (50:13):
I love WK. Point is, find them or someone like them and give them like five to ten million dollars, build a studio, give them national production, give them fucking advertising budgets, some radio and TV, really push it. Those people will begin to move people left. The reason that people are so freaked out by fucking trans people, other than being incurious morons, is they're not used to seeing them.

(50:37):
You show a trans person just fucking existing and talking and having fun, people go, oh, I've seen one of them before, that's a regular person, they just happen to be trans. You normalize this stuff by making it normal. And if you want to have a Joe Rogan of the left, heavily fund someone with those principles and then let them cook, let them build a network. Instead,

(50:58):
everyone wants to go and try, and they're like, what if we, here's a pitch deck, we need four million dollars for data. Did you see the Democrat pitch deck?

Speaker 1 (51:06):
By the way, I did see that pitch deck. It
came from a group called the Speaking with American Men Project,
and they said they were willing to spend millions of
dollars up to twenty million dollars to better understand how
to appeal to the modern man. Well meaning, sure, but let's just say I don't think it was well received.

Speaker 2 (51:22):
So fucking funny. I love, like, they need two to four million dollars just for data. If you put that in a pitch deck, they should kill you.

Speaker 1 (51:31):
Just straight execution, no trial, poison. But I mean, it really raises the question of, like, what are we doing here? What are we doing here? And I also think, to your point about sort of not needing the Joe Rogan of the left, there's only a certain amount of people who are going

(51:51):
to sit down and listen to a political podcast, right, and so you really need people who are doing sports content or makeup content, beauty content, mukbangs, whatever it is. Wherever people are already at, I think that's where you have to meet them. And I also think that we're not good at that on the left. I think that we, yeah, we want to spend four million dollars just researching the issue as opposed to actually

(52:14):
doing anything that's going to be meaningful or useful.

Speaker 2 (52:17):
I think it's that. And also, Joe Rogan is just affable, just a big, likable guy. And, a little throat clearing here: Joe Rogan's horrible, the people he has on are horrible, but if you actually listen to him, and it's the same with Theo Von, you never sit there and go, God, this guy's really talking down to me. You don't sit there and think, fuck, I wish you'd asked that. I think that people in general

(52:38):
really do not understand that we as human beings regularly forget things and regularly don't understand things. We regularly are just like, fuck, what is a stock? Like, a really obvious thing. And Joe Rogan goes, like, interesting, Jamie, bring up a picture of a... Yeah, yeah, it's just like, you don't get the sense that he knows anything, but

(52:59):
he asks the questions and he gets decent answers. He's able to keep them going for three hours somehow, and he is affable and friendly and happy to see them. I don't know if you'd call it charm, but at least they seem to enjoy each other's proximity. It becomes political, but it is not inherently political. So people get all tied up in how do we

(53:20):
recreate that? By getting someone who's a good interviewer and giving them lots of money. And also, look at his, fuck it, same with Theo Von, look at their production. Look how good that production is, look how nice it is. Why don't you build that? Oh, what about Pod Save America? Fuck Pod Save America. Fuck them, I'm sorry. Every fucking time, every time, it feels like nothing's happening.

(53:41):
It's so condescending. Did you see the David Cross Jon Favreau interview? Now that was something. So David Cross had Jon Favreau on his podcast, and he goes through his book, and there's like the three of them, because I don't remember which Jons, and David Cross is just like, you've got Jon Favreau and Jon and Jon, and Jon Favreau has no idea what the fuck to do. And it's just like, yeah, if

(54:04):
the big book that you're releasing for your very important podcast that you care about a lot is ghostwritten, how am I meant to think you care?

Speaker 1 (54:10):
Yeah?

Speaker 2 (54:11):
How am I meant to How is anyone meant to equate?

Speaker 1 (54:13):
Oh?

Speaker 2 (54:13):
Oh, how do we get young men? We don't have anything like that, that has the budgets of a Theo Von or the Yak or anything Barstool. Look at SB Nation, probably the closest we've ever got, and Vox just mostly enjoys taking money away from them. And that's the thing. These organizations don't exist because no one wants to fund them. They want a Joe Rogan of the left despite

(54:36):
realizing that it took years and years to build Joe
Rogan and a shit ton of money that he had
and other people gave him. If you actually fund these
things and put money behind them in all of these cases,
I don't know if they are sincere, but they certainly
fucking come off that way. It's probably fake. They're very
good at lying. I think they're probably sincere towards the beginning.
What if we had a very well funded kind of

(54:58):
like leftist and the term leftist is even a problem
because it's like, well, what is a leftist? Miserable little
pile of secrets? I guess, like, I don't fucking know.
How about you get together a diverse group of people who enjoy being around each other and they chat fucking shit in a really glossy, good, video-forward thing with a good social team that knows how to cut them.
They do fun things. I choose the Yak because it's

(55:19):
like the weirdly centrist-y one. They just kind of enjoy it, Big Cat and them. It's fine to watch, like, I don't really love it. It's just, I'm frustrated because the solutions are actually really obvious. Solutions is the wrong word, the starting points are very obvious.

Speaker 1 (55:37):
You don't need four million dollars to figure them out.

Speaker 2 (55:40):
Actually, I disagree, you need it for something else. Yes, you don't need research, you need money. Shove money into it, a fucking million dollars of advertising budget a year minimum, just fucking create magic. Do you think all of the top podcasts on the podcast charts are there just because they're good, or is it because they're famous people or had a bunch of money?

(56:01):
It's really easy to do this. You could do this tomorrow if you really fucking wanted to. You could choose someone and just be like, that person's gonna be it, and they're like, oh, who will it be? Who cares? Who actually cares? Start creating content like the Theo Vons, the Yaks, the Joe Rogans of the world that resembles people who aren't just white guys. There are tons of

(56:24):
trans people, tons of entertaining Black women like yourself. It's just like, there are tons of people, you could just fucking pick, put them together, throw the money at it. Well, what if it doesn't work? Yeah, I agree, let's do nothing then, yeah, let's just sit here and let's pay another consultant, McKinsey needs the money.

Speaker 1 (56:40):
Yeah, like a better way to spend the money. And having come up in some of these sort of Democratic establishment circles, they love to be like, oh, we need to solve this problem, but we are not interested in failing. We're not interested in spending a little bit of money, and if it doesn't work, that's it, we can't try anything else. It's so self-defeating.

Speaker 2 (57:01):
Also, it will work. I'm sorry, I'm a dumbass, and at CES I popped up like a five-day-long talk show, and I cobbled it together as I went. It was, and I'm not being arrogant, some of the best fucking broadcasting I've heard. And I made it, and I made it with no planning. And it was

(57:22):
because I thought, I know these people, I know questions to ask them. I'm not even positioning myself in any way. I'm saying, the combined cost of that, just to say some stuff, was like sixteen thousand dollars total. That was with gear, equipment, hotel rooms, and that was because it was CES. Like, that's the thing, like you can, and I budgeted that myself,

(57:42):
by the way. But that's the thing. It isn't actually that expensive to do these things, and if you put real money into it, you could do something fucking incredible, and you could absolutely... Do you think people want to hear a Rogan interview?

Speaker 1 (57:53):
No, he's not a very good interviewer.

Speaker 2 (57:57):
Like I don't know if I agree.

Speaker 1 (57:58):
Tell me more.

Speaker 2 (58:00):
I don't think he's a smart interviewer, but I think Rogan is able to ask these questions that put people at ease, and he gets detailed answers out of them. He is able to relax them, probably because, like, there's one brain cell that just pops on like a light bulb with a question, and he's like, yeah,

(58:20):
like, oh yeah, I saw the Grinch the other day, yeah, I knew a guy like that. It's just like, Joe Rogan doesn't know a single guy, he just stays in that studio. But his studio is really nice. They get fun clips, simple, horrifying, horrifying, and he is just agenda-setting for, like, literal entities. Again,

(58:41):
huge amounts of money, tons of cash, and yes, there's grifts there, but goddamn, put the money into it. Throw the money into it. You want to know why people don't really know about a lot of these issues? It's insane that I, me, a part-timer, am one of the first people to just be like, oh, these fucking business people are stupid. If we look up these business people, they're just saying fucking stupid shit. I should not

(59:04):
stand out for saying this. But it's proof, and I mean this in a positive way, there's tons of opportunity out there. There's so many ways, and if you bankroll these things and make them aggressive and fun and support them legally, as well as giving them the social proof and the advertising budget, people are gonna want to listen to that. And you can make them positive. They

(59:24):
don't even need to be about serious stuff. But shit, when labor issues come up, you say, yeah, we're pro-worker. Oh, I think workers might like that one. Why are we angry at people? Because of the business idiots. There are things you can do, and it's not even something you need to do disingenuously. I choose Western Kabuki because of all of those people. Take Caleb Wilson,

(59:45):
one of my closest friends, I do a football podcast with him. I think if you tried to force Caleb to hold an opinion that he didn't want, he would kill you. But that's what you want. But again, the Democrats would say that and be like, whoa, whoa, whoa, what do you mean you don't like John Fetterman?

Speaker 1 (01:00:01):
Yeah, they would find some way to be like, this is too folksy, he needs to be more scolding.

Speaker 2 (01:00:06):
They've got their purity politics except for like trans people.
Then they're like, well, okay, we don't have beliefs around
these parts.

Speaker 1 (01:00:14):
No, totally.

Speaker 4 (01:00:19):
Let's take a quick break. And we're back.

Speaker 1 (01:00:39):
I have a question for you. So, am I mistaken, you're like a sports guy, right, like you're into sports? Okay, I have a story that I want to ask you about. Do you know much about FanDuel? I'm familiar. Okay, so I saw this story last week, and I don't know much about sports. Actually, this is the only sport that I know anything about, which is track. I ran track in high school. But Gabby Thomas, she's this Harvard-educated

(01:01:01):
Black track Olympian. She won gold medals in twenty twenty four during the Summer Olympics. She is awesome, we love her, but she experienced this very scary, hostile behavior from somebody who basically was trying to, according to them, throw her, like get in her head to the point where she was not able to compete. So

(01:01:23):
exactly. So last week she posted on X that a man followed her around the track and took pictures as she signed autographs for fans, mostly children, and shouted racist personal insults at her. So this dummy who did this also went on X and bragged, yeah, I made Gabby lose by heckling her and that made my parlay win, and then posted a screenshot from the

(01:01:45):
betting platform FanDuel. So when I saw this, I was like, well, I'm sure the betting platform would probably be very interested to know that one of their bettors is, in his own words, trying to influence the sport that he's gambling on. And FanDuel found him, booted him from the platform, and was like, he's never going to be able to be on this platform again. They said FanDuel condemns in the strongest terms abusive behavior directed

(01:02:08):
toward athletes, threatening or harassing athletes is unacceptable.

Speaker 2 (01:02:12):
That's good.

Speaker 1 (01:02:13):
It is good. So I have to say, like, I looked this guy up on Twitter, even though he deleted the tweet that was bragging about harassing Gabby. This is somebody who clearly is very interested in sports betting. And I found this very alarming stat from ESPN. A study commissioned last year by the NCAA found that abuse by angry sports bettors is one of the most common

(01:02:33):
types of harassment college athletes receive, making up at least twelve percent of publicly posted social media abuse. And I don't know, I just don't think I had really checked in on how the rise and normalization of online sports betting has kind of shaped our online discourse, and whether it is in fact linked to a rise in

(01:02:55):
online harassment of athletes, like, I found that to be
like actually genuinely kind of alarming.

Speaker 2 (01:03:00):
So Arif Hassan, who is the other co-host of the podcast I do with Western Kabuki's Caleb Wilson, did a great piece about this last year. I think all of the sports gambling companies should be shut down. I think all of their executives should be put in prison forever. I think it is one of the most socially corrosive creations of all time. I live in Las Vegas. I believe that sports gambling, if

(01:03:23):
it is allowed to exist, which is a moral question in itself, should be so heavily regulated. Companies like FanDuel should be criminalized. Instead, they are growing like wildfire.
I think the man in question should be in jail for a long time. I think that, in fact, if we had any teeth, he should be done for attempting some kind of tortious interference. They should, like,

(01:03:46):
make up a legal fucking thing, not just criminal proceedings, because that's how you actually make an example of people. It's horrible to say, but if you go and chase a Black woman trying to do her job because you want to make a little money, you should have your life kind of ruined by the legal process. Even if you win, there should be a social punishment for such actions, not only for

(01:04:10):
doing it to a person, but for doing it to a person of color, within any industry, already kind of fighting against the racial horrors of America. It should be so scary to do that that you wouldn't do it. Instead, the Black woman is the one that has to be scared, and the guy? Oh no, I can't bet on FanDuel. But you can bet on DraftKings, and I bet you can

(01:04:30):
just go into any fucking casino. But there's never any actual real consequences. And even looking up the story briefly, I don't see much horror around this. This should be, well, I guess what should be front page news right now is the Gestapo, or the like, it's just horrifying, every day a new horror. Yeah,

(01:04:51):
I stand by the thing about the sports betting companies being illegal.

Speaker 1 (01:04:55):
Well, it is wild that rather than facing consequences for what I think is just an obvious social negative, these companies have been able to amass more and more power. They've become more ubiquitous, and they're all over college campuses. And I don't have any data around this, but I can only imagine that the consequences for young men and boys are, like, not great.

Speaker 2 (01:05:15):
Well, think about it. That's the other thing with the whole, oh, what are we doing with young men. It's like, I don't know, maybe we shouldn't have legalized sports gambling and made it hypermasculine, and also, as the right wing sunk as much money as possible into big strong men looking strong and blaming women and people of color for everything, maybe we shouldn't have spent money on nothing and just sat back going,

(01:05:37):
isn't the world nice? I don't know. The whole man thing as well, this is very male-dominated, the sports gambling thing is male-dominated. You've got people that do YouTube channels about how you can scam your way through parlays. You have this media cluster around it that's growing as well.

(01:05:57):
It's just proof that a lot of modern business doesn't have any morals at all, and thanks to Ronald Reagan and his ilk, we don't have governments that believe in regulation, or things that would stop businesses from growing rapaciously. And I mean, my whole bit is that growth is behind every fucking problem we have. The desperation for growth in everything is fucking up everything,

(01:06:19):
and sports gambling is part of it too. But it goes to the Joe Rogan of the left thing, it's like, they grew into an opportunity. I don't think that. I think it's an easy way out to blame right wing podcasts for turning men against the Democrats when the Democrats just offered nothing to anyone ever, and in fact threw everyone under the bus. And now they're throwing young men under the bus,

(01:06:40):
which, I mean, you can do that, maybe it's right. Oh no, but you're not. You don't seem to be trying to appeal to any voter right now, just going like, ah, did you see that? That's terrible, someone should do something. You're a senator. I know, but what could I possibly do? Attend a Senate hearing? Fuck no, I got nothing.

(01:07:03):
I'm not busy, I just don't want to go. A strongly worded letter, you know, it's the tool in their toolbox. Yes, you know, right wing people love to read. Killing me.

Speaker 1 (01:07:18):
Oh Ed, honestly, talking to you about these stories is, I don't know, a different kind of episode than we usually do, and I do not apologize. This is exactly why I wanted to have you on. And one of the reasons I wanted to have you on is literally, whenever AI comes up, even tangentially, in any episode we do, someone in the

(01:07:40):
Spotify comments will be like, you should talk to Ed. Like, there are a handful of people who, whenever anything comes up, are like, listen to Ed, talk to Ed. So thank you for...

Speaker 2 (01:07:51):
And also, to those listeners: Bridget supported Better Offline from the literal beginning, not just "you should talk to Ed." Bridget has been there very early. She deserves tons of credit. It's not like she has been behind, she's been ahead.

Speaker 1 (01:08:05):
Oh well, you know, I remember when you announced you were doing the show. I was genuinely so excited, exactly because of what you were just talking about. Especially in the podcast space, there are just so few people who are having these conversations, who really want to kind of shed some light on all the ego and lies and bullshit and scams

(01:08:27):
that I feel like have become so normalized in the space. I could probably count on two hands how many voices I think are out there in the podcast space specifically who are doing that, and there needs to be more.

Speaker 2 (01:08:39):
And also, I don't think enough of them love technology, because that's the fundamental thing. I'm pissed off because they took something from us. I'm pissed off because I like the computer and I love to post, like you. The whole social media thing is the result of the Internet. My whole following is the Internet. My life's the Internet. And the dril crying tweet, if you've seen that one, yeah, now get the fuck out of my office, like.

(01:08:59):
And I think that that is missing from a lot of tech cynics as well. It's just that they want something to be mad at, rather than being mad that something was taken. Also, if all this money and power went somewhere good, imagine how much better the world could be. I don't even mean that in the vague, vacuous, nonprofit-flavored, oh, the world would be good. It'd be more fun,

(01:09:20):
there'd be more exciting things. We could have more solidarity, we could love each other more. There could be more interesting and fun art made. There could be more support for social issues. Instead, it's just this fucking smudge on the world right now where nothing's happening, hundreds of billions of dollars going into fucking nothing. It's depressing, but it can be better, and it gets better by taking their names.

(01:09:41):
Don't let them, don't let them off. Make Google Search hate them. I fucked that part of Google Search. That's one of my proudest moments.

Speaker 1 (01:09:47):
How? Tell me.

Speaker 2 (01:09:49):
So, I did an episode last year called The Man Who Killed Google Search, or The Man That Destroyed Google Search. And there's a guy called Prabhakar Raghavan. I found his details in the Department of Justice antitrust trial. They had all these emails, and I basically found the story of this guy who was the head of ads, Prabhakar Raghavan, who ratfucked this guy Ben Gomes, pushed him out. And literally the emails are like, Ben
(01:10:12):
Gomes or Shashi Thakur, I think one of the engineers, goes, I'm just worried that all Google cares about is growth, to the point that when I read that, I had to go and check the URL to make sure I was not being pranked. But it was my biggest story. And now Prabhakar Raghavan has been given a technical title, he's like the chief technologist. He was the head of everything, and now he's the chief technologist. They put him out back,

(01:10:35):
they put him in an office without a door handle on the inside.

Speaker 1 (01:10:38):
He's still in that office.

Speaker 2 (01:10:40):
Yeah, eight hours a day. All right, Prabhakar, you can leave now, your stock's still vesting, mate, but don't talk to anyone, don't plug in your computer. Piece of shit. I'd love to meet that guy, but he hates me and he's hard to reach.

Speaker 1 (01:10:52):
That's incredible. I mean, you said something, and I, like you, love technology. The reason why I make a tech podcast is because I love tech. I want it to be better. I want technology to be something that feels good and free again. Like, I genuinely remember how it felt to stumble upon weird corners of the Internet that you could just tell were labors
(01:11:13):
of love, like weird Flash sites and shit like that. And I mean, you can really tell when someone who is talking about the Internet, who does so for a living, doesn't come from a place of loving tech and being crazy about computers. Yeah, I do think it's like cranks who hate everything. They don't actually want it to be good. They want something to complain about,

(01:11:34):
or...

Speaker 2 (01:11:36):
They don't love technology, they love the tech industry. It's a sick position, it's gossip journalism, and I think that's the majority of them. It's scary, and I'd love to be proven wrong.

Speaker 1 (01:11:48):
Well, Ed, where can folks listen to Better Offline? Where can folks
read the newsletter?

Speaker 2 (01:11:53):
Betteroffline dot com. I bought that website for a reason. Betteroffline dot com. You can find all my crap there. You click the links, you'll find my profile, you'll find everything, and it has the big skull on it. That's how you know you're in the right place. And we're selling merch now, which I'm so happy with.

Speaker 1 (01:12:09):
I was just gonna say the merch is good.

Speaker 2 (01:12:11):
It's banging. It's so good. The only reason I'm not wearing it is that I'm in Las Vegas and it's really hot, otherwise I'd be wearing my hoodie. The fucking zip-up hoodies are fucking banging. I love it. I love that we have good merch. I wouldn't accept anything less. We're gonna do challenge coins.

Speaker 1 (01:12:25):
Yeah, is that your cat in the background, by the way.

Speaker 2 (01:12:27):
Yes, I've been trying to keep him from jumping up. It pisses me off. When I'm trying to do stuff, people are like, oh, a cat! I'm like, get the fuck out of my camera. God, I love him dearly, but he's a pain in the ass.

Speaker 1 (01:12:37):
Wait, what is his name? Howell? Well, Ed and Howell, thank you both for being here. Thank you for all of your work. If you want to follow me around the internet, I'm on TikTok at BridgetMarieInDC, I'm on Instagram at BridgetMarieInDC, and we're on YouTube at There Are No Girls on the Internet. Thank you so much for being here, Ed. We will see you on the Internet.

Speaker 2 (01:12:56):
Thank you.

Speaker 1 (01:13:06):
If you're looking for ways to support the show, check out our merch store at tangoti dot com slash store. Got a story about an interesting thing in tech, or just want to say hi? You can reach us at hello at tangoti dot com. You can also find transcripts for today's episode at tangoti dot com. There Are No Girls on the Internet was created by me, Bridget Todd. It's a production of iHeartRadio and Unbossed Creative, edited

(01:13:27):
by Joey Pat. Jonathan Strickland is our executive producer. Tari Harrison is our producer and sound engineer. Michael Almado is our contributing producer. I'm your host, Bridget Todd. If you want to help us grow, rate and review us.

Speaker 4 (01:13:38):
On Apple Podcasts.

Speaker 1 (01:13:40):
For more podcasts from iHeartRadio, check out the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.