All Episodes

January 8, 2025 99 mins

Welcome to Better Offline’s coverage of the 2025 Consumer Electronics Show - a standup radio station in the Venetian with an attached open bar where reporters, experts and various other characters bring you the stories from the floor. In the second episode, Ed Zitron is joined by Health Physicist Phil Broughton, actual priest Gabriel Mosher, journalist Ed Ongweso Jr., and eventually Robert Evans and Gare Davis of It Could Happen Here, to talk about turning our health into an analytics nightmare, the amount of made-up stuff on the floor, how we beat the generative AI slop, and something called a "suicide helicopter." 

Ed Ongweso Jr.: https://bsky.app/profile/bigblackjacobin.bsky.social
Robert Evans: https://bsky.app/profile/iwriteok.bsky.social
Gare Davis: https://bsky.app/profile/hungrybowtie.bsky.social
Gabriel Mosher: https://bsky.app/profile/eighthway.com
Phil Broughton: https://bsky.app/profile/funranium.bsky.social

Linda Yaccarino’s CES Speech: https://www.youtube.com/watch?v=1R9LtHiZ74w
Meta ends fact-checking program: https://www.nbcnews.com/tech/social-media/meta-ends-fact-checking-program-community-notes-x-rcna186468

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media. Chosen by God, perfected by science. My name is Ed Zitron. You're listening to Better Offline. This is day one technically, but really day two of the Better

(00:23):
Offline CES Saga, the Consumer Electronics Show, here from beautiful Las Vegas, Nevada. My home, my Van Helsing-like curse, but also my job. I must be in Vegas. Our beautiful slot machines must be filled with beautiful dollars. I am joined today by a remarkable coterie of people. Ed Ongweso Jr., of course, joins me for another episode. Ed,

(00:45):
lovely to see you. It's really great to be back. Thanks for having me again. So just before we go any further, as people think, why did I say that at the beginning, other than the fact it's true: where did you hear the phrase "Chosen by God, perfected by Science" today? It was on a...

Speaker 2 (01:00):
Giant, a giant banner above CES on the Venetian Expo floor.

Speaker 1 (01:06):
Nice.

Speaker 2 (01:07):
It was an advertisement for GermPass, which is this automatic ultraviolet light technology that automatically kills ninety nine point nine nine percent of germs on any high-touch surface.

Speaker 1 (01:22):
Right. So to my right sits Philip Broughton, who is going to tell you what he does.

Speaker 3 (01:30):
So hi, I'm Phil Broughton. I am a health physicist, which means radiation safety and laser safety officer, which means I deal with the entire electromagnetic spectrum from gamma rays all the way down to radio.

Speaker 1 (01:44):
And Phil is also the bartender for the suite. So if you somehow came to episode two of the CES Saga, I don't know why you would do this. This isn't like a... there's not like a chronology, there's not a plot to follow. I guess you could jump in anyway, but you'd probably start at episode one. Anyway, we're in the Venetian suite. We're in a recessed area, a separate room on the side. We are running a suite with booze

(02:04):
for journalists all week. Tons of places to sit down and eat, which is why no one's turning up. But nevertheless, we are also recording a podcast, and Phil, Phil and I have been doing this eight... it's eight times now.

Speaker 3 (02:17):
This is the eighth time. We've been doing it since twenty fifteen.

Speaker 1 (02:21):
And we love doing it. Phil takes vacation time from his job elsewhere, and yeah, it's great. And also, Phil's other thing he likes to do is suffer torment. He likes to suffer pain at all times, because Phil is a safety man and is well aware of things like codes and laws and regulations,

(02:43):
and so, Phil, you may set a record this year. Every year by Wednesday, someone has brought you something that required you to call the fire marshal.

Speaker 3 (02:53):
Yeah, that's why.

Speaker 4 (02:55):
Why is that?

Speaker 3 (02:56):
Because sometimes people lie on their applications. They lie when they're.

Speaker 1 (03:02):
What applications? We'll walk by.

Speaker 3 (03:04):
Oh, so everyone who has a booth at CES is
required to submit how many square feet they're gonna have,
what utilities they're gonna need, and any hazardous things they
might bring to the floor to the people who run
the show, who then are required to tell the fire marshal.
And the fire marshal then comes back and goes, oh,

(03:26):
you've added rabid, flaming, radioactive gerbils to the show. That changes the occupancy we're allowed to have in this building. So please don't do that.

Speaker 1 (03:37):
And then they lie. And so what do you usually tell the fire marshal?

Speaker 3 (03:43):
So what comes to me is a journalist will say, hey, Phil: one, could I get a drink?

Speaker 1 (03:50):
Two?

Speaker 3 (03:52):
I saw a thing and I don't think it was okay, or, hmm, that seemed dangerous and I'm worried. Okay, so what do you have? Where did you see this, roughly? Let me go ahead and pull up the CES map of all the vendors. Oh, you're on that corner of that aisle. Cool. Let me go make a call.

Speaker 1 (04:14):
So has anyone been, like, arrested for this? Have they been, like, taken off the floor? Like, what are the consequences of...

Speaker 3 (04:22):
They're removed from the floor.

Speaker 1 (04:24):
Yeah, that's fun. Arrested?

Speaker 3 (04:26):
No, but you end up forfeiting your fees and you
don't get to present anymore for the rest of that show,
or they take the dangerous thing away.

Speaker 1 (04:35):
I think that maybe, based on what Ed will be telling us about in a little bit, I think we should maybe remove all of the booths from CES.

Speaker 4 (04:44):
Like.

Speaker 1 (04:45):
It sounds like that might be a good idea. But before we get to that, we're also joined by a man of the clergy. For some reason, and I really mean this, I do not fucking know why, we have a priest that joins us, Father Gabriel Mosher, who joins me at my left, and is weirdly connected with the tech media as well. But you just kind of show up in

(05:05):
the robes. Not complaining. I really do not remember why.

Speaker 5 (05:09):
I have no idea how this all got started. But
it was also ten years ago?

Speaker 1 (05:14):
Was it ten years ago? Oh, fucking hell. Anyway, I age every day. But, like, do you know how he got here?

Speaker 3 (05:20):
Yes? I paid for his flight.

Speaker 1 (05:21):
I'm know where buddy.

Speaker 5 (05:24):
Well, you know, I think part of it was, especially back then, I was connected to quite a few sort of, like, tech people and whatnot.

Speaker 1 (05:32):
Right, yeah, and we figured, might as well. Yes. So, and just to be clear, you have no worldly possessions other than those which are given to you.

Speaker 4 (05:40):
Correct, pretty much.

Speaker 1 (05:41):
Yeah, that's great. Would you describe CES as sinful or not sinful? Depends, depends on which party you go to. I mean the convention. I mean that, exactly. Enough about... no, this is a Scoble-less zone.

Speaker 5 (05:56):
No, I think it's great, unless you think that, you know, technology is going to save you from everything.

Speaker 4 (06:01):
But yeah, no, it's it's interesting.

Speaker 5 (06:03):
I find, you know, the whole idea of these various people trying to develop innovations to make life a little bit better, a little bit more convenient, to be a good thing. I mean, that's just human nature. It's great, unless they are a bunch of liars, which they are half the time.

Speaker 1 (06:19):
And talking of liars: there is an article in the New York Times at the moment about priests using ChatGPT to make sermons. How do you feel about that?

Speaker 4 (06:31):
I think that's pretty lame.

Speaker 1 (06:33):
Do you think... is it sacrilegious?

Speaker 6 (06:34):
No?

Speaker 1 (06:35):
But why not?

Speaker 5 (06:36):
Well, it's just like, if you're going to use ChatGPT, for instance, to create sort of the outline of a book or something like that, I think that's fine. But if you're using ChatGPT to, like, write your sermon as a whole, then you've failed.

Speaker 4 (06:49):
It's just like writing your term paper.

Speaker 1 (06:50):
Okay. So, very specific question though: if it hallucinated a verse of the Bible, would that be like a false idol situation? Would that actually be a sin? No?

Speaker 5 (07:02):
I mean, not from the ChatGPT, because it's not like... no, no, for the person publishing it. Yeah, sure. I mean, yeah, that would be pretty lame. I mean, you would definitely be trying to, like, lead people into, like, weird places, and yeah, this...

Speaker 1 (07:14):
Is this is how I imagine that's also bad.

Speaker 4 (07:17):
Who knows? I mean, I don't know.

Speaker 1 (07:18):
I mean, this is why the show is happening. We need to... yeah. It's just... and that article was... I don't know why it had to. It's one of those things you read and you're like...

Speaker 4 (07:28):
I've got to read this article.

Speaker 1 (07:29):
It's not great. It's weirdly well reported, like, the guy went out and, like, really reported the shit out of it. But it was just like, okay: a bunch of people who believe in something that not everyone believes in believe in another thing that even fewer people believe in, that lots of people spend more money on.

Speaker 5 (07:45):
But I can tell you, like, most preaching is pretty terrible, so this might actually be a notch up. So anyway, yeah. You think? I mean, preaching is pretty bad these days, is...

Speaker 1 (07:54):
Now that's a fucking point. Looking at that, you would think... it's why I can't go into churches, other than the burning. So terrible. Yeah, Mister Ongweso Jr., tell us a little bit about what you've seen today on the floor.

Speaker 2 (08:06):
All right, Well, you know, I scanned a bit for
a while because I was trying to find something that
spoke to me.

Speaker 1 (08:11):
First.

Speaker 2 (08:12):
I went for these healthcare management devices for two reasons.

Speaker 1 (08:16):
One. You know, my cohost Jathan on This Machine Kills podcast... and I did not introduce you, I'm so sorry. Totally fine.

Speaker 2 (08:25):
You know, listen to the first episode, everybody, and then
the second one and.

Speaker 3 (08:29):
Then the other podcast.

Speaker 1 (08:31):
Right.

Speaker 2 (08:31):
Yeah, you know, Jathan writes a lot about networked devices and the mythologies behind them, and that was one of the core subjects of his first book, and so I was pulled into that for that reason. Also because, you know, my partner uses medical devices to monitor glucose levels, since they're diabetic, so I'm always interested. I've been increasingly interested

(08:52):
in them because you know, these devices don't really work,
and so when I see innovative offerings or well.

Speaker 3 (08:59):
You know, actual medical devices are qualified and do work, but health management aids aren't real.

Speaker 2 (09:07):
Yeah, this is what I... what I should say is that their devices work, but they malfunction regularly, at a rate that surprises me considering they're supposed to be, and are, necessary, you know, for staying within a healthy range for blood sugar. But the health management aids, you know, they offer... they present themselves

(09:28):
as more aggressive solves, as preventative healthcare mechanisms, right, as ways to...

Speaker 1 (09:36):
Overhaul your entire lifestyle.

Speaker 2 (09:38):
So I was on the lookout for a lot of these, because I've been increasingly interested in writing about the ways in which people are encouraged to be healthier, to empower wellness or to reward it, and the ways this data gets, you know, accumulated and sent back to insurance companies. So the first one I went to was Biopop, which is this, sorry,

(10:00):
it's this device that is interesting because ostensibly it is promising: it's a non-invasive, uh, healthcare management device that would allow you to get access to some biometric, metabolic, you know, data. But it uses, uh, it uses ultra... sorry, it uses N I

(10:22):
R spectra spectruc spectroscopy.

Speaker 1 (10:26):
Spect truck spect truck. I couldn't say it, reader.

Speaker 3 (10:32):
Near infrared spectroscopy, that's what NIR is, yes. Spectrum... spe... speculum.

Speaker 4 (10:40):
Yes, it activates... speech.

Speaker 1 (10:43):
Just be staring at the abyss the whole time, trying...

Speaker 2 (10:45):
to say it. And so I came into the booth, you know, listening to them, listening to them talk to other people about it. One of the use cases that was offered was, you know: imagine you have a CGM. Those are expensive, hard to maintain. You can use this to monitor your blood glucose, because that's the only way to monitor right now, you know, implying it to be a medical device. So I go in and I say, hey, is this a medical device? And he goes, oh, no,

(11:08):
it's so.

Speaker 1 (11:08):
Cool that they have that. They have to do that.

Speaker 2 (11:11):
Yeah. Actually, it's the healthcare management device that doesn't work if you have a lot of melanin in your skin, because...

Speaker 1 (11:18):
Interesting. They only tested it on Asians and light-

Speaker 2 (11:20):
skinned people, and so it's not on the market yet, because they need to test it out on people from India and Africa.

Speaker 1 (11:28):
And so thankfully there's not that many people.

Speaker 3 (11:32):
Just a few billion.

Speaker 1 (11:34):
That's, like, way less than Montana.

Speaker 2 (11:36):
So Biopop is very interested in helping you improve your healthcare management and empowering you to be a healthier person.

Speaker 1 (11:44):
They also want to reward you for doing so.

Speaker 2 (11:46):
So they are both on the blockchain and they will
share information.

Speaker 1 (11:49):
All right. Did they say which blockchain? This is crazy. They did not specify. I was listening the whole time, but my brain is just like, this is the same shit you hear: blockchain. Yeah, they didn't specify one. No. To what end? This is the stuff I will be screaming at them about. And they say they...

Speaker 2 (12:10):
want to encourage users to meet certain goals and to stay within certain ranges. If you're on the blockchain, you might be able to get a reward. Great. Maybe it's a coin in the ecosystem they might create, maybe it's from somebody they're talking to, maybe it's a more established blockchain. They also, again, are sending the information...

Speaker 4 (12:29):
To the.

Speaker 1 (12:31):
Uh, to the insurance... the...

Speaker 2 (12:33):
Insurers, because this is how they're going to be able
to you know, kind of they're they're they're trying to avoid.

Speaker 3 (12:46):
Certification.

Speaker 2 (12:47):
No, sorry, step back: they're trying to achieve it, and they're trying to get insurance coverage. And they're also trying to figure out a way that they can work with, or, you know, participate in, some of these wellness programs that are popping up, where your insurance coverage is modulated based on your own personal behavior, to kind of ensure, or regulate, or govern how you behave

Speaker 1 (13:07):
outside, so that you just become a healthier person. I think it's fucking sick that everything is becoming Metromile now. Everything is just like... and what this kind of sounds like, and I hate to go a little far, it does sound a little eugenics-y. It's just like: if we all optimize our health, insurance will be cheaper; if we have the right numbers, we're the right kind of person.

(13:28):
And there's only one way all that stuff goes. And they'll claim that it's an efficiency thing. But I've yet... all of this: I have a fucking Oura Ring on. I occasionally have an Apple Watch on. Sometimes it even reads my heart rate. I have all of this data and it's useless. And occasionally United Healthcare will be like, hey, do you want to connect your Apple Watch to us for some reason? Yeah, it's not useless.

Speaker 2 (13:50):
You're creating a profile that, if, you know, they get their hands on it... it's just rich.

Speaker 1 (13:55):
You want to connect that, you want to connect... let me see what you lift. You lift enough?

Speaker 3 (14:05):
So the real customer is the insurance company, not you. Yeah.

Speaker 2 (14:08):
The real customers are whoever's interested in the data, I mean. And that's really also the thing that you'll see with a lot...

Speaker 1 (14:14):
I saw a.

Speaker 2 (14:15):
Lot of healthcare ecosystems and apps and smart tech that
were purportedly about rewarding you for getting better sleep, rewarding
you for eating better, rewarding you for meeting certain goals.
But then when you think about what the reward actually
is or how it's being distributed, what it is is
that they are going to cut some sort of deal
with someone who actually wants that data on you. Yeah,

(14:37):
They'll use that to extract a little bit more profit out of you, and then they're also going to give you a little bit of that, the crumbs of that value that's being created.

Speaker 1 (14:47):
Right. Yeah, this is great. This is also... and there were lots of companies doing this? Yes, I would say...

Speaker 2 (14:52):
Almost every so-called health tech company I saw that was using artificial intelligence, or was using some sort of rewards system for empowering wellness, seems to have a model that is premised on regulating and governing your behavior, so that they can generate data about you that would generate profits off you and other people, that then they could

(15:14):
give you a small piece of, you know. Or, you know, another company might actually use the data to offer you discounts on other devices and other products that you could use. So this is: we give you an AI sleep mask and realize you're not sleeping properly because

(15:36):
of your pillow, so we give you thirty dollars off
the pillow.

Speaker 1 (15:39):
Honestly, at this point, my seventh pillow, I actually want the computer to try and work this one out for me. Don't you want a computer in your pillow? And I don't know what this AI sleep mask is, because I sleep in an insane way. I have, like, a skull cap on, and I have a giant eye mask. I must sleep in complete darkness. I also, like, I grab the covers, and I, like, curl into

(16:01):
a ball at the edge of the bed. My therapist says this is good. Yeah, anyway, an AI sleep mask? What the fuck would that do?

Speaker 2 (16:08):
Every time one of these things says AI, it's an app and an algorithm that collects the data, presents it to you on a colorful graph, right, maybe gleans or infers insights about other elements of how you're sleeping, eating.

Speaker 1 (16:24):
Did they explain any of how this would work?

Speaker 2 (16:27):
No. You know, for example, one place... for example, they said they can offer a way for you to reduce mental stress. And I asked how they measure mental stress, and they say, well, if you've got enough of our devices in the ecosystem, we can measure...

Speaker 1 (16:40):
We can measure physiological.

Speaker 2 (16:43):
things like your heartbeat, and how much you sweat, right, and we can say, oh, this is not in the usual range for you.

Speaker 1 (16:50):
What? That is true with me. You're being anxious? You know, that's... yeah. I also... I can see it, but also, a person can do this.

Speaker 5 (17:00):
The work it takes to make that happen is, like... I don't know, I don't know how this works.

Speaker 1 (17:03):
Yeah. Like, these rewards, what are they, other than discounts off of other members of the scam?

Speaker 2 (17:09):
Discounts and unnamed crypto payouts.

Speaker 1 (17:16):
This is so sick. This is... I would love us to dig into the history of these people. Yeah, I would love to learn what their previous jobs were.

Speaker 3 (17:24):
I'm hurting over here. You're not stressing enough that not a single one of these things is an FDA-certified product.

Speaker 1 (17:30):
FDA of Costa Rica? I am going to fucking lose it. Listen.

Speaker 2 (17:34):
I would like to know too, because I didn't think I was going to see crypto. I think we talked about it the other day, and you guys were like, it's not here. And then what's the first...

Speaker 1 (17:41):
thing I see? We're on the blockchain, get a reward.

Speaker 4 (17:45):
This is the thing, like, it's insane.

Speaker 1 (17:46):
It feels like the one place you wouldn't.

Speaker 2 (17:48):
Well, they were kind of saying that by being on the blockchain, your data is secure.

Speaker 1 (17:54):
That's not... it's entirely open. Yes, unless they do a private blockchain. Those do exist, at which point you can ask the question of, why don't I use any number of other databases that would be more efficient? Yes. Oh, it's immutable. Who cares? No one gives a fuck. Databases can be like that too.

Speaker 4 (18:12):
It's just a trendy word that they can use.

Speaker 1 (18:13):
It's not even trendy anymore.

Speaker 4 (18:15):
But, like, ten years ago, right? Now the people...

Speaker 1 (18:17):
still saying "new phone, who dis?" It's... learn a new meme and also do another thing. Why is the crypto... I love that that's still legal.

Speaker 3 (18:28):
I'm scared it's going to be super legal. It's...

Speaker 1 (18:30):
going to be super legal. And the FDA... super...

Speaker 4 (18:34):
It's, like, above legal. It's above...

Speaker 2 (18:36):
It's encouraged. It's sanctioned by the... sanctioned by the Department of Government Efficiency.

Speaker 3 (18:45):
I'm so scared to check Kaiser when I get home
to see if they're thinking about this.

Speaker 4 (18:50):
What? News alert: Kaiser private blockchain.

Speaker 1 (18:54):
That's a good question though. Did any of these companies
have any contracts with insurance companies? No?

Speaker 2 (18:59):
Actually, every single one that would talk about this either didn't have the product they were offering out in the market yet, or didn't have a single insurer, or the approval, to even work with insurance.

Speaker 1 (19:11):
I just feel like insurance would build these things themselves.
You would think that, and probably they will.

Speaker 2 (19:16):
Well, they just... I mean, not with the blockchain bullshit, but they are already building, like, you know, well-established data pipelines to get insights.

Speaker 6 (19:24):
I mean.

Speaker 2 (19:24):
One example of this is, like, cars and car insurance, right? Your car is a computer, right? And car insurers have a relatively easy time of using your driving behavior to modulate, you know, what they're going to charge you. On a basic level, of course, if you get in accidents and such, but, you know, getting to the point where it's like, you know, how are you braking, how fast are you actually going?

Speaker 4 (19:45):
You know.

Speaker 5 (19:46):
But another thing is that this this already exists for
the insurance companies.

Speaker 4 (19:49):
It might be. There is a thing, what's it... the MIB, it's the Medical Information Bureau.

Speaker 5 (19:53):
It's a housing for all of the medical data of everybody that is involved in any medical stuff in the United States, and it's all housed at the University of Michigan, I think. Okay, so who can access this? It's effectively supposed to be, like, metadata for actuaries. And so this is why actuarial data for insurance companies is

(20:16):
so precise and accurate, because they can see all of
this data and then use it to rate policies.

Speaker 1 (20:23):
So we created, like, a little surveillance state just so the health insurance could continue.

Speaker 5 (20:27):
And this has existed since like the seventies, like sixties
or seventies.

Speaker 4 (20:30):
I think, yeah, well, no, this is people.

Speaker 1 (20:32):
are pissed off about insurance companies.

Speaker 3 (20:34):
Yeah, and have been for decades.

Speaker 5 (20:36):
Yeah, so it's like this isn't new, it's just now
it's being monetized. I mean it was already being monetized,
but just in a different way.

Speaker 2 (20:43):
But a different way by a different group of people
who are trying it. It's unclear if they're trying to
replicate the thing or build their own infrastructure and products
around it.

Speaker 5 (20:54):
But yeah, yeah, because the whole point of the MIB, and why it's at an academic institution, is that it's supposed to be, you know, outside of industry, right? That's kind of

Speaker 3 (21:05):
the idea: available to everyone, right.

Speaker 5 (21:07):
And so, whereas this is now... this is now something you can put a patent on, this is something that you can... yeah, this is shady.

Speaker 1 (21:14):
Yeah, yeah, to say the least. To say the least. Was there anything other than this kind of shit in...

Speaker 2 (21:20):
the healthcare space? A lot of it was, like, you know what's called, like, you know, luxury surveillance, where it's just like: you're gonna pay us for the privilege of giving us a continuous stream of your behavioral data, so that we can figure out a way to give you more cool products, or, you

Speaker 1 (21:38):
Know, generate some money off of it.

Speaker 4 (21:39):
Right.

Speaker 1 (21:39):
It's amazing, It's beautiful. I love it.

Speaker 2 (21:41):
I think it's, you know, a genuine value chain.

Speaker 1 (21:48):
You had me for a second there; I was about to get concerned. Obviously... no, I'm actually godless. So what else did you see out on the health floor?

Speaker 2 (22:00):
Oh yeah, yeah, yeah. So, you know what else we saw... I saw a lot. There's an AI baby monitor. Okay, right. Yeah, is this for my AI baby? Or, no, it's actually, uh, for your real baby.

Speaker 1 (22:15):
Okay.

Speaker 2 (22:15):
So the pamphlet that I had, I don't know if I actually still have it anymore, but it opens up with saying: did you know how many kids die from sudden infant death syndrome?

Speaker 1 (22:23):
That could be one of your kids.

Speaker 2 (22:24):
The way to prevent that is to use our devices, so that you can monitor your child if they stop moving.

Speaker 1 (22:29):
And these have existed for so long. Like, Nanit is, like, over a decade old, and, like, the ones for, like, the actual... like, the feet, are very valuable. But that's also a solved problem. Yes, it is a solved problem, right, but they found a new way to solve it, because what they're trying to do...

Speaker 2 (22:47):
Here, it seems, is build an ecosystem.

Speaker 7 (22:51):
It's always an ecosystem. Ecosystem? We are in a savanna. The catalog of products that your baby can have. That product, right? Babies love product. Babies are consumers, go

Speaker 4 (23:07):
For it, you know, well they have eyeballs.

Speaker 2 (23:09):
I mean, so you know, stuff for their eyes, for
their nose, for the ears, for their breathing.

Speaker 3 (23:16):
Fully online, but these.

Speaker 2 (23:18):
products just being all separate products, replacements... just...

Speaker 1 (23:26):
Care.

Speaker 2 (23:26):
Yeah, there's stuff for the ears, AI-powered nail trimmers.

Speaker 1 (23:31):
Oh you found a slop company.

Speaker 3 (23:33):
Yeah, they have the spine that goes in the base of the skull, for the Matrix, that caliber.

Speaker 1 (23:38):
This is an important CES history thing. So, there are some fascinating companies. The most honest part of CES is the crust on the outside. You go to the very ends of CES and it's just, like, black-text-on-white-background Chinese companies, and it's just, like, Something Something Electronic Corporation, and they sell CCTV gear, battery packs,

(24:03):
miscellaneous plugs, and dildos, and it is just... you can find these every CES, and they are the most fun to talk to, and

Speaker 4 (24:10):
Give us shit.

Speaker 1 (24:11):
Yeah. But what Eddie is referring to as a slop company are the ones that are not honest enough to just be like, yeah, we sell everything, we're here to money launder. I guess... not saying the people literally at the fine companies that I've kind of named money launder, just for the legal purposes. But I feel like this one is, like, trying to pretend they're, like, doing good for the world versus being a wholesaler of baby safety products.

Speaker 3 (24:32):
Yeah, they opened with terror: your baby is going

Speaker 1 (24:35):
to... exactly. They sell them fear. And also, Owlet exists. What the fuck is the AI part? What's that?

Speaker 2 (24:42):
So that one was literally just monitoring if your child is moving, and if it doesn't move, alerting you. And this really has been... yeah, that's been sold for a long time, like you said, right? But what's the... well, that's a great question.

Speaker 1 (24:57):
What is the AI part of that? Well, I mean, there is actual, very, like, computer vision. Like, to be clear, that artificial intelligence has been around for a while and has actually been working.

Speaker 2 (25:07):
When they say AI, they're implying there is a novel thing that is artificial intelligence on top of this, and are obscuring...

Speaker 1 (25:14):
Versus on top of the actual AI.

Speaker 5 (25:16):
that's already been around for a while. Like, some sort of generative AI is what's going on.

Speaker 1 (25:20):
There's nothing new. They just want... they just hope that you'll think...

Speaker 3 (25:23):
It's generating a baby.

Speaker 1 (25:25):
Yeah, this manual, if you look at the back, it
says twenty seventeen on the.

Speaker 4 (25:29):
All of a sudden you have a new baby. Like,
it's just like.

Speaker 2 (25:31):
It's soothing your anxiety, so you can generate that... mm, great.

Speaker 1 (25:35):
You know, that would be a really great way for one of these people to sell something: just being like, you're anxious, you're not able to have sex. Yeah, anyway, you need this baby monitor, a surveillance state in your house. Or, of course, the wonderful AI cat litter thing. Yeah, yeah. Oh my god, five G connected. Fuck yeah,

(25:55):
that means there's an app. Isn't that five G's whole point, that it's like home broadband connection? But what are you streaming, other than cat piss? Yeah, this is the thing.

Speaker 2 (26:10):
Almost all of these devices are like: hey, so we will give you an app, and in the app you can see when your guy gets on and off, how much it shits or pisses, and if there's any differences in it, and it will give you immediate alerts, right? Because it's...

Speaker 1 (26:25):
Your cat has shit. Yeah, honestly, what do you... you're more anxious?

Speaker 4 (26:33):
Pops up like your cat has pooped.

Speaker 1 (26:35):
You're, like... on a fucking, like, on a date. Yeah, sorry, my cat's not shitting right.

Speaker 3 (26:40):
The weird thing is, my partner and I were looking at a new cat litter box, and of course they all do have that five G integration and the app associated. The answer why? It's a niche case: every time we have taken the cats to the vet, they've asked, do you have a stool sample? How can I tell the difference between the different turds? Yeah. Oh, because I know

(27:03):
a fresh one has just dropped; let me go get that from the medium.

Speaker 1 (27:07):
So this is... I'm describing this cat litter box.

Speaker 3 (27:10):
How the AI knows which is which.

Speaker 1 (27:12):
Apparently it has multi cat facial recognition.

Speaker 4 (27:17):
Yeah.

Speaker 1 (27:18):
A friend of the show, Hunter at the Washington Post, brought this to me yesterday. It was this... apparently it can tell the different cats apart, so that if you have one sickly, horrible cat and one beautiful, sweet cat, you know which one to take to the vet for the last time. No, right, you know?

Speaker 5 (27:32):
Or, like, if it commits a crime, yeah, yes, so that you know which cat is stealing

Speaker 1 (27:39):
all the fucking food outside. But my question as well is, this cat litter thing, which is eight hundred and ninety-nine dollars, by the way, which fucking rocks...

Speaker 3 (27:47):
I assume it has proprietary litter that goes into it
as well, because that's the consumable.

Speaker 1 (27:52):
I mean, why wouldn't you use it?

Speaker 3 (27:53):
It's all about...

Speaker 1 (27:54):
I need a subscription. I assume there's a subscription fee. Yeah, and I would not be surprised if there was. If there's not, I will absolutely fucking lose it. Like, why am I... if I can't pay an extra five dollars a month to catgenie dot biz?

Speaker 5 (28:07):
Yeah, well, then I don't trust them after that, because they haven't thought through this process clearly, and...

Speaker 2 (28:11):
For some time I was looking at an automatic feeder, because I had a schedule, and most of my roommates were out and friends weren't in town, where I wasn't able to get someone to watch the cat. I was like, okay, well, I can get someone to watch them sometimes; maybe I'll get an automatic feeder, automatic litter box. And then every single one I tried to get, there's a fucking app. And

(28:33):
then when you compare it to the ones that don't have the app, it's like a three hundred dollar addition to the price, for nothing other than, like we've been talking about, the cat shit alerts.

Speaker 1 (28:44):
I'm just imagining being, like, in court, and you've silenced your phone, and the judge is turning to me, he's about to sentence: you're not meant to have your Apple Watch in... and it's just, bing: cat shit. Just like... is it like Nest, where you get the alert and it shows you the video and you drag it down?

Speaker 3 (29:01):
It's just like a big... I don't want a video of cat shit...

Speaker 4 (29:03):
big "I'm sorry, your honor, but my cat just shit."

Speaker 1 (29:08):
Held up your hand. It's wild. Mister Tinkles, you know, which...

Speaker 2 (29:14):
And it's also so funny, because it's like... the price jump.

Speaker 1 (29:17):
Also, it's, like, a regular litter box. It's not special.

Speaker 2 (29:21):
Two hundred dollars, three hundred dollars... I mean, it doesn't cost that much, but it's just a little special.

Speaker 1 (29:25):
Yes. The thing is, it's not like it's a new kind of litter box, other than the stuff they've stapled to it. It's like, hey, you're a sophisticated buyer... this is not gonna work.

Speaker 3 (29:34):
You love your cats, right, and you love your.

Speaker 1 (29:36):
Cat Your cats must be fancy cats. Yeah, and I
have to be. And they need five g We're throwing
five dollars off fancy feast. Okay, Now, something just fucking
occurred to me about this. It's even weirder. They're talking
about five G. That's a cellular wireless standard.

Speaker 2 (29:52):
Yeah, it's Chinese, which means.

Speaker 1 (29:57):
The cat litter company very well... it's dirty tech.

Speaker 4 (30:02):
Sovereign citizens are going to come to your house and take.

Speaker 1 (30:04):
your... They're already very unhappy with me. But also, it's five G. Before the end of the show, can you try and get back there just to ask them if they mean five G WiFi? Because that is an egregious overstatement, because people are just like the guys here that don't know how anything works, going, it's got five G in it, which they think is like the

(30:25):
cell phone standard. Versus... this is the show though, this is CES, we're just describing it. Like, I know Casey Kagawa, friend of the show as well, was saying to me earlier: oh, Ed, the hardware stuff's really cool. It is: faster computers now, it's great. The AI stuff is... god. Great? Actually cool? Is that...

Speaker 4 (30:44):
Actually it's not a happy heartbeats.

Speaker 1 (30:51):
But you can go to a legacy media outlet and hear about all the computer stuff that actually matters, because that's not really the story of CES. That's the stuff that happens kind of outside, and what you have in there is people making a series of... a series of promises that have gone through a law firm to make sure they're not going to get them in actual trouble, hopefully.

(31:15):
Did you see any Wi Fi grills though?

Speaker 8 (31:18):
No?

Speaker 2 (31:19):
Oh buddy, I saw a TV with a chatbot. Why?

Speaker 3 (31:26):
To help you choose channels?

Speaker 1 (31:27):
Yeah, you know, I saw.

Speaker 2 (31:29):
I saw sex toys that used video games for a long-distance connection, with group syncing for multiple orgasms.

Speaker 1 (31:40):
They did not promise that at the booths that I went to. They... so I...

Speaker 4 (31:46):
I was... I'll never forget this.

Speaker 5 (31:48):
I was at the Indiegogo party the first CES, and I was talking to these folks, and they were telling me about their technology. The technology was about trying to have some sort of tactile response, uh, in a virtual environment.

Speaker 3 (32:02):
Keep in mind, talking to a priest in full whites.

Speaker 4 (32:05):
Yeah, exactly right, that's right.

Speaker 5 (32:08):
And so these were Chinese folks, right, a Chinese company, and we're talking about it.

Speaker 4 (32:12):
I was like, man, that's really amazing.

Speaker 5 (32:13):
Technology like that would be great for, like, the medical industry. Like, that would really revolutionize a lot of stuff for surgeries, and... yeah, yeah, no, but we're going into porn, because that's where all the money is.

Speaker 1 (32:25):
Just... yeah, ten years later, same, you know. But it's not taken off, the thing. Yeah, though, I mean, it's not scaling.

Speaker 2 (32:32):
There are toys that allow you to have group play, or to see them.

Speaker 1 (32:36):
That's existed for a bit.

Speaker 2 (32:37):
The thing I was trying to figure out: the AI element, the metaverse... yeah, the extended reality part of... wow... of the super-duper, you know, turbocharged fleshlights, yes, the suction dildos, of the, uh, prostate massagers lodged...

Speaker 1 (33:00):
You said, all for the you're going to Caribbean.

Speaker 4 (33:02):
Yeah, this is where we need Father Bellasar, like the

Speaker 1 (33:06):
other member of the clergy that, for some reason, shows up here. But, so, other than dildos and other than health things: any other weird shit, anything loathsome, pig?

Speaker 2 (33:30):
Like... anything like... oh, there was a... there's an app, it's called an AI Public Transportation Bell, and it rings a bell when you're about to get to your stop on the bus or train.

Speaker 1 (33:51):
Freaking... There was no one at that booth that I... I was like, this is what you're talking to me about?

Speaker 2 (33:56):
The alleged section of the conference where it's just this
is a front.

Speaker 1 (34:02):
Yeah, man, sorry, we're just... we're using this for that. The Serbian gentleman behind you, the one that smells of cigarettes. That man, that's who we're talking to. That's our one meeting, the thing that we're here four days to kind of just cover for. He will kill you.

Speaker 4 (34:19):
Cigarettes and slivovitz.

Speaker 1 (34:21):
It's so strange that we have a show full of this, because you kind of see articles about CES and it's like, fastest computer. There's the three-thousand-dollar AI machine that I should have some of the writers coming in to talk about tomorrow. Hell yeah. But Nvidia also, who knows what they're doing at some point.

Speaker 3 (34:40):
And the eternal Hall of TVs all.

Speaker 1 (34:43):
of televisions, which are fine. I don't... we love the big screens. But really, it's just so weird how much of this stuff is just, like, fake-fake, or, like, real but with a customer base of, like, one hundred people, like the cat thing. How much do you think that cat thing costs to make? Because I bet it's accidentally really expensive.

Speaker 2 (35:03):
I mean, it depends, because... is it also one of those ones with the motor that rotates it, that automatically sifts? Yeah. Then those things break. Yeah, they break easily. We got one... because they're not... they are easy to install, but I feel like they're not... they don't tell you how to install it, and so you're probably gonna

Speaker 1 (35:22):
break it in installation. The app is probably not gonna work half the time, you know. So, I don't know.

Speaker 2 (35:28):
They're probably accidentally expensive because of how much they wear
down and how much they end up getting returned versus like,
how do.

Speaker 1 (35:35):
you really, like, load test those things? You get a really big...

Speaker 4 (35:43):
Like, you'd need a Garfield.

Speaker 1 (35:54):
Do you think this is how they get the training data, to find out about the turds? Yeah, I mean... listen: chosen by God, perfected by science, to be fair. This is how I describe my cat Babu. Yeah, you know, and also myself. I do use the litter box. But he just won't use these fancy ones. We've tried fancy ones and he just will not go. Only one of

(36:15):
my cats uses the fancy one. I really... that's the most cat thing to do as well; our gods can throw it away.

Speaker 3 (36:23):
Yeah, no, our cats will only use the robot litter box for pooping. We have two other litter boxes; those are the piss boxes. Only the robot gets turds.

Speaker 1 (36:35):
Which one do you use? Ah?

Speaker 3 (36:37):
I am a discerning individual. I prefer to go use an actual toilet.

Speaker 1 (36:44):
Okay, so you're good for this show. Now, you did see a speech from the incredible Linda Yaccarino, CEO of X, the everything app, where I currently do my banking, my taxes, my dating, and my professional work. I have

(37:06):
a DM, "I need to see you too," a minute ago, from a fourteen-words groper. That's my doctor.

Speaker 3 (37:11):
And the medical care on X as well.

Speaker 1 (37:14):
My credit card was just charged eleven thousand dollars from it. Everything's going great there. But she is the CEO of this company, and you said it was kind of bad. It just wasn't that...

Speaker 2 (37:23):
Yeah, I didn't really know... it could have been an email, could have been a press release. Yeah. She talked a lot about the global collective unconscious. Unconscious.

Speaker 1 (37:32):
Oh god, what the fuck does that mean? Uh, that's X the everything app saying... You know, did she say unconscious?

Speaker 4 (37:39):
Yeah, the collective.

Speaker 1 (37:40):
I believe. We believe. I have it in my notes. But the thing is, if she said conscious, it's still stupid. But if she said unconscious, I would believe that, because that's, like, a Yaccarino classic. Yeah, just, like, a complete... because the things she says sound like the kind of thing you say immediately after walking out of a car wreck.

Speaker 3 (37:57):
Let's go to the notes, then. Yes.

Speaker 1 (37:59):
But it's us.

Speaker 5 (38:00):
But it's also, like... what, an undergrad, like, philosophy student?

Speaker 2 (38:04):
Right. She talks about how it's self-regulating. But, you know, the first question was about comparing Meta to X and talking about how it got rid of fact-checkers, and she was like, they're trying to copy Community Notes, basically, you know, they're realizing...

Speaker 5 (38:23):
All websites suck shit... copying... Like, buddy, I'm on Community Notes, so I see all this stuff, and it is a shit show.

Speaker 1 (38:31):
It's a shit show. I mean, if you've seen the fucking facts that people post on Facebook... yeah, you really want that community, anyway?

Speaker 4 (38:38):
Yes, and now that.

Speaker 1 (38:39):
Great idea has been copied, and so.

Speaker 2 (38:41):
The idea there is, like, you know, she's trying to, you know, talk about how Meta is going to emulate X's innovative system that inspires great behavior, which is that noted posts are dramatically shared less.

Speaker 1 (38:55):
Like suicide bomb.

Speaker 4 (38:57):
Yeah, I want to die like anything.

Speaker 2 (39:00):
The Cybertruck guy got his inspiration there, and ChatGPT, apparently, which helped him

Speaker 1 (39:05):
Build they explosive. The police department recently reported.

Speaker 4 (39:10):
That didn't quite work.

Speaker 1 (39:11):
Yes, and that's ChatGPT.

Speaker 3 (39:14):
Yeah, as I described it, it's effectively just The Anarchist Cookbook, and you're going to get similarly suspect results.

Speaker 2 (39:20):
Yeah, you know... I'll try. And so, so, you know, her rating of the Meta abandoning

Speaker 1 (39:27):
third-party fact-checking team: great victory for free speech.

Speaker 4 (39:30):
Cool.

Speaker 2 (39:31):
She talked about the greatest product innovation... talked about how there have been, like, two hundred and fifty product innovations during her tenure. Sort of the... what? Yeah. But the biggest one is Trend Genius, which is: when a trend kicks up on X and it hits a certain altitude, advertisers they have agreements with are

(39:51):
able to take advantage of those trends, and they get matched with them. As in, your ad campaign kicks into high drive when there's a trend connected to what you're looking to target; it goes into high drive.

Speaker 1 (40:03):
Almost some kind of, like, word you could advertise against, like some AdWords, right?

Speaker 5 (40:08):
Right, you know all of the ads I get, Yeah,
they're all everything is Temu.

Speaker 1 (40:14):
I get Temu, indie sci-fi...

Speaker 2 (40:19):
books about how Donald Trump is going to save America, and crypto... that's happening right now.

Speaker 1 (40:24):
Yeah, yeah, yeah, I.

Speaker 4 (40:27):
I only get Temu. What are you, like, looking

Speaker 1 (40:29):
at? I blocked so many ads that at one point I just started getting the weirdest shit. I got, like, the most expensive coat ever... and it was, like, a seventy-dollar coat. I actually think it's a great bet, but it was the worst-looking outfit. It looked like something that Baz Luhrmann would have in his Romeo and... do you, do

(40:54):
you buy your... oh my gosh.

Speaker 3 (40:59):
So what you're telling me is, they've created an automated system to guarantee that the advertisers won't miss an opportunity to be next to an incredibly popular racist post, right? And all these brands don't want to miss it.

Speaker 1 (41:11):
All these trends are the ones that Grok comes up with.

Speaker 2 (41:14):
Oh lord. Yeah, Grok, in the... in the center of the racism computer.

Speaker 4 (41:19):
They changed the image recently.

Speaker 1 (41:20):
Yeah, you know, the logo for Grok is really bad.

Speaker 4 (41:23):
It's super... it looks like, like, a JPEG.

Speaker 2 (41:26):
Artifacts so bad. One of the examples I really also loved: she was kind of, like, you know, talking about how X is the everything place, it's real time, so many trends. But, you know: sometimes me and my husband, we fight all the time.

Speaker 1 (41:39):
We fight all the time.

Speaker 8 (41:41):
Oh really? Imagine... my husband, my piece of shit... and so we fight all the time about what to watch, and we're fighting over the remote.

Speaker 2 (41:56):
He thinks football is consequential. I want to know what's happening on the red carpet. We don't have to fight anymore, because now I have X on my phone and I get real-time updates, and he gets the TV to watch football.

Speaker 1 (42:07):
Was she talking about the Golden Globes? What football game is on during the Golden Globes?

Speaker 4 (42:11):
That happened? Now, this is also garbage.

Speaker 5 (42:13):
I was trying to check the news the other day on Twitter, X, whatever they call it nowadays. Okay, I guess. And I was just like, this is a really bad place to get news.

Speaker 1 (42:23):
Yeah, okay. Was she really... was he really that attached to Monday late night football? Maybe he does hate her. He might have been...

Speaker 2 (42:34):
He's just like... So, you know, that's, like, the first ten minutes or so, you know, of her talking. She talked about safety, ad products, and how there's no surrogate for X, this is the most important place. Talks about the NFL portal, which, you know, is an example of how "we are expanding the town square and building a model that we will bring to other areas of society."

(42:55):
They want to make it a global sports league portal, and I don't even know what that means. Nobody does. It's provocative, it gets the people going. You know, thirty-six billion impressions, four billion views. That's for what, for the NFL portal?

Speaker 1 (43:13):
That's thirty-six billion... the impressions... impressions. That's not even... I mean, that doesn't feel like as much as it should. It feels like they'd have higher numbers than that in the past. Also, the views numbers are completely cracked on that.

Speaker 2 (43:30):
Well, she's saying those are the best numbers, and they're gonna get even better.

Speaker 1 (43:34):
They're gonna go to global sports leagues.

Speaker 2 (43:36):
Uh, the interviewer tried to be like, do you want to do a news or journalism portal? She was very, you know... she's like, look, look, we're not, we're not... She didn't say "we're not interested in that," but she basically was like, legacy media has

Speaker 1 (43:46):
their own thing. We're past being trusted in, yeah, telling the truth or providing information.

Speaker 2 (43:52):
We want to distinguish ourselves from legacy media because.

Speaker 1 (43:57):
That legacy media, it's one-way, filtered.

Speaker 2 (43:59):
That's a psyop to control your mind and influence, as opposed to Twitter, which is the global collective unconscious, right, and free of influence campaigns.

Speaker 1 (44:11):
Right, just.

Speaker 2 (44:15):
The influence campaigns, right, that she just mentioned... that her boss is doing.

Speaker 1 (44:21):
Yeah, and also... wasn't there a Paris Hilton one? Paris, the Saudis... I mean, we could list all the influence campaigns that...

Speaker 4 (44:29):
People still care about Paris? I know.

Speaker 1 (44:34):
That's what's so good about it, you know, when you're
like behind crypto on a trend.

Speaker 5 (44:38):
Yes, yes. Like, I think the last time anyone cared about her was when she was, like, eating hamburgers for Carl's Junior.

Speaker 4 (44:44):
Like that, I'm not familiar with. That's pretty bad. I haven't.

Speaker 1 (44:49):
I haven't been following.

Speaker 4 (44:52):
The J.J. Abrams movie. Fair enough.

Speaker 2 (44:56):
She also talked a bit about the power of X, and she got really excited about the fact that one time, she was like, I was scrolling through my feed... leaned in, very excited and animated, about Musk and Sundar talking about computing, the brain, the brain. Where else can you get two executives you don't really understand?

Speaker 1 (45:19):
And it's definitely them. It's... what? No, literally every executive. You know... here's the thing.

Speaker 5 (45:27):
I refuse... I still, to this day, refuse to use the term X. Like, I will deadname it.

Speaker 1 (45:32):
I'm only using it here. It's just... right, it's terrible. It feels like petting a cat's hair the wrong way. Yeah. Oh, I'm on X? What the fuck are you talking about, man? Everything... it's everything, the everything app that has, right, a lack of utility, with, almost every day, the racism generator...

(45:55):
terrible Grok, somehow the least racist racism generator.

Speaker 4 (46:00):
Like, look, I didn't, I didn't.

Speaker 5 (46:01):
I didn't join the full exodus when, you know, when things... so I stayed there, even though I joined all the other stuff.

Speaker 1 (46:07):
But it's just.

Speaker 5 (46:08):
Like like the level of narcissism is just.

Speaker 1 (46:13):
I'm going to open it up as we are talking; I have not yet today.

Speaker 4 (46:19):
The world a world in an event like it's.

Speaker 1 (46:22):
Gonna... yeah, I wonder what's, uh, what's on X, the everything app?

Speaker 4 (46:25):
Shall we start the count for you? Oh, we did.

Speaker 1 (46:27):
There's the thing about Sam Altman's sister. Very, very bad for Mister Altman.

Speaker 2 (46:32):
Oh, I got some betting ads. Oh, that's a BetMGM, man. Do you guys want to get up to two hundred and fifty dollars back in bonus bets if your first wager loses?

Speaker 4 (46:40):
I don't want that to do anything. That's why...

Speaker 1 (46:45):
Okay, so here is... it's a quote post Elon Musk is posting, where someone said Bill Gates has donated... this is just a conspiracy theory thing, and Elon Musk has quoted it, and he has used what appears to be an emoji of a pregnant man. What? So this is the richest man alive.

(47:10):
He could do anything. And he's like, what if... what if, hear... what if Bill Gates was a pregnant man? Wouldn't that be funny? Great grinds, grab scramps. It's just... being in the house for years, just talking to another person. Yeah, this... I love you, don't you?

Speaker 5 (47:30):
I mean, I opened up the app, and the first thing I get is a nice little post from a friend of mine who I know, and then...

Speaker 1 (47:38):
It's literally a post from Elon Musk: "If protecting children makes one a fascist, then so be it." I think that... wow, that's, you know, a perfect segue.

Speaker 2 (47:48):
Also, because the next thing that she talked about was the grooming gangs that he went on a week-long tirade about, and they were like, is this a red line? Because now you've got diplomats saying, we're going to change the relationship with you. And she was saying, if not for X, where would the conversation be about saving thousands of girls, bringing people to justice who must be brought to justice. So I say, more

(48:11):
than...

Speaker 1 (48:11):
Did that happen as a result of anything that happened on X?

Speaker 4 (48:14):
No?

Speaker 2 (48:15):
Great. That's... she said there is going to be an inquiry.

Speaker 1 (48:21):
Wow. And, well, there's going to be an investigation. I have now retracted my statement. I feel like, as we approach the end here, I want to really talk about something that I read about today that I will be doing an episode on. You know, you know that I can't help myself doing content, but we'll talk about it today. So, Mark Zuckerberg of Meta has now

(48:42):
basically, he's fired all the fact-checkers, and he's moving the trust and safety people to Texas from California. Some people are saying, oh, it's because of some sort of... they're like, oh, it's because of labor laws. Sure. But what that's actually about is the judges, from what I understand. But anyway, if you have not read the news: they have fired all the fact-checkers, and they basically made a statement that was, oh, yeah,

(49:03):
we've done too much of that. Now, if anyone listening to this has opened up Facebook in the last year, they probably did not think, this could use less work, and they definitely didn't think, I am getting the good stuff here. But on top of that, this feels to me like the beginning of the end for Meta. A

(49:25):
lot of people are, understandably, and should be, scared. This is a fucking horrible thing to do. I think it's the end of Meta. I think we're actually coming toward it. One can hope. I hope so too. But I think that companies only do this when they're very upset.

Speaker 2 (49:39):
I mean, there's also, like, these moves of trying to placate Trump. I mean, there are a few questions, right? Because other firms are doing them. Are they gonna make the antitrust trials go away?

Speaker 1 (49:51):
I don't think this.

Speaker 2 (49:53):
I don't think it's related, but I'm interested in that also as a knock-on effect, right?

Speaker 1 (49:58):
I think what people need to realize is, sure, there's some kissing up to Trump. Sure. But Meta has been helping conservatives for decades now. Yeah, you know, and the calculation on the, you know, Plandemic... the thing that I talked about in the People Killing Facebook episode. It was Joel Kaplan. It's not Jeff Kaplan, that's the Blizzard guy. Sorry, Jeff...

(50:21):
Jeff Kaplan from, uh, Blizzard. By the way, look up what the name of his character was in EverQuest. It was Tigole Bitties. That's the guy who was at... Activision. I don't want to slander him. Well, also, that is actually a truthful statement. Activision Blizzard, a lot of weird lawsuits around, anyway. Now back to Joel Kaplan, who is the former head of public policy, who is

(50:42):
now, like, their chief policy officer, replaced Nick Clegg. Scam.
But Joel Kaplan is a former G.W. Bush guy who specifically intervened to make sure that the COVID conspiracy theory movie Plandemic was not removed, by the Health Department, which I now assume is just shut down. Yes. Just to be clear: when Meta was restrained, when they were restrained

(51:06):
by the boundaries, the vagaries of such ugly things as
morals and policies, they still fucking pushed this shit. They're
gonna do anything now and it's gonna turn it in
to X but worse because Meta has already fucked Facebook
for hear and back. The question is what it does
to Instagram. I think this kills Facebook. I know it's

(51:27):
gonna make Facebook even worse for a while, but it's
gonna kill it. The content already fucking sucks.

Speaker 5 (51:33):
Well, we already call it Boomerbook. So, you know,
I work with students, right, so I'm always in the midst

Speaker 4 (51:39):
Of a bunch of young twenty year olds.

Speaker 5 (51:41):
Same, and it's just like, yeah, it's Boomerbook, and
no one goes there because it's the most toxic place
that you can go.

Speaker 4 (51:47):
Like, it's worse than Twitter.

Speaker 1 (51:48):
Oh, you need to look for the toxicity. It's usually
just confusing to me. But just to be clear, the
current state of Facebook that everyone has been used to,
that was the one with fact checkers. Yes, now it's
gonna be actually insane. But I think this may be
the poison in the veins. And I know a lot of people
understandably are very scared of this because of the right

(52:09):
wing side and the fact that this is aiding the
right wing. The only thing I can say to calm
you down is they've already been doing that in great volumes.
They already did this, and that was with the fact
checkers, anyway. And it's frustrating, because I wish that
was being reported alongside this part of the story. They're like, oh,

(52:29):
now Facebook's going to help all the right wing people.
They took down CrowdTangle, which was a data reporting
service that reported what the most popular pages were, sorry,
most popular posts, I think it was, basically. And Kevin Roose,
was he at the Times? He does good work.
Yeah, he reported this, and then they responded

(52:49):
by taking down CrowdTangle. Now, the reason they did
that was because all of the top, like, eight out
of ten were always conservatives. Was it about him trying
to analyze the political track of things? No, no, no,
this was just what was most popular on Facebook. And
it was like Ben Shapiro's show or that shit. That's

(53:12):
what was happening when they gave a shit. They
don't give a shit now.

Speaker 2 (53:18):
I mean, honestly, it's like, if I were Facebook, it's
in your interest to, at minimum, allow for
or try to ensure that, you know, right wing
stuff is pretty popular, you know, because they'll give you
less trouble regulatorily. And you know, if I'm actually
putting my foot on the pedal, you know, if I'm

(53:39):
actually putting my hand on the tiller, or trying to, you know,
shift the balance, you know, to rev it up, right. So it
makes sense, right, it's a payoff, you know, it's a payoff
for them.

Speaker 1 (53:49):
And I think you know that.

Speaker 2 (53:51):
I think, to come back to your earlier point, I
think that's a fair point to make, which is
that, like, yeah, you know, kissing up to Trump

Speaker 1 (53:56):
Misses it right.

Speaker 2 (53:57):
Yeah, as you're saying, there's a deeper
symbiosis or a deeper interest there.

Speaker 1 (54:04):
That's what people overlooked. Dave Lee of Bloomberg did
a great opinion piece on that, link in the notes.
He did the thing where he just says it gave
Zuckerberg permission to give up on the thing that he
didn't give a shit about. Yeah, somewhat paraphrasing his words there, obviously,
because I don't think Bloomberg would print that. But it's
a great piece because it basically says, look, Zuckerberg has

(54:25):
never really liked taking care of his garden. He's never
really taken care of his shit. So now he can
be like, oh, yeah, it's a political thing, this is
why I'm doing it. Which, sure, it doesn't hurt, it's
cover. It's cover for him, just like, instead of flushing
the toilet, knocking it into the kitchen.
He knows Facebook is effectively a monopoly on social media,

(54:45):
so he can just fucking turn it into a rancid
cesspool and think that people will stay. And I
was talking to Ed about this last night,
if it was connected with the AI profiles thing a lot of
people are understandably freaked out about. But I think it
comes down to one word: contempt. It's just, they don't
fucking care. It's not that these people have a
big strategy. I want to say evil, but

(55:07):
I don't even think they're that motivated. It's just
growth-at-all-costs thinking.

Speaker 3 (55:11):
One of the benefits, for them, is they
don't have a monopoly, but they do have a walled garden,
so they can do whatever they want inside of their garden.
And if you want to bury toxic waste in your garden,
well, there are regulations about that, but who's

Speaker 1 (55:29):
Going to stop them from doing it.

Speaker 4 (55:30):
Yeah, exactly.

Speaker 5 (55:31):
I mean, Facebook started as a
place for a dork to get a date, you know,
and so... I'm sorry.

Speaker 4 (55:37):
You don't like that.

Speaker 1 (55:39):
I feel like that point misses the problem again, because yes,
technically that was the whole horny Harvard guy thing.
The thing is, that was an evil thing that a
college student did. Where it became an evil thing that
an adult started doing was the idea of Facebook, the
idea of growing Facebook, and what growing Facebook meant.
If we start with that,

(56:02):
if we reduce him to this horny incel kid,
we miss the thing that this guy is. Zuckerberg
is something so much worse than that. He is someone
who truly does not care. I'm not gonna say sociopath,
because I actually don't know the exact definition. I
think just describing it in blunt terms is: does someone
who gives a shit about anything do this? Yeah? Because

(56:23):
it doesn't even feel like a political move. He's just like, great,
I don't have to. It feels like a regulation being removed,
except there really wasn't a regulation.

Speaker 3 (56:32):
Yeah, there wasn't one.

Speaker 1 (56:34):
It's really good looking out of the window at
the tech industry right now. I feel very positive.

Speaker 2 (56:40):
That's just how it felt to be drinking the water
downstream of the chip fabs.

Speaker 1 (56:44):
Yeah. Great, it's fantastic, except...

Speaker 3 (56:51):
Thanks for getting.

Speaker 5 (56:54):
But no, there's a cynicism, right, there's a cynicism
about the whole thing. And so it's like, you know,
people just out there in the world sort of use
these things to stay connected to their families.
Like, for instance, the only reason why I
still have a Facebook account is because, like,
my mom takes photos, right, and I see them.

Speaker 1 (57:13):
And I use Instagram because there are some people who
don't use anything else.

Speaker 4 (57:16):
Yeah, exactly the same thing.

Speaker 5 (57:18):
And so I think, for so many people,
that's sort of their general use case, so with
these more problematic aspects, they have a hard time
seeing what the problem is. But the problem really is
just this cynicism, the lack of care. I think you

(57:39):
hit it right on the head.

Speaker 1 (57:40):
You ain't seen nothing yet, I'm deadly serious. Like, Facebook...
so, shout out to Jeff Horwitz of the Wall Street Journal,
who taught me a lot of what I know
about Facebook. I have yet to hear from Jeff if
he was at CES. I'd be so happy. There's no
one I'd want more on the chat, just going off.
And the thing is, they've always been this bad.

(58:03):
And as we write history right now, we're like, well,
this is the day that Meta became evil.
And it's like, always. But if you
fail to actually catch it in time, imagine what they'll
do next.

Speaker 2 (58:19):
Yeah, yeah, I mean, hey, you know, where
do you go from, you know, facilitating a genocide, you

Speaker 4 (58:28):
Know, where do you go? You know?

Speaker 1 (58:30):
All things are justified by growth. A terrorist attack
planned on Facebook was justified by growth. I'm quoting Andrew
Bosworth in 'The Ugly' memo. He is now the CTO
of Meta. Very cool, very good stuff. So as we
come towards the end of this part of the episode,
we're shortly going to be joined by Gare Davis and

(58:52):
Robert Evans, of course, as a wrap

Speaker 3 (58:54):
Up, especially because I'm almost out of cocktail.

Speaker 5 (58:56):
He's almost out of cocktail, and I'm thinking about the Facebook
leaks at the moment.

Speaker 1 (58:59):
But yes, well, I think, as we come to the
end of this day: did you like anything? Was
there a single thing that you saw? And there is
no wrong answer here. If you saw something you liked,
what was it? And don't say the exit. The sex
toys seemed neat. Cool, so that's just fun.

Speaker 4 (59:26):
I think at every CES, this is what I've heard,
and that's it. Horrified by that.

Speaker 1 (59:30):
Woke up, just cracking up, all right.

Speaker 2 (59:33):
You know, I saw some screens that looked crazy.
But you know, I don't game like that, I don't
goon like that. Like, I just, I couldn't, you know,
those aren't really for me. Every episode, you know, I'm

(59:54):
I'm gonna watch.

Speaker 1 (59:55):
I don't know.

Speaker 2 (59:56):
I feel like maybe I would have liked the Sphere,
you know, whatever it's doing.

Speaker 1 (01:00:00):
No, Lenny Kravitz is there tonight. Oh, I forgot to
mention an important part of things. So I found out
what a competing podcast called The Verge, heard of them,
haven't either, are currently doing. The Ridge. The Vairzhe,
doing the French one, like a little German who's
had a hairball.

Speaker 4 (01:00:21):
Somewhere right, and here we go.

Speaker 1 (01:00:23):
I try not to be a catty bitch, other than
every one of these shows. And I must say that I
definitely saw this, and there was a moment I
was like, oh man, another podcast recording out of here,
a big one too. I should go and check out what
they're doing, the Verge podcast. And by the way,

(01:00:44):
you can only see the Vergecast live if you
have a Delta SkyMiles membership, and it features a
pre-podcast panel discussion about the future of travel between
the Verge and Delta's Dwight James, SVP of customer engagement
and loyalty and CEO of Delta Vacations. I probably could
not have fucked up the words there more if I wanted
to make fun of it. But I just want to say I

(01:01:05):
get a lot of shit about my ads for this show.
I didn't do a single ad break as well, so
you're gonna be really pissed with me. I will never
interview someone from Delta Airlines unless it's the CEO and
I'm screaming at them. And also, what the fuck
does that have to do with tech? But also, I
understand we've got to get money, honey, but like, Jesus Christ,
that's your big thing?

Speaker 2 (01:01:25):
Future of travel unless it's no more Boeing planes, cigarettes
on the plane and bigger seats, I don't want to
hear it.

Speaker 4 (01:01:33):
But also.

Speaker 1 (01:01:36):
Because what is it going to be?

Speaker 2 (01:01:37):
We recalibrated the formula for SkyMiles.

Speaker 1 (01:01:40):
Yeah, we changed the app. We've made it
harder to get SkyMiles status.

Speaker 3 (01:01:46):
Smart planes smart.

Speaker 4 (01:01:50):
All the planes are on the blockchain.

Speaker 1 (01:01:52):
Anyway, I think you have had a much better experience
with a health physicist and a priest, so I think
that that's my choice: more members of the clergy
and more safety professionals. I think that's what makes it.

Speaker 3 (01:02:05):
Generally more safety professionals is not what anyone wants to see,
because we apparently ruin fun.

Speaker 5 (01:02:13):
Sometimes, apparently. So, as we close out this
part of the episode, we will of course be back
shortly after this.

Speaker 1 (01:02:20):
Phil, where can people find you?

Speaker 3 (01:02:21):
You can find me on Blue Sky at Funranium,
and you can find my blog at Funranium Labs dot com.

Speaker 1 (01:02:28):
Yes, and he makes a special kind of coffee. And
if you look on Blue Sky, because I cannot advertise
things on here, I will have a link for you
at some point that will do something. I'm not even
going to say what it is. Ed Ongweso Junior,
now that I cut you off at the beginning of
that saying what you did: what do you do?

Speaker 2 (01:02:44):
I'm a tech writer, a finance writer, a labor writer,
an editor, a shitposter, a son.

Speaker 1 (01:02:56):
As the son of a mother.

Speaker 2 (01:02:57):
Yeah, I'm a son of a mother and the father
of cats that are baby-monitored.

Speaker 1 (01:03:02):
Yeah, yeah, you have yet to give a link. That's true.

Speaker 2 (01:03:07):
I thought you were going to say a link to my cats,
and I was like, they're on my profiles. Big
Black Jacobin on Blue Sky and, uh, Twitter.

Speaker 1 (01:03:16):
Thanks, to everything, that too. And This Machine Kills is
the podcast where we yap about the political economy of
tech, and, uh, The Tech Bubble is my newsletter where I
yap about tech. Also, where can they find that?
Techbubble dot substack dot com. And on to that,
Father Gabriel Mosher.

Speaker 4 (01:03:36):
Yeah, Mosher.

Speaker 5 (01:03:37):
I mean, these days, I'm just on all the
social medias with the same... okay, they do not fucking...

Speaker 1 (01:03:44):
Mm. Every... none of these people know their links. Your
name is like 'luky' something, like if they type in
your name, it doesn't come up sometimes.

Speaker 4 (01:03:53):
Oh uk e I four six y five.

Speaker 5 (01:03:54):
That's on everything except for Blue Sky. On Blue Sky
it's Eighth Way.

Speaker 1 (01:03:59):
Well thank god you specified that before I said anything.

Speaker 4 (01:04:02):
Yeah, no, or you can just put in like if
you just search Gabriel.

Speaker 1 (01:04:06):
Mosher, like, priest into Google.

Speaker 4 (01:04:08):
Then I'm there.

Speaker 1 (01:04:09):
Well, we'll be right back after these amazing products.
You're gonna hear about these products. They're so beautiful. I
personally love them all. They're definitely not going to immediately
embarrass me in the way that, every week, I get
one email from one guy who's like, this does not
match up with the things you were saying.

Speaker 3 (01:04:29):
We're here to embarrass you.

Speaker 1 (01:04:33):
Did you know that there is a commercial, a commercial
that came on, that was not in line with your beliefs?
What do you think? No? And every time
I respond with the same thing, which is, I bet
you'd love it if I died, and then I hit send. Anyway,
the next thing that comes up, maybe buy it. And

(01:05:06):
we're back, and we've replaced some of the people. We've
got Gare Davis and of course Robert Evans from It
Could Happen Here, and they are just glowing from their
time on the floor.

Speaker 9 (01:05:16):
Yeah, I'm happy to be here. I'm kind of concerned
that the former panelists you took out and forced into
a gulag outside of Novgorod. Like, that seems a
little extreme.

Speaker 1 (01:05:25):
But I don't want to question it. That is your show...
business. This is your show. That's show business. I'm glad
you made the cut, you know. Yeah, we have to
have two Eds, at least one of them chosen.

Speaker 4 (01:05:38):
Always to.

Speaker 1 (01:05:41):
Want to crave the mind... oh god, I'm going to
be thinking about that all evening now. That's great. So
we were previously talking about the Linda Yaccarino speech. Oh
my god, and I did not know that Gare was
also at that. What was your experience of the, uh,

(01:06:02):
the matriarch of social media?

Speaker 6 (01:06:04):
The first time there was applause was when she
mentioned DOGE. The second time was the rape gangs. So,
incredible vibes happening.

Speaker 9 (01:06:14):
Audiences love DOGE and rape gangs.

Speaker 1 (01:06:16):
You know. But what, so, what specifically was this rape
gang thing? I should know about things that happen in England,
but I've been running from England my whole life.

Speaker 9 (01:06:24):
Well, there was, like, kind of a bit of
a pogrom over earlier last year, over
the idea that, like, migrants were running rape gangs, and
so there were, like, right wing riots in several
cities, including, like...

Speaker 1 (01:06:39):
Oh, sorry, this is the thing that actually fucking caused that.

Speaker 9 (01:06:43):
Well, no, no, this didn't cause that, but it's
an extension. This was trying to keep it going.

Speaker 1 (01:06:49):
Yeah, oh yeah, sorry, I meant it that way. That right
wing shitstorm was what caused the riots. Okay, great,
because I knew the riots were happening, and then we
all got very upset with the amount of racism in
England that it newly created, considering it's England and it
has quite a lot already. Yeah, oh god, anyway. So
that was what got applause.

Speaker 6 (01:07:10):
Yeah, talking about that, and how important it
is on X to protect children, and, as a mother,
it is especially important. So we
need more posting to save the children.

Speaker 1 (01:07:25):
I mean, that I agree with, just the posting part.
All things are possible through posting.

Speaker 9 (01:07:29):
I mean, one of the last times I was on
X, the everything app, I saw deepfake porn
of an eighteen-year-old. So yeah, I really do
feel, when she says that, that they're
deeply committed.

Speaker 6 (01:07:41):
The last time I spent a decent
amount of time on X, it was after a teenager
did a mass shooting, and I had to talk to
all of their other teen Nazi friends, who are on
X, the everything app. Everything, their communications, that's
where you find them: a whole bunch of teenage Nazis
who had all just, like, talked to each
other there and shared phrenology memes.

Speaker 1 (01:08:02):
Oh yeah, what's that girl's name? Like... there's like some...

Speaker 2 (01:08:10):
Radfem Hitler, the name, on one of
my favorite podcasts.

Speaker 9 (01:08:16):
We're having her on the show very, very soon to
talk about the other Hitler. Yeah, I'm calling it the
Two Hitlers.

Speaker 1 (01:08:21):
On seeing that name, I was like, I can't believe
this exists. Of course this exists. Well, I love talking
about technology. Yeah. So what was
the context for the DOGE thing as well? Was it
just the idea of firing people? No?

Speaker 6 (01:08:36):
It was talking about, like, how important X will
be as Elon focuses on DOGE in the
next year, for a government program, well, you know,
it's a fake government thing, like a consultancy
service that pretends to be a government. It
was basically a quick question about, like, how

(01:08:56):
much will Elon be using X to, you know,
get, like, how much will he be getting
users to give suggestions for things to cut, you know?

Speaker 1 (01:09:06):
Oh god. Open source government regulation? Sure, I mean, open
source works. Yeah, yes, but...

Speaker 9 (01:09:13):
Yes, look think about Linux, right stable operating system run
by a guy who's only committed.

Speaker 1 (01:09:20):
A few sex crimes.

Speaker 9 (01:09:22):
We all love Linux. Why shouldn't the government be that way?
It's already run by guys who commit sex crimes, right?
Match made in heaven. It was much like... I'm not
trying to...

Speaker 1 (01:09:31):
Oh, I am. Much like Linux, it's way more complicated
than they describe it as. Oh yeah, and in
the end, it only works for a few people, and
they can barely explain

Speaker 9 (01:09:43):
why. But their computers are unhackable, because honestly, if
you're hacking shit, nobody, nobody needs that.

Speaker 1 (01:09:50):
That's really good. I'm glad you ended that bit, because
we've now said several things that Linux fans are going
to email me about.

Speaker 9 (01:09:55):
It's the most secure way to do things, the
same way that inventing your own language and losing the
ability to communicate with the rest of the world is
the most efficient way to avoid getting spied on.

Speaker 1 (01:10:06):
Right, kind of like trying to use anything other than
Apple. Linux... yeah, yeah, exactly. I said something about
Linux, like, I wrote this newsletter, and turned it into
a script as well, where I talked about all sorts
of things that are rotten within tech, and I quote
tweeted it, quote posted it on Blue Sky, I should say,
and I said, just to be clear, stop recommending me
Linux. Normal people

(01:10:29):
don't use Linux. And it was, like, fifty percent hardcore
Linux people being like, you're completely right, man, like, this
shit's... and fifty percent being like, I will fucking kill you.

Speaker 4 (01:10:41):
I will.

Speaker 9 (01:10:42):
Actually, Linux guys are exactly the
same as, like, Norse pagans in the modern world, where
half of them are, like, the chillest, violently
anti-racist, will literally fucking die to fight
a Nazi, and half of them are just the absolute
worst.

Speaker 1 (01:11:04):
So we're also calling them back.

Speaker 9 (01:11:06):
Linux people are either people who will, at the drop
of the hat, like, put down everything in order to help
you secure your phone to stop the FBI from getting in,
or people whose primary goal in life is to
reform the age of consent to be lower, just to...

Speaker 1 (01:11:21):
To be clear, the Linux fans... please. That
was, like, two fucking weeks.

Speaker 9 (01:11:27):
I'm sure all the people listening to this are the
first kind of Linux guy.

Speaker 1 (01:11:30):
Yeah, that's what I'm saying, they all are. Mm hmm.
Yeah, great, still going to get one hundred emails. Yeah.

Speaker 4 (01:11:39):
So.

Speaker 1 (01:11:39):
But otherwise, so there was no applause at all during this.

Speaker 6 (01:11:43):
Otherwise, it was just for the mention of DOGE and
then for the rape gangs.

Speaker 1 (01:11:48):
Otherwise you just kind of they.

Speaker 6 (01:11:50):
Were just talking for thirty minutes and like the biggest
point of this discussion was basically laying out how in
tendem X and the everything appen and and the new
Trump administration are going to function out.

Speaker 1 (01:12:03):
It's basically like trying to view.

Speaker 6 (01:12:04):
These things as like symbiotic tools.

Speaker 9 (01:12:07):
Yeah, it's kind of like a Der Stürmer type deal, right?
Like, the nice thing about it being a Der Stürmer
deal is we all know what happened to the
guy who ran Der Stürmer.

Speaker 1 (01:12:19):
I don't.

Speaker 9 (01:12:20):
Oh, he got hanged by a drunk hangman that the
US appointed because we didn't care who did the job,
and made extra sure that it took a long time. Julius
Streicher was kicking up there for a minute. We got
something to look forward to.

Speaker 1 (01:12:33):
That's one of the funniest things. Thank you for
sharing that.

Speaker 4 (01:12:37):
Smile.

Speaker 6 (01:12:41):
Oh good. Did you talk about the Zuckerberg fact checking thing?

Speaker 1 (01:12:44):
We can talk about it again, because, as I was
saying to Ed earlier, it is just Facebook slopping up.
Like, this is the shit they've always done. This is
just them, instead of flushing the toilet, pushing
it into the kitchen. Like I said, everyone's like, oh,
it's so Donald Trump. No, he's just an excuse
now. He knows laws won't, or, like, morals will no
longer be enforced, and Facebook already fucking sucked. I can't

(01:13:07):
wait to see how much worse it gets.

Speaker 9 (01:13:09):
My Facebook feed right now, fully thirty, forty percent
of it is... yeah, like, there's a number
of people that I used to know where the only
way I have of, like, checking in with them is Facebook.
I don't use it regularly, but every time I log in,
a full third of my feed is war reenactors
who are always dressed as members of the
Waffen-SS, and AI-gen pictures of the Waffen-SS. And

(01:13:33):
all of the comments are, those brave boys, those poor
boys, they were just trying to defend, hundreds of them,
hundreds of them, hundreds of them.

Speaker 1 (01:13:40):
It's bleak. I get like pitching and lifting things and
like boxing.

Speaker 6 (01:13:46):
Yeah, maybe this tells us more just about Robert, like
your own interests.

Speaker 9 (01:13:49):
Mine is just like, oh, I only read, like, seven
or eight books about the Imperial German Army last year,
so, like, I'm that into German military history.

Speaker 4 (01:13:59):
Garrison.

Speaker 6 (01:13:59):
This might just be the Facebook tracking coming
to bite you, all of it.

Speaker 9 (01:14:03):
I do have the reading habits. The only other people
with my reading habits are very problematic.

Speaker 1 (01:14:09):
Yes, magician versus slow-mo camera. Ooh, that one sounds...
actually, this one's really good. It's kind of just a
free photo editing group, and, just from December thirty-first,
twenty twenty-four, and like a week later, there's
a very grainy picture of a woman saying, is it
possible to make this fuzzy photo sharp? It is now,
with the power of AI enhance. Yeah.

(01:14:31):
Or, like, the legacy of nerd Hulk Hogan's WWA.

Speaker 9 (01:14:35):
A crude AI cartoon of a soldier with both of
his legs kind of melting off into a puddle,
that says, why don't pictures like this ever trend?

Speaker 1 (01:14:43):
I don't get any of the AI slop. I get
boomer slop, so much boomer slop.

Speaker 2 (01:14:50):
I get the 'only women can understand': not married yet,
and it's a picture of a tomato; then got married,
and it's a picture of a watermelon.

Speaker 6 (01:14:59):
A lot of pictures of, like, kids crying, with
sad captions about how Kamala Harris
is making kids cry, 'like if you agree,' and...

Speaker 9 (01:15:09):
Like like literally every like fifth post is some sort
of picture with why don't photos like this ever trend?

Speaker 1 (01:15:14):
Oh, my other thing is, I've subscribed to a lot of
Joker-related stuff, which makes it kind of unsettling. But yeah,
no, no, no, you are the smiling man. No, no, no,
to explain the Joker meme thing: it's not because I believe
in their views. It's that I had a brief two-year-long
anthropological thing I was working on, where I found the most
insane typos in Joker memes and found this entire part

(01:15:36):
of the global South where there are guys who have clearly
never watched a single thing with the Joker posting the
most insane stuff. And my favorite was a picture of Phoenix,
and it said, would you like to know which wyt
h of them we're liars? And to be clear, that
is a Heath Ledger Joker quote. Oh okay, yeah,

(01:15:56):
And that one was just, like, I really didn't need
much more. I'd found the summit of Olympus at
that point. But nevertheless, free Dan, free Dan, get him
out. The thing that you get now on here is
just, apparently, luxury hotels, and now, like, some Google Maps group.
What is this fucking experience? What is Facebook for anymore?
What is this app? It's, it's all, I mean, the

(01:16:18):
everything app. No, it actually is the everything app. This
is actually... it actually does have everything. Actually, this might
really piss off Elon Musk. No, if Mark
Zuckerberg just says Facebook is the everything app, he
should do that just to see what happens.

Speaker 6 (01:16:32):
If he gets banking working on Facebook before Twitter, that
would be payments.

Speaker 1 (01:16:37):
Yeah, no, I guess.

Speaker 9 (01:16:38):
Yeah, yeah, who would you rather trust with your money
than Facebook?

Speaker 1 (01:16:43):
I mean, I thought we all agreed.

Speaker 6 (01:16:47):
I went to so many panels today, and half of
them sounded like the Stand Up Solutions comedy sketch.

Speaker 1 (01:16:54):
Please tell me some of these wonderful things.

Speaker 9 (01:16:56):
You're gonna have to explain that joke to Garrison.

Speaker 1 (01:16:58):
Oh yeah.

Speaker 6 (01:17:00):
Stand Up Solutions is a comedy stand-up routine
released, I think, earlier this year by a really good
comedian who I cannot... Connor O'Malley.

Speaker 1 (01:17:08):
Okay, you should all watch it, because it's very insane
and it's also exactly the vibe. It's just, half of
it's a PowerPoint presentation of completely made-up,
insane shit.

Speaker 6 (01:17:22):
It's how you can make an AI... it's
how you can buy an AI that does stand-up comedy.

Speaker 1 (01:17:27):
Yes, and this is what the things are like and
this is what everything is.

Speaker 6 (01:17:31):
I mean, the first panel I went to today,
I heard the phrase 'augmentation, not replacement' probably about
ten times. A lot about keeping...

Speaker 1 (01:17:40):
Yeah, that's what the plastic surgeon.

Speaker 6 (01:17:42):
There was a lot of fear in the AI panels
from people scared of how much backlash there is
to AI tools, and trying to find
ways to trick audiences into enjoying or consuming AI slop.
Like, a lot of it was about that.
A phrase one of the panelists used

(01:18:04):
was trying to solve the AI ick problem.

Speaker 1 (01:18:08):
Like, there's an ick around AI. We have this
problem: it fucking sucks, nobody wants to use it. Yeah,
it looks bad.

Speaker 6 (01:18:15):
And they had someone from Meta at this panel,
and they talked about how, a few days ago, Instagram
announced they would be having, like, AI-generated
fake profiles, right, people that don't exist. And there was
so much backlash based on this that they
took it away, but they explained it by saying, actually,
the market just isn't ready yet.

Speaker 1 (01:18:34):
This is Back to the Future, isn't it? It's not
that this is a terrible idea, it's that
the market's not ready yet.

Speaker 9 (01:18:40):
There are certain issues where, like, we need to
specifically hear it from, like, a queer woman of color
who's a mother in New York City. Right, there's a
number of issues, yes, but from an actual one.

Speaker 1 (01:18:49):
But but a.

Speaker 9 (01:18:50):
Person yeah, I don't I don't need an AI that
admits it was coded by some white ladies in the
bay pretending to be this person.

Speaker 1 (01:18:58):
No, it's, then, like, the pigs have not been
suppressed enough. We've not sedated them.

Speaker 6 (01:19:05):
Yeah, get them ready for it. And it really is...
another phrase was just, like, AI
improvements will all be based around consumer acceptance. It's
like, we have all the tech, but we have
to trick people into actually... like, what they're looking for
is consent.

Speaker 9 (01:19:23):
There's a lot of attitude, a lot of a vibe
from them that is very much like a
dude who's trying to pressure someone into sex
and is like, I can massage this into
them being okay enough with it, right?

Speaker 1 (01:19:37):
I can get him to agree to this.

Speaker 9 (01:19:39):
Yeah, it's this very... we're in a very interesting
position with this fight, where it's still recent enough. Everyone
who is trying to push the AI
line remembers NFTs and the metaverse going
down in flames, and how unified and effective the backlash
against them was. Now, neither of those ever had any

(01:20:00):
actual technology behind them, if you

Speaker 1 (01:20:03):
Can call Metaverse any consumer adoption as well.

Speaker 9 (01:20:06):
Yeah, there was no real technology that was in any
way impressive, and there was no real user hunger. There's
user hunger for some of the things AI does, and
there's real technology there, but there's also a backlash that's
very similar to the backlash for those two things. And
they see that, and they're scared of it,

(01:20:27):
because they know it is a threat, because of
how much they have to keep up momentum. If
momentum drops at any point, a lot of these companies...
like, obviously the tech won't go away, but a lot
of these companies will. Yeah, exactly. And so they're
scared of that. And that's a really interesting position
to be in, because it shows, number one, that
the fight is still winnable.

Speaker 1 (01:20:49):
Talking about it, I can be.

Speaker 6 (01:20:52):
Saying, like, the same way that we were able to
beat NFTs and beat the metaverse to some degree...
this is going to be a longer battle
than that, but it is having an impact. You
can even look at, like, Chappell Roan posted something about,
like, AI-generated images, and that received
a lot of backlash, and people and ad
agencies are looking at this, and they're trying to

(01:21:13):
figure out a few ways to kind of navigate it.
A few of the other panels I went to, specifically
about how to market stuff to Gen Z, were
talking about the importance of, like, authentic brands.
Speaker 1 (01:21:22):
Like saying 'on fleek' to Gen Z.

Speaker 6 (01:21:26):
Is so vest is so vested in like authenticity, and
if you're gonna partner with influencers who you is AI,
it has to have that authentic angle.

Speaker 1 (01:21:33):
So what's funny about the influencer thing, that I don't
think companies realize, is they think every influencer is this
pliant brain who will do anything for a vacation, and
there is an alarming amount of them that will. What they
don't realize is that there are many of them that,
once they reach a certain scale, go, wait a fucking minute,
I don't have to go to the LG Pain Maze.
I don't have to ride a horse to the

(01:21:56):
LG compound to see the secret fridge. Yeah, and then
they become angry and they crush them. Steve from Gamers
Nexus is my favorite. He's never done access journalism. Steve,
if you're listening to this, please, come on my fucking show.
I love you.

Speaker 4 (01:22:10):
I don't know who you are.

Speaker 1 (01:22:10):
Steve of Gamers Nexus is probably the single most important
YouTuber... the single most, jesus, the most important YouTuber in
pretty much processors and chips. There are some others as well,
but, like, Steve's done this incredible thing, and it's mostly
just him standing, looking exhausted, over a table, just ripping
these fucking companies to shreds, or going to their offices
and being like, hey, hey, Steve from Gamers Nexus, like, hey, what

(01:22:31):
are you doing? Like, British politician style. I don't think they
realize that there are so many influencers just like that
out there. Coffeezilla, too, mm hm. He's a fucking...

Speaker 9 (01:22:40):
beast, yeah, especially considering that. What I think is so
interesting about him is that he is speaking
to this group of people who are profoundly
anti-media and, like, kind of chuddy.

Speaker 1 (01:22:50):
Yeah, like that's a lot.

Speaker 9 (01:22:51):
But he also does very good journalism, yeah, within
that space, and really has done more than almost
anyone to kill specific kinds of cons.

Speaker 1 (01:23:02):
He did a podcast with me after the Rabbit reporting. No
big deal. But it's funny as well, because they're definitely
thinking, oh yeah, we'll get these stupid fucking Gen Z kids
and these idiot influencers. And Gen Z, as you all know...
yeah, you're Gen Z. Yes, thirty-eight years old, and I
sound it. What's the age for it? I do not know

(01:23:23):
at this point.

Speaker 6 (01:23:24):
It's, like, twenty-seven or twenty-eight, it depends.

Speaker 9 (01:23:29):
Usually born twenty seven are are cusps, they're cuspers.

Speaker 6 (01:23:34):
If you're born between nineteen ninety-seven and around twenty
twelve, that's usually the Gen Z bracket.

Speaker 1 (01:23:41):
Like, you are a young millennial.

Speaker 9 (01:23:43):
Yeah, if your knees make a sound when you
stand up, you're...

Speaker 1 (01:23:47):
Time older I do yeah, yeah, every time I do yoga,
my knee makes a noise like a pinball machine breaking.
That doctor said, it's great, but it's funny because what
they don't realize is as legacy media dies, yeah, you're
gonna have a bunch of insane influences. Those insane influences
will be the ones that actually have a genuine animus
with the LG Fridge. The fridge guy is eventually going

(01:24:09):
to turn on the fridge manufacturers, and the fridge guy
will bring them great pain as he reviews the fridge
with alarming accuracy. And I think that that they don't
realize that that is what will come to replace some
of legacy media. It's going to be posters.

Speaker 9 (01:24:22):
It's going to be people who have no vested interest in
the politeness that they can expect from, like, legacy media.
Right? Yeah, there's no space in, like, YouTube review guys
or TikTok guys for Walt Mossberg. Nobody wants a dude
like that. You want somebody who's snarky and got an
angle and is funny, and it's just easier to rip up

(01:24:46):
a lot of these shitty AI products than
it is to fucking puff them up.

Speaker 1 (01:24:51):
And I will tell you something, running a PR firm:
not a single fucking PR person, for the most part,
can prepare you for that. The only thing that can
prepare you is, well, actually, deposition training. Yeah, you
get a lawyer to train you, that will do you
far more good.

Speaker 2 (01:25:06):
With media training, having, you know, grown up roasting people,
that might help you a little bit.

Speaker 1 (01:25:11):
But you also have to realize that the truth is
generally useful, and if you're trying to hide it, an
influencer is going to get around that.

Speaker 9 (01:25:19):
What I think is so important, to get back on
the topic of, like, how do we win this fight
against AI slopification of everything beautiful in the human spirit:
this is not a political fight. And that's, like, you
see guys like Coffeezilla on it, right? Like, this is...
a lot of people, when they look at this shit
and when they consider, like, the art and the actors

(01:25:41):
and musicians being replaced so that, like, some fucking Bay
Area assholes can hoover up even more money and,
like, destroy the ability of the human mind to
represent itself... that's not a political issue, and there is,
I think, an immense amount of potential still to get

(01:26:02):
a lot of people together and to fucking strangle this
thing in its cradle, because we all have
this kind of visceral feeling that it's... revulsion. Yeah, this
revulsion. That revulsion is what we need to lean into.

Speaker 6 (01:26:17):
The AI ick... like, the AI simply doesn't have
the thing that matters most, which is the
ability to make, like, good, compelling stuff, and this is
something that they actually know.
I'm gonna read something one of the guys
on the panel from Adobe was talking about. He was
talking about how, when you're making AI content specifically

(01:26:39):
targeted towards specific people, like, personalized content is always,
like, the most impactful. And there's three parts that
are needed to create a personalized content pipeline: you need data,
you need, like, a journey to take the
person on, and you need the content itself. And we
need content at scale that is highly personalized. And he

(01:27:00):
said that we're good at the first two parts, the
data and the journeys, and I would argue
about the journeys part, but he said, now we just
have to improve the actual content. It's like, they know,
they know the content's bad. Like, yeah, that's the thing
we have to still actually, like, figure out: it's, like,
how to, like, turn data into, like, good content.

(01:27:22):
And that's still the thing that they're really struggling with.

Speaker 1 (01:27:24):
And it also looks the same as it did, like,
it does look the same as it did a year ago.

Speaker 9 (01:27:29):
There was so much talk... because, like, the first two
AI and Hollywood panels I listened to, there was a
lot of talk about, we just need to be
able to, like, if we can read people's brains while
they're watching stuff, we can see how they're reacting, and
then we can alter, yeah, yeah, the content that
they're taking in. That Meta person was talking about the ending or

(01:27:49):
whatever, to really make it more engaging. And it
was finally somebody in the fourth panel, when
it was actually, like, people who write and stuff on
it, who was like, people don't want that. People don't...
like, we consistently see, and if you look at the
shit that went huge last year, the most profitable movies
last year were two movies that were

(01:28:10):
the result of huge amounts of human effort: the Barbie
movie and the Oppenheimer movie. Say what you will about
both of those. Or... I mean, I'm sure that
one made some money, but...

Speaker 1 (01:28:19):
It's still extremely hard work from 3D artists, however you
feel about those disgusting creatures, the Minions.

Speaker 9 (01:28:25):
And more than anything, the Barbenheimer thing was
certainly the most significant single movie trend last year, and
that was the result of people wanting a shared

Speaker 1 (01:28:35):
Experience and a very specific aesthetic in those cases, not
something that changes every frame.

Speaker 9 (01:28:40):
Not wanting to, like, go, hey, did
you see that movie last night? Well, I saw the
version that was made for me.

Speaker 1 (01:28:46):
Was like people people.

Speaker 6 (01:28:48):
Who actually wants personalized content? I don't want
music generated for me. I want music that I can share,
that's in some kind of cultural conversation with other
people who also enjoy music, who I can talk
about it with and compare it.

Speaker 9 (01:29:02):
Yeah, what do you think of that speech at the
end of The Great Dictator? Well, when I watched The
Great Dictator, Charlie just turned to the screen and said,
everything's good.

Speaker 1 (01:29:08):
Keep doing what you're doing. Right? Nick Fury walked down...
this is literally the future that these
people want.

Speaker 9 (01:29:18):
Travis Kelce, yeah, told me I look like Iron Man.

Speaker 1 (01:29:23):
Oh my god.

Speaker 6 (01:29:27):
Someone from Meta, who works in, like, their hyper reality
division, was talking about how they can start using AI in
the metaverse.

Speaker 1 (01:29:35):
That's the hyper reality.

Speaker 6 (01:29:36):
Let's say that you're watching an immersive live concert, something
that me and all my friends do.

Speaker 10 (01:29:41):
By the way, right away, you want to go to
a real one. Using MR, mixed reality, the
AI can sense your excitement and it can personalize
your experience based on your favorite song or artist.

Speaker 6 (01:29:55):
So an AI Taylor Swift, near the end of the
song, can, like, come down off the stage and, like,
dance with you. That would freak me the fuck out.
Or, like, that would... no. They can change the song
based on what your favorite songs are.
So it's using AI in the metaverse to create
ultra-personalized experiences.

Speaker 1 (01:30:13):
I have never done anything worse than weed. But can you
even imagine this while stoned? That would be horrifying.
It does not sound like a good time. The idea of,
like, stoned brain being like, well, this is what
me and Atticus Ross are gonna get down to in

Speaker 2 (01:30:28):
The metaverse. Yeah, we're gonna candy flip, man, in
the metaverse.

Speaker 1 (01:30:32):
You get it. I don't know what this is. I
don't listen to.

Speaker 9 (01:30:35):
I would rather... we're gonna show you, at the
Taylor Swift concert, the necessity of war.

Speaker 1 (01:30:43):
You're Taylor Swift the whole time. Shut up, Taylor Swift.
I want to watch a... you made me...

Speaker 9 (01:30:52):
Willie Pete'd a school in northern... can't save you.

Speaker 1 (01:30:56):
You gotta shoot those kids. You gotta shoot those kids.
They're gonna turn. Just like that, Taylor.

Speaker 9 (01:31:01):
No one else, nobody else was in that fucking bunker
with us. No one else can understand what we had
to do. And then you light your AI cigarette, your
little hit of AI nicotine.

Speaker 1 (01:31:10):
Thank fuck. There's now a more crazed fan base than Linux.
I'm gonna get killed. Doesn't everybody want to go to
the metaverse?

Speaker 2 (01:31:18):
Fallujah. Meta Fallujah. That's the Meta Fallujah-verse.

Speaker 1 (01:31:27):
Cinematicise.

Speaker 6 (01:31:29):
As I was sitting through all these AI panels, I
got a text from both Ed and Robert about some
breaking news regarding the Tesla truck bombing.

Speaker 1 (01:31:39):
Yeah, yeah, that he used ChatGPT to plan it. Sick.

Speaker 9 (01:31:43):
First off, Green Beret training is not what it used
to be.

Speaker 2 (01:31:47):
Yeah, they're taking us for a ride with whatever we're
paying them, if they have to use this.

Speaker 1 (01:31:53):
He had to use ChatGPT to make a bomb.

Speaker 9 (01:31:56):
My uncle was a Green Beret, and I'll tell you
one thing: he did not need ChatGPT to make
an explosive device.

Speaker 1 (01:32:05):
Every time I have you on, I'm just, like, texting
my lawyer, just... I'm sorry.

Speaker 9 (01:32:13):
Well, what's interesting is he was specifically looking up
detonators. He was trying to use, at some point, Tannerite,
and he either got a detonator to detonate
the Tannerite, but he was looking initially at
trying to shoot the Tannerite to detonate it, which would

Speaker 1 (01:32:28):
not have allowed him to shoot himself. Well, that's
the problem. That's why he went with something else.

Speaker 6 (01:32:33):
ChatGPT couldn't figure that one out.

Speaker 1 (01:32:39):
I love doing a show, a podcast about
the future, and every fucking conversation I've had, without fail,
has been, like, everything I've seen is either fake or bad.
And I don't even mean to be this way. We get the
occasional 'you're a pessimist, motherfucker.' Bring me some solace,
Robert. There should

(01:32:59):
be one good thing. Please, please, something. Oh shit,
what was it? That's so good. Oh.

Speaker 9 (01:33:04):
The suicide helicopter. There's a helicopter that'll kill everybody. Imagine
if you had a helicopter, but it's two-thirds the
size of a Smart car, and it can drive on
city streets, but it has no side or rear view mirrors.
All of that's a lot of cameras and mirrors
in this cramped little thing. And then, when you're ready
to fly, it's a quadcopter, and you can fly

(01:33:26):
up to two hundred meters in the air, because if
you're under that, you don't need a pilot's license. And
they emphasized to me: under ten grand. Wow.
So, little portable helicopters that you can fly for up
to twenty minutes in the air.

Speaker 1 (01:33:42):
That's what you get.

Speaker 9 (01:33:44):
Yeah, what happens if you run out of battery? Again,
let's say you, a former Green Beret, decide to
rent the suicide helicopter. Then you just fill that motherfucker
up with fuel. All you need is ten minutes, as
long as you, like, tow it up in your truck
outside of whatever building you want.

Speaker 1 (01:34:02):
You can see that this is never gonna be
a real problem. I do love that
they made, like, a kamikaze simulator, like, that is
what it is, right? Yeah, I swear. And to be clear,
I do genuinely ask the question of 'did you see
something you liked?' in good faith every time,

(01:34:23):
and every time it's just someone being like, no.
I mean, you don't want to say it. I don't. I love
committing no crimes.

Speaker 2 (01:34:30):
We're calling it the Tora! Tora! Tora!

Speaker 1 (01:34:38):
Oh god, I've got to wrap this up before I
get in any more legal trouble.

Speaker 6 (01:34:42):
Okay, even the solar-powered Peltors are... no, but I
don't think they're the military, like, gunshot-safe
ones.

Speaker 1 (01:34:50):
No, No, they are. I saw them. What is that?
What are you talking about?

Speaker 9 (01:34:53):
The Peltors are high-grade, uh, like, adaptive headphones, and,
like, the ones I saw had, like, a mic output too.

Speaker 6 (01:35:02):
They didn't have a mic, and I think they didn't
have, like, the indent to wear under a helmet.
I think these were, like, workplace Peltors, because Peltor
makes a lot of different models of noise canceling headphones,
and only certain ones are used for,
like, military or law enforcement or, like, gunshot suppressing.

Speaker 9 (01:35:20):
But it looked to me like they were actually the
Peltors that you get where they're not attached
to a band, so that you can attach them directly
to a helmet, and they had instead attached them to
a solar band. Maybe I need to go check
on it

Speaker 1 (01:35:34):
tomorrow. To me, but those were... and I talked to

Speaker 9 (01:35:38):
Them specifically about using it for hunting.

Speaker 1 (01:35:40):
Okay, okay, okay. So we found one thing.

Speaker 9 (01:35:42):
It was pretty cool, like, headphones that can suppress
loud noises but allow you to continue to converse while
shooting, that charge themselves so you never need to worry
about battery. Photovoltaic band.

Speaker 1 (01:35:55):
It also can be charged.

Speaker 9 (01:35:56):
By lights on the inside.

Speaker 1 (01:35:57):
That's very cool. Yeah, yeah, that's it. We've found
the one product, everyone. We can wrap the show up then.
Okay, Robert, where can people find you?

Speaker 9 (01:36:04):
People can find me at IwriteOK on Blue
Sky and X, the everything app. Uh, and then we
have a podcast called It Could Happen Here, where we'll be
talking more about CES and where the technology industry
is going: a

Speaker 1 (01:36:18):
Very good place. Yeah, where can people find you?

Speaker 6 (01:36:20):
Also on It Could Happen Here with Robert Evans, and
on Twitter, sorry, X, and Blue Sky at
Hungry Bowtie.

Speaker 1 (01:36:31):
And Ed, where can we find you? Big Black Jacobin
on Twitter and on Blue Sky, and then This Machine
Kills and techbubble dot substack dot com. And, uh,
you can't find me at the gym, but you can find
me at the bank. It's Ed Zitron, you find me,
and that's everything. Uh, yeah. I wanted to
get you on the show floor, Ed. I can't wait,

(01:36:52):
but I really could. It's so crazy. I don't want
to go. I have not been downstairs. Yeah, you do
want to. Okay, everyone, thank you for listening to
the wonderful Better Offline. Now you'll hear a really, like,
jaunty theme, which Mattosowski, our producer, did, and a bunch

(01:37:14):
of stuff that you continue to fucking complain about me
not updating, which only makes me want to leave it.

Speaker 9 (01:37:18):
You should use AI-generated music, like in that great, uh,
that great heist video last night. Personalized.

Speaker 1 (01:37:25):
Every listener gets a different outro... gets a different outro song.

Speaker 9 (01:37:28):
They're all Billy Joel songs.

Speaker 1 (01:37:29):
Can you imagine? That's the only intro. All right, we're
going out now, but just to be clear, like, can
you imagine the shit that the people in the forum,
on the reddit, I'm one hundred years old, on the
message boards, would say about me? Anyway,
thanks for listening, everyone. Thank you for listening to Better Offline.

(01:37:57):
The editor and composer of the Better Offline theme song
is Mattosowski. You can check out more of his music
and audio projects at mattosowski dot com, M A T
T O S O W S K I dot com. You
can email me at ez at betteroffline dot com
or visit betteroffline dot com to find more podcast
links and, of course, my newsletter. I also really recommend

(01:38:18):
you go to chat dot wheresyoured dot at to
visit the Discord, and go to r slash BetterOffline
to check out our Reddit. Thank you so much for listening.
Better Offline is a production of Cool Zone Media.

Speaker 8 (01:38:30):
For more from Cool Zone Media, visit our website
coolzonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.