Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Also media, Hello and welcome to Better Offline live from
beautiful Las Vegas, Nevada. I am your host and most
punished man ever to walk this earth, Edward Zitron. I
(00:25):
live here, I live other places too, and I am
joined by an incredible crew. This is going to be
a week long extravaganza, roughly twelve and a half hours
of What I want to say is radio, but I
just mean the detritus of society and great reporters coming
here and reporting live ish from the largest consumer electronics
show I think in the world. And I just have
(00:47):
an incredible crew tonight. I have Robert Evans, of course,
the famous podcaster.
Speaker 2 (00:51):
Representing the detritus section of the team.
Speaker 1 (00:55):
Yes, yes, and Gare Davis of course.
Speaker 3 (00:57):
Representing the i uh the lower southeast of the United States.
Speaker 1 (01:04):
I am, of course, representing the perverts. And Ed Ongweso Jr., one of my two wonderful reporters who are joining me this entire week. Ed, thank you so much for coming here. Thanks for having me. Okay, so why are
we doing this and what are we doing? So? I
want to walk you through what the experience is like.
We are sat in a recess level of the Venetian Hotel.
In the other room is a full bar that we
(01:25):
set up just for people to come in. Our guests: Robert, Gare, Ed, and Mattasowski, our wonderful producer. And we're going to be doing this live to tape all week. This
is going to be basically a tech radio show about
a show that has no heart and no soul, a
show that, as Gare correctly said earlier, is basically the
same show as a year ago. Except a year ago,
I was a terrified baby. I had just got handed
(01:48):
the sword of podcasting. I was afraid of the microphone.
Now the microphone is afraid of me. And the important
thing is is that I brought everyone together not just
to talk to you about this show, but I want
to give everyone a temperature check of this fucking hellhole.
Because what happens is reporters come here every year and
they kind of walk in like they have to visit
their racist uncle, except their racist uncle doesn't say racism.
(02:11):
He says artificial intelligence every fucking year, and generally everyone
here feels kind of miserable. I refuse to have that
for my people. I refuse to have that for anyone
in my orbit. So I set up a week long
piss up where we talk about technology with people who
actually know what the hell they're talking about and care
about it. So, yeah, welcome to the Consumer Electronics Show.
Gare Robert. You have actually seen things today though we
(02:33):
ed and I only just got in today. What have
you seen? What horrors?
Speaker 2 (02:36):
Today I did five hours of back-to-back panels on artificial intelligence at the Aria, and it included
a number of great moments, including an entire room full
of people, led by the folks on the panel, laughing
about people losing their jobs due to artificial intelligence?
Speaker 1 (02:53):
So how did that happen? Like, what were they laughing at?
Speaker 2 (02:55):
The MC came up and she was like, Okay, I'm
gonna do some yes or no questions, and one of
them was there's going to be a lot of job
losses due to AI. And, like, I can actually pull up who they were, but there was an initial attempt to be like, no, well, it'll just be, like, changes in the way jobs are done, and then someone was like, no, there's gonna be a lot of job losses, and everybody started
(03:17):
cackling in the entire room. It was one of the most ghoulish experiences I've had here. I've
got the audio cued up, so we'll listen to it.
Speaker 1 (03:26):
We can loop that in. That's fucking disgusting.
Speaker 2 (03:29):
Yeah, it was pretty vile. And then there was a great moment where, like, the first panel I did today was about using AI in Hollywood. And so the founder of that company Secret Level, that does, like, video-game-themed videos on Netflix and stuff. Amazon Prime Video. Amazon Prime, sorry, sorry, Jesus, Rob. But
(03:51):
he was also one of the team members behind that
all AI Coca Cola Christmas ad that everybody fucking hated.
Speaker 3 (03:57):
Which I did not realize until you made that connection for me, that it's those guys.
Speaker 4 (04:01):
So if you love those visuals.
Speaker 1 (04:03):
Yeah, wait, wait. So it was the actual company that did the Coke ad?
Speaker 2 (04:05):
I mean, there were three companies, but his company was one.
Speaker 1 (04:07):
It took three companies to make the greatest piece of art ever made?
Speaker 2 (04:14):
And I don't know what they were doing because he
came up to show videos that he had just made
by like inputting text into an automatic image generator, like
little movies that he'd made, which also looked like shit,
but looked exactly as good as the fucking Christmas ads.
I'm like, what were all those companies doing on a fucking Christmas ad?
Speaker 4 (04:33):
Laundering money?
Speaker 1 (04:34):
Yeah, yeah, yeah, yes, yeah. This sounds like, yeah, this cost fifty million dollars. Yeah, no, no, no, we can't do accounting due to stuff. I wonder how much fraud like that is in AI at this point. Look how many guys just exist for, like, "I'll do your AI integration," and they just dick around for three months and then they connect ChatGPT.
Speaker 2 (04:54):
I'm sure it was still cheaper than like the traditional
coke things, but I'm sure there's a lot.
Speaker 3 (05:00):
I'm not sure that's even true anymore, Like they don't
get a.
Speaker 1 (05:03):
Real polar bear. Like, no, they certainly don't. Like, if they did that, maybe even then I don't think it would be as expensive. And it definitely wasn't, it was definitely not cheaper on the actual AI side, because news came out a day or two ago that apparently the two-hundred-dollar-a-month ChatGPT subscription loses money. And this is the thing. This is
(05:23):
where we are in the tech industry at this point. We're in this world where we're meant to get extremely erect over the idea of a Coca-Cola ad that looks like it was made by the same people as the "Black Hole Sun" video. And then we're meant to also look at, like, the apps we use, and they fucking suck. And then we come to CES and they're like, you know what's going to happen next year? The exact same thing that we have now.
Speaker 5 (05:44):
Nothing.
Speaker 1 (05:44):
It's there's AI in an oven.
Speaker 2 (05:46):
I think... oh, I'm very excited. And a grill. There's a grill that has an LLM integrated into it.
Speaker 3 (05:52):
Okay, the most exciting thing is that it it only
gets smarter, It's only going to get better.
Speaker 2 (05:57):
Oh, that's what Jason from Secret Level said. He played
a video of like a bank heist chase where the
main character, every time it cut back to him, he
was dressed differently and his face. At one point, he's
so sick. He's just driving down a normal city street,
but there's randomly fires every thirty feet in the road
because the AI just hallucinated them. Yes, at a certain
(06:18):
point he crashes and he's on the run from the cops,
and every time it cuts back to the cops chasing him,
there's more or less cops from scene to scene.
Speaker 1 (06:25):
This is that's the most realistic.
Speaker 2 (06:27):
But union rules they got a fifteen minute break.
Speaker 5 (06:33):
Listen, it took us fifteen million dollars when we did a six-star chase in Grand Theft Auto. No, someone said it wasn't overtime, and they were like, oh, I've got the paperwork.
Speaker 1 (06:44):
No.
Speaker 3 (06:44):
When we were in Chicago running away from police, if
you turned the wrong corner, there would be more or
less cops and what you thought, So, hey, you know
you gotta give him maybe maybe maybe the benefit of
the death.
Speaker 2 (06:56):
Would I want to see.
Speaker 1 (06:57):
You have qualified immunity for that?
Speaker 2 (06:59):
Because he was emphatic that, like, all I did was read a script into it. He was like, this is just a prompt, this is unedited, you know, from that. And so what I really want to do is put the script to Heat into one of these things. Just to see what the big gunfight from Heat looks like in this. Does it have that?
Speaker 1 (07:19):
Michael, I'm trying to think of what I'd put into it. Like the Boris Balkan scene from the end of The Ninth Gate where he's setting himself on fire. I would love to see what they're gonna do. But they also, like...
Speaker 2 (07:32):
About how about antichrist? I've never seen they do that.
Speaker 1 (07:36):
The end of kids moving.
Speaker 2 (07:42):
Let's just run that through. See what ChatGPT makes of it.
Speaker 1 (07:46):
This is their favorite lie, though. They're like, oh yeah, it's just as simple as a prompt. Even the Coke ad, like, we have an editor, we have a separate colorist. Because it looks like shit and also probably looks completely different. This is the slop they want to give us.
Speaker 3 (08:00):
And they're absolutely, like, hiring VFX cleanup artists to touch up all of those visuals to make them in any way publishable.
Speaker 2 (08:09):
The colors were completely fucked on this, people complained, because of all of the comments on this video. Which, by the way, for this company, the big thing was, everyone in the room clapped when they played these things. I looked on YouTube. The best of them had like fifty thousand views, and some were just like five or six thousand. And the color grade: everyone in the comments is like, the colors on this look like shit.
(08:30):
His clothing changes. And Jason, like, referenced that and was like, yeah, I know people had that complaint, but come on, guys, this is the worst it's ever gonna look. That line is everywhere here.
Speaker 5 (08:40):
I keep seeing it. "This is the dumbest ChatGPT is ever..."
Speaker 4 (08:43):
"...gonna be." Take a shot.
Speaker 1 (08:44):
Yeah, I love that. That's also their defense. Yeah, they're like, hey, it.
Speaker 4 (08:50):
Sucks, yeah all us, it's all uphill from here.
Speaker 2 (08:54):
Like we've all lived through the last eight years. If
there's one thing everyone knows, it's that things only get.
Speaker 5 (09:02):
Well.
Speaker 1 (09:02):
My shadow theory, because I got OpenAI's o1 model to produce typos. I hope... this is not really something I've proven in any way, and indeed the academics I've asked about it have not returned my emails, due to woke. But I saw typos, and I'm hoping that model collapse is happening. I hope they've fed the synthetic data into it, destroying these models, because
(09:24):
that would actually be the funniest and dumbest way that ChatGPT just starts, like, going on the fritz. Because that's what model collapse is. Just for listeners who don't know, you can go back to March, you should have been listening to all the episodes, but okay, not your fault.
Model collapse is when you feed a model, instead of real training data from real human beings, synthetic stuff created by another model. The problem is that human beings are magnificent creatures, very strange.
(09:46):
Our fuck-ups are actually quite hard to recreate. Those fuck-ups are important for informing things like how words and logic are created. So if you have a synthetic version, there is something that no one really understands that fucks the whole thing up. Anyway, they are currently feeding synthetic data into all of these models, so what I'm hoping is it fucks them up big time, because that would be the funniest thing, because there's no fixing it.
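(A rough way to see the intuition being described here: the toy loop below is purely illustrative, it has nothing to do with how any real model is actually trained, and every name and number in it is an assumption made up for the example. Each generation "trains" only on samples of the previous generation's output, so the rare human oddities in the data are the first thing to disappear, and they never come back.)

```python
# Toy sketch of the model-collapse intuition: each generation's "training set"
# is drawn only from the previous generation's outputs, so diversity can only
# shrink. Purely illustrative; all names and numbers are invented.
import numpy as np

rng = np.random.default_rng(42)

n = 10_000
dataset = np.arange(n)  # 10,000 distinct "human quirks" to start with

for generation in range(1, 11):
    # "Train on synthetic data": the next dataset is just resamples of the last one.
    dataset = rng.choice(dataset, size=n, replace=True)
    distinct = len(np.unique(dataset))
    print(f"gen {generation:2d}: {distinct:5d} distinct quirks survive "
          f"({distinct / n:.0%} of the original)")
# Diversity only ever shrinks: roughly 63% survives the first generation,
# and it keeps dropping every round -- which is the "there's no fixing it" part.
```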
Speaker 2 (10:07):
There was a comment like that, and I think it was, uh,
let's see which one of these. Uh, I just want
to make sure I'm getting it right. AI Cinematic, Spatial and XR, which was... XR is extended reality.
(10:27):
And there was a comment from I think it was
Rebecca Barkin, but I may be wrong about that. It
was it was one of the people on the panel.
It might have been Katie Henson, but like something along the lines of, hey, we're still going
to need human beings to be making stuff because like
otherwise there won't be anything to feed into the models
like they were. I forget exactly who it was. Like
(10:48):
that was like something I like underlined in my notes
of the panel is like, okay, so you you know
what you're doing.
Speaker 3 (10:56):
It's also cool that, like, the point of humans is to make things to feed into a model. That's all that matters: as long as we can feed the model, that's fine. It's the only value we have.
Speaker 5 (11:08):
Here for the market, and you're here for the model. Yeah,
it was the most important thing.
Speaker 1 (11:12):
And what's important to know as well is, anyone saying that is not thinking too hard about it, because they need more training data than exists. Yes, like five to fifteen times what they have. We're in the stupidest fucking era ever. There's a pattern where people are like, huh, wouldn't it be cool if a bunch of people lost their jobs? They should turn into that scene from Kingsman.
Speaker 2 (11:32):
Yeah, like that's exactly what I thought.
Speaker 1 (11:36):
Oh, I can change. And it's like, this is one of the reasons that I really wanted to pull this together as well, because I don't think the way CES has been covered traditionally... and actually Gare and Robert really opened my eyes to this last year, because I've been avoiding panels for the twelve-odd years I've been at CES.
Speaker 2 (11:54):
It's easily the best way to see what kind of brainworms are going around, because people up there just talk, and no one expects to be challenged, or for there to be anyone in the room who's not completely drinking the Kool-Aid.
Speaker 1 (12:07):
But what's insane is even on the floor. I hadn't looked at it properly. I'd always been like, I'm just here because I have clients here. So to explain the history of this: I've got the bar, I've got the suite. Previously I used to do this with my dear friend Kevin, who works at EZPR, won't mention the agency again. We would have this suite just to kind of network with people, clients, journalists, and we stopped doing it after
(12:28):
COVID, just because it was a lot of fucking money for just a piss-up, which is always fun. However, the way I used to look at the show was, I almost took it in good faith. I was like, these people wouldn't spend one hundred thousand dollars on a booth where they're lying. Well, wouldn't you know? And since I started doing Better Offline,
and I was truthfully a bit scared being here with you
(12:50):
too last year. I was a bit freaked out. Now I'm not, I'm very excited. But also, I was just flat out wrong about Big Tech. I thought... I really at that point was just like, look, Facebook's evil, Meta is evil, but you know, they have something, they get something. Google, same deal. They'd never do something really fucking stupid again and again. And they would, and everyone here would. And there are things that are at CES
(13:12):
every year that never get made. Please, if anyone sees the laundry-folding robot... that's what I mean.
Speaker 2 (13:19):
If you see them, Yes, vaporware stories, they.
Speaker 1 (13:22):
Are like they have been here. I have been coming
to CES since twenty eleven, and those motherfuckers have
been promising to fold my laundry ever since.
Speaker 2 (13:32):
Yeah, twenty ten was my first year, nice, and I think what keyed me into what was really going on here was that the best product in show, the one that won the award, was the Motorola Droid, which was not a functional product and did not exist on the show floor. All they had was just a plastic
(13:52):
model that wasn't working. It was basically just like a digital photo frame that was running, like, still screenshots of how the phone would look, because Android was still new at this point. And it won Best in Show, and there were functional phones at the show that year. Yeah,
and so I remember saying to one of the editors at TechCrunch, like, alright,
(14:14):
you know, I'm new to this like tech reporting thing,
but this seems like, do you not consider
this a real problem? This seems like a like a
serious issue, Like this isn't really journalism if like, yeah,
this is what's winning here because they didn't have a phone.
Speaker 1 (14:27):
If they do, what's wrong with you? But it's true, though. There is a general sense of, like, no, why would you be mad at this? It doesn't exist.
Speaker 5 (14:35):
It was the future.
Speaker 2 (14:36):
Google has at their booth.
Speaker 1 (14:37):
They brought us the future. How dare you?
Speaker 2 (14:40):
Yeah?
Speaker 1 (14:40):
And what's crazy is, that's twenty ten. Today's tech media is only slightly upgraded, in that some of them don't do that, and actually a lot of them are here. A lot of the reporters here generally just feel tired when I talk to them, and when you ask them what their favorite thing about the show is, they're like, oh, the LG, I saw this big monitor, big television.
Speaker 2 (15:00):
I'm excited to see another.
Speaker 4 (15:01):
I'm excited to see what new big curved TV LG.
Speaker 5 (15:05):
And we're gonna build the biggest. It's gonna be the most beautiful, okay, with the most beautiful OLEDs, I'm going.
Speaker 1 (15:14):
But that's not what it's gonna be. But to be clear, I love the big televisions. I love seeing big stupid shit, and I wish there was more of it.
Speaker 2 (15:23):
At least the big televisions are television.
Speaker 1 (15:25):
And exactly, like, you can't really... I guess you can lie about, like, the specs and the nits, and, well, that's...
Speaker 2 (15:30):
Fun, son of a bitch Folds. I watched it.
Speaker 1 (15:33):
I love all that dumb shit.
Speaker 2 (15:35):
Why TV?
Speaker 4 (15:36):
We have gotten so good at TV, we really have.
Speaker 2 (15:40):
It's the only promise technology and it's it's.
Speaker 5 (15:44):
Gonna watch you. It's gonna give you targeted ads, it's gonna give you a chatbot, it's gonna be beautiful.
Speaker 2 (15:48):
My mate just bought a replacement for our TV, like
an eighty five, And that's exciting because TV, like it's
one of those like what the fuck, how does this
the only thing that that came through? Yeah, just the.
Speaker 4 (16:00):
Second Yeah it was not good. Great, it was good.
Speaker 1 (16:05):
Was it like a C one?
Speaker 4 (16:07):
No, it was.
Speaker 1 (16:08):
It was like a dinosaur picture.
Speaker 4 (16:11):
It was very it was it was very large.
Speaker 2 (16:13):
And then I plugged it into my laptop then to
play my little game.
Speaker 3 (16:18):
It was very large. The frame rate never quite worked properly. All of the blacks did not show correctly. Everything was really crunched. Very bad, very bad dynamic range, and it just slowly got worse over the years. It's something I'm still kind of confused by. Like, what...
Speaker 4 (16:39):
What happened to River's TV? A little bit more?
Speaker 3 (16:42):
A little bit, I swear, because every time, every time
I try to watch a movie, I have never.
Speaker 1 (16:48):
Heard of a television starting to decay?
Speaker 3 (16:51):
No, this is actually becoming more of a thing. Although TVs have been getting better, the new TVs are breaking faster, interestingly. And this is something that I've noticed mostly due to, like, my friends' consumer choices. Like, I have a TV that my friend gave to me maybe like ten years ago, and it's like
(17:12):
a 1080p LG. It works great, movies look good. It's not 4K, but it's fine. And he's gone through like three TVs since then that have all broken, right? They've all had better specs, they've all looked really good, and then every once in a while something just goes oddly wrong. And this has just become like a new pattern with, uh, with Panasonics, with LGs, with Sony TVs,
(17:38):
and although we are getting quite good at like making TVs,
I feel like their lifespan is kind of shrinking. And
maybe it's something that I want to like talk to
people at CES about.
Speaker 1 (17:49):
Honestly, it sounds like consumer electronics in general. I mean,
my fucking iPhone's brand new, and even then it's fucking
up already. Yeah, this is ahead of schedule. Shit, we usually save this bad boy for April, that's usually when the obsolescence comes in. But now, my new favorite thing that my fifteen-hundred-dollar fucking cell phone does is, when I try and add a song to a playlist,
(18:10):
I know it's an edge case. No one does that,
but I did that and it just crashes Apple Music.
Speaker 2 (18:16):
Now I can only add it reasonable burden.
Speaker 1 (18:19):
Yeah, I know.
Speaker 2 (18:19):
Why would it? It's only got eight cores.
Speaker 1 (18:23):
They only have a specialized piece of silicon in it
to make it work.
Speaker 2 (18:27):
What, sixteen gigs of RAM? You can't handle that on sixteen gigs of RAM?
Speaker 1 (18:30):
No, I need my fucking MacBook Pro to do that, for my eight Chrome tabs. Okay, so, format change: Edward Ongweso Jr., please introduce yourself. Tell everyone about
your work.
Speaker 5 (18:40):
Okay, I am a tech writer. I write about tech, I write about finance, I write about labor, Silicon Valley generally, and anything that its, you know, tentacles touch.
Speaker 1 (18:55):
Where'd you write?
Speaker 5 (18:57):
I started at Vice. Freelance now: magazines, news outlets. It's anywhere that will let me rant about Silicon Valley, basically.
Speaker 1 (19:06):
So I brought Ed here because he wrote many great things. But one of my favorite things is his Kara Swisher piece. It's a beautiful piece of writing, the Burn Book review, where you just really put the boots to her. But I think CES is probably the most Swisher-coded event for a thing she's never been to, and I think it's jarring for the same reason, in that everyone's
(19:26):
kind of just accepting a reality that isn't true but pretending to care. And it's frustrating. And I'm glad you're here to see this. This is your first CES, right? Yeah, my first time. I'm very excited. Linda Yaccarino, you're going to listen to.
Speaker 2 (19:44):
She's doing a s feeture.
Speaker 3 (19:45):
She has the X Corporation keynote tomorrow.
Speaker 2 (19:48):
I eat.
Speaker 3 (19:50):
Oh, I don't worry, Robert, I I already have it
in your schedule.
Speaker 1 (19:54):
I'm so glad you make his schedules.
Speaker 2 (19:56):
The thing is, I don't bring guns to these things, because I would just... myself. Like, I'd just take care of that problem in my hotel room.
Speaker 1 (20:03):
Well, you don't look for half an hour going like
X is the place where it happens and that's the tea. Well,
I can't wait for you to hear Yakarino talk because
I'm she's a real yak Aino. Yeah. Sorry, but also
I'm looking forward to it because well, you're African American. Yeah,
(20:26):
and I truly do not know how they're going to
treat you on the floor. Yeah, I don't know how
they're going to act.
Speaker 5 (20:32):
Yeah, I mean, do we think the school measures are
going to be out in force? For the ex offender.
Speaker 1 (20:36):
There'll be a guy, like, really excitedly calling you over. He's like, no, we got AI to do this now, and it uses my Ray-Bans.
Speaker 5 (20:43):
Oh, it's gonna it's gonna be interesting. I'm very I'm
also gonna be curious what the hell they're even talking about.
I'm trying to go in with as little info as
possible so I can take it in objectively without bias.
Speaker 2 (20:54):
It's like watching Queer. You really just show up, take it all in, you know.
Speaker 5 (20:58):
I'm here for the journey.
Speaker 1 (20:59):
Yeah, yeah, And I think that's important as well, because
that's exactly what I am here for. I want this
to be a journey. And there are multiple reporters joining us.
We have Jesse Farrar from Your Kickstarter Sucks, of course, and David Roth, of course, my second elf. That's what I was calling them in my head, the reporters I hired, that's what I'm calling them now, the Elves. David Roth of Defector, a sports journalist, will be joining
(21:20):
us tomorrow and for the rest of the week other
than for our spa day on Saturday, which is very unfair,
very nasty. And right now, our bartender Phil Braun. Now
you're speaking to the microphone and say hello Phil.
Speaker 5 (21:32):
Hello.
Speaker 1 (21:33):
We will have a full rundown of Phil at some
point later in the week. But Phil has been joining us as the bartender here for years, and he just handed me some sotol. Sotol? Am I saying that right?
Speaker 5 (21:46):
That's right, it's good.
Speaker 1 (21:48):
I nailed that. It's kind of... but that's the thing. The whole point of this is journalists can come here to this oasis. This is the oasis in the middle of the bullshit. Because at CES, as I was telling Ed earlier, there is nowhere to sit down.
Speaker 4 (22:02):
No, they always want you to keep moving.
Speaker 2 (22:04):
They want you to keep on, keep on traveling through.
It's like a casino.
Speaker 1 (22:09):
No, you can find plenty of places to sit down in casinos. This horrible place is an insult to our beautiful slot machines and our beautiful tables that we have. And they'll get you a seat at the table, don't worry. Please, come lose your money. And Phil right now is gesturing, because Phil is our wonderful bartender who is bringing us drinks as the show goes. Probably not on the first of the day's episodes, definitely on the
Speaker 2 (22:31):
second. A Paloma? Okay, can I get something with tequila?
Speaker 1 (22:37):
Wow? I don't give that went flawlessly? Yeah, that was
that was an example of the bar working, but actually
that is actually the bar working.
Speaker 3 (22:45):
Well.
Speaker 1 (22:46):
No. So the point of this week, though, is we
want to be the oasis within a tech industry gone
wrong and a conference that's gone wrong. No one likes
coming here to cover this. No one is excited, everyone's depressed.
We will not be. We will be probably not, in
my case, sloppy, just in case the lawyers ask. But
everyone else can be. Well, the reporters we bring here.
We're gonna have two episodes each day, ninety episodes. Jesus
(23:09):
Christ's already doing great, and it's gonna be a lot.
But I want to bring you diverse voices. I want
to bring you the people that have been here forever
and the people who are just being exposed to this.
The reason we have Ed and David is we really don't want this to be a fucking tech... no offense to tech media. We don't want this to be a tech-reporter thing. We actually want this to be a
(23:29):
temperature check of this industry.
Speaker 2 (23:42):
You know, there's a lot to, like, roll my eyes at and be frustrated by, but I always really like CES. And when Gare first started coming to this, the way I explained your job at CES as a good journalist is: you are here to be a terrorist. Find the richest asshole you can find and ruin their day.
Speaker 1 (24:02):
That is a joke.
Speaker 2 (24:03):
No, not in a legal sense, but in an emotional sense. Terrorize them, harass them over the lies that they're telling.
Speaker 3 (24:13):
Well, yeah. And, like, the hive mind, groupthink mentality here is crazy, because you will walk into a panel and you will start hearing people say, like, insane shit, and you look around to be like, ha ha, that's crazy, right? And everyone's just nodding along enthusiastically, and it slightly makes you question your own sanity.
Speaker 4 (24:35):
Never once there was a complete certainty.
Speaker 1 (24:39):
Someone I follow posted on Twitter the other day, they're like, would you trust a clone of yourself? And everyone was like, I wouldn't. I'd be like, absolutely, we would fucking cook. Anyway, complete certainty. But it's so weird though, you're right, and people walk around like, damn, really? So the AI for my pants, how's that going to work?
Speaker 3 (24:58):
It's crazy, because everyone is so bought into it. If you make any objection during, like, any kind of Q&A at a panel, people are so shocked, so surprised. It's like you're a heretic. It's insane. It's like you don't believe in the project of humanity.
Speaker 1 (25:19):
To these people, it's like you don't believe in
Santa Claus.
Speaker 2 (25:21):
At, like, my sixth or seventh one of these, when rugged, super-rugged speakers were a massive product, you could find like half a dozen booths selling them. I would just go there and I would ask, can I test them out? Can I, like, drop them and stuff? And then I would pick up the heaviest object, because they usually had, like, rocks or something to pose them on, and I would just break stuff on the show floor. And it worked every time. They would always break,
(25:45):
oh, a bunch of stuff. And I learned that from... we went to see a demo where they were showing off, like, an indestructible smartphone, and they were doing it by... they pulled out all the journalists, I think we were in front of Caesars Palace, and they were driving a limousine over the phone. And then they would do it again, and the next group of journalists would come up, and they would let a journalist set it down. And I don't know who this was.
(26:06):
I wish I'd gotten his name, because he taught me one of the best lessons I ever learned. They were laying the phones flat, and he just wedged it under the wheel on its side, and they drove over it and it just snapped.
Speaker 4 (26:16):
Yeah, that's that's.
Speaker 2 (26:19):
Being a journalist.
Speaker 1 (26:20):
Yeah, but actually that's a very good metaphor for what's
going on at the moment. Everyone's like, well, they say
it works, we'll be fun.
Speaker 5 (26:28):
Would they lie about it?
Speaker 1 (26:30):
Why would they possibly lie about chat GPT becoming super intelligence?
Speaker 2 (26:34):
Do you hear that?
Speaker 1 (26:35):
Fucking Sam Altman? That motherfucker. He's not here. He's not here. This is, by comparison, this is pretty real stuff. Where's Sam Altman? He should be giving like a vacuous keynote
about how this is going to replace doctors.
Speaker 5 (26:46):
He's doing the real work.
Speaker 1 (26:48):
He's doing the real work building by which he means
trying to like leverage his his startup capital.
Speaker 2 (26:53):
As I read today, we're just thousands of days away from.
Speaker 1 (27:00):
That was said a few months ago, so we're actually
about nine hundred and.
Speaker 4 (27:03):
That was a few thousand days.
Speaker 1 (27:07):
I love that. Honestly, I need to set a calendar invite for that, a few thousand days from now, because I have Sam Altman's number and he has yet to text me back.
Speaker 4 (27:18):
Oh I am.
Speaker 3 (27:19):
I am bummed that we haven't seen Palantir here in
recent years, because oh, they've.
Speaker 1 (27:23):
Seen you though I.
Speaker 2 (27:26):
Showed up. I can say this now. We showed up
with a flipper zero and we're just turning off all
of the televisions at the Pallenteer boo. They were doing demos.
Speaker 1 (27:40):
Oh, someone probably got fired over it. Actually, no, someone was probably like, oh, I'll fix this. No... someone did that at one CES and they got in a lot of trouble. Oh yeah, because they videoed themselves doing it, which is the classic thing you do with crime.
Speaker 2 (27:58):
Yeah, I remember that, which is why I did not
about it except the shelf exactly.
Speaker 1 (28:01):
You put it on a podcast years later.
Speaker 2 (28:03):
Yeah, I approved.
Speaker 1 (28:03):
You did it once. Yes. I did try a bit three or four times before I started getting anxious, which was: I would just pick up something and ask if I could have it. I'd just, like, pick up the edge of it.
Speaker 2 (28:15):
Can I have this?
Speaker 1 (28:15):
And they look at you and they're like.
Speaker 4 (28:19):
Like, sometimes they say yes, though, which is why you have to do it. We got a lot of free shit. You actually really can, surprisingly, walk away with some really odd stuff.
Speaker 1 (28:28):
I bet. Eventually a security person looked at me in a kind of "should I memorize this guy's face?" way, and I thought, I don't want to be arrested at fucking CES. No. My lawyer will just be like, again?
Speaker 2 (28:43):
I'm not coming to your arraignment. Yeah, I know. You're on your own, bud. I know you're paid up, but I'm not doing it.
Speaker 1 (28:49):
Fucking work this out, smart guy. Yeah, well, you pick up the judge's gavel and ask if you can have this. Quick, fucking hide it.
Speaker 2 (28:56):
Well, speaking as a judge, that's very offensive.
Speaker 1 (28:58):
Are you... you're a judge?
Speaker 2 (28:59):
Yes?
Speaker 1 (29:00):
What?
Speaker 5 (29:00):
No?
Speaker 1 (29:01):
This is not good.
Speaker 2 (29:02):
I am a municipal judge for the state of New Mexico. I got sworn in. I'm good.
Speaker 1 (29:05):
What the fuck?
Speaker 4 (29:06):
Right next door?
Speaker 2 (29:07):
Right next door, right by the border. A fan of mine is a judge and was like, did you know that judges can just make other judges? And I was like, I gotta, I gotta...
Speaker 4 (29:23):
You can make all of us judges.
Speaker 2 (29:25):
No, again, it's Interview with the Vampire rules. Okay, so the youngest vampire does not have the power to make other vampires, and it's kind of unclear to me.
Speaker 4 (29:35):
Your maker does.
Speaker 2 (29:37):
Yes, yes, I'm still at the start of this situation.
Speaker 1 (29:39):
But how do I become an older vampire, though?
Because you realize that my goal here is not just
to become a judge myself, but start making more judges.
Speaker 2 (29:47):
That I don't know, because I have not looked up anything about judging.
Speaker 3 (29:51):
This would be all I'd look up. We've got to ask people this, they have to know.
Speaker 1 (29:56):
I would get made a judge, and then the first thing I would be googling is how to make more judges, and I would be making those.
Speaker 5 (30:03):
Who are your first five judges that you're making?
Speaker 1 (30:05):
I mean, all of you, other than Robert obviously, you know. It would be like the entirety of Cool Zone Media. You make everyone judges.
Speaker 2 (30:12):
Yeah, and then we can all go We can all
go to New Mexico and do a Blood Meridian. Fuck yeah, yeah,
well yeah, there was actually a lot of child if
you send that guys back.
Speaker 1 (30:24):
So this is the format that you're going to have
for the entire week with Better Offline. Because this is
also something because after a year of doing this, I'm
just trying everything, and I do think that tech also kind of needs a talk show. And the reason I say that is tech has got everything. It's got finance, it's got basically sports team shit: do you like Anthropic? Do you like OpenAI? And the thing is, that really is like a, let's say, Cowboys-Jets situation, and
(30:47):
that's gonna hurt someone's feelings. I'm very happy about that. Anyway, moving on from them, you've also got a bunch of gossip and you've got a bunch of shitheads that you can be pissed off at, and actual real events. And I just think this format nails it. And that's
why I'm excited to start with all of you, because
at the end of the first day, everyone feels a certain way: dread for the rest of the week, sad
(31:08):
about the horrible things they saw, or just wondering why they're doing the fucking job they're doing. But no, none of us in this room. It's just very... it's an interesting way, when you talk to the tech media, to see how they are doing, but also people who work here.
I wish I could get more people off the floor,
but I don't want fucking companies on here. That would
be so boring.
Speaker 5 (31:26):
How did you feel at your first CES, when you were going around? So, I was just a child back then. It was twenty eleven.
Speaker 1 (31:34):
I can't do imagine. Yeah, when I was a young boy.
Speaker 2 (31:38):
That one.
Speaker 1 (31:39):
I have some stories I can't tell, but I was
a young I was a young PR person and I
loved it because it was like twenty eleven, so before
we realized tech had problems, we were just like ignoring them.
Speaker 2 (31:49):
Well, and this was back when the porn industry
trade show was at the same time.
Speaker 1 (31:54):
How I was terrified for them.
Speaker 2 (31:56):
I watched Steve Ballmer get out of a fucking hotel elevator with two porn stars that were even taller than him, and he is massive.
Speaker 1 (32:04):
It was.
Speaker 2 (32:05):
It was amazing, just for me. It was like watching Greek gods stride across the field. Yeah, the god of sweat. He did smell exactly like you'd think, crazy. And I leaked that to another tech pub.
Speaker 1 (32:19):
That's fucking journalism, right? But my first CES... it used to be fun when I was stupid, when I was just like, oh, this will happen. And also, in twenty eleven there were still new lands to conquer. They still had new fun things to find, new things that they could make, that they could actually make good on. I think it's been like five or six years of them just being like, we don't have it, there's nothing.
Speaker 2 (32:40):
That we've got, nothing else.
Speaker 4 (32:41):
It's it's like I was.
Speaker 3 (32:42):
So lucky that at my first CES the big thing was
like crypto NFTs and.
Speaker 4 (32:47):
So like like.
Speaker 1 (32:49):
So one of the lasting industries I knew.
Speaker 3 (32:53):
Going into that, like everything I hear here, everything I
hear in this convention is complete bullshit.
Speaker 2 (32:57):
Yeah.
Speaker 3 (32:57):
And then the next year I came, the big thing
was the metaverse. So again everything I hear here is bullshit.
Speaker 4 (33:03):
Yeah. And you've just been able.
Speaker 3 (33:05):
To like slowly watch the life and drain from the
people's eyes because they know, like every every year there's
like this new thing that's supposed to be like the
new thing, like this is this is what we have
done as an industry, and it and it and like
it doesn't.
Speaker 4 (33:19):
It doesn't work dry and and and it's.
Speaker 3 (33:22):
It's odd because like this year is the first year
where there hasn't really actually been anything new. It's just
it's the same as last year. It's just more AI
stuff Again.
Speaker 1 (33:29):
I actually think twenty twenty four was pretty ahead of
the pack. I was gonna say, this is this year.
Speaker 5 (33:34):
I keep seeing places be like, this is the year
of agents. They're gonna show agents.
Speaker 2 (33:38):
They were talking about that a lot in panels.
Speaker 3 (33:40):
Yeah, the one thing I've seen in the panel list
is like there's been a lot more integration for like
content creators and like streamers, right, And even saw this
with the Golden Globes a few days ago, how like
they let tons of streamers and influencers onto the carpet
for like the first time. And there's way more panels
this year that's about like how do we like get
(34:00):
this whole like streaming ecosystem that's like separate from like
legacy media and even like legacy like tech media, Like
how do we how do we get that into our sphere?
Speaker 2 (34:08):
And that was a big thing they talked about in
my third panel, AI Cinematics, Spatial and XR the next
level of creativity.
Speaker 4 (34:15):
I am putting a gun into that.
Speaker 2 (34:18):
They were specifically talking about how uh you know nowadays,
like you know, individual creators can be as big as studios.
And then someone of the panels like, well not really,
like not in.
Speaker 3 (34:29):
Any way that's actually made I'm excited to see the
new movie from Destiny personally, really excited to see he.
Speaker 4 (34:39):
Is trying that he actually is trying.
Speaker 1 (34:41):
I would love a movie that's just Wikipedia brought to life.
Speaker 2 (34:45):
There was a great moment in there, speaking of, like, failed CESes past, because XR stands for extended reality, which is just the metaverse.
Speaker 3 (34:53):
It's just the metaverse. But they can't say that because
they will get murdered, and that's.
Speaker 2 (34:57):
That's what they said. Well, they were like, you can't say metaverse anymore, but we all know it's a real thing.
Speaker 5 (35:05):
Did they say web three at any point? Oh, so much.
Speaker 1 (35:11):
Arrested people.
Speaker 3 (35:12):
They are characters. They still do say like they still
definitely say web three, which is insane.
Speaker 1 (35:22):
It is one. Yeah, well, I'm just shocked they still say metaverse with a straight face, even, like, talking about this nonsense. But also, XR is a great term because it's been around for a while, and it usually just means: I don't know if this is mixed reality or virtual reality, I don't know what device I'm selling this on.
Speaker 2 (35:40):
You guys... Also on this panel was Charlie Fink, who's like the oldest of the metaverse guys. He's a Forbes writer, like an old-school guy; he wrote a metaverse book in twenty seventeen.
Speaker 1 (35:53):
Honestly mad respect for actually like knowing stuff early. I
don't know if I'd agree with all of the assumptions
such as the metaverse existing.
Speaker 2 (36:03):
That was my big disagreement with him, that it's a thing, because he was just going on about how the metaverse is just, you know, attempts to extend the Internet into everyday life, and I was like, no, no, no, that's just...
Speaker 4 (36:14):
That's just the Internet.
Speaker 2 (36:16):
That's just everyday life. That's not the metaverse, that I can now order food to be delivered to my house fairly easily.
Speaker 1 (36:27):
I do miss that period in the media though, when it was just, like, anything. They're like, yeah, they're going to have share houses, and now we're going to live in the metaverse. We're going to live in the metaverse, and that...
Speaker 2 (36:38):
People will spend money on Nikes in the metaverse, and you'd have banks in the metaverse.
Speaker 1 (36:48):
Absolutely. Didn't they have like an Arby's in the metaverse?
Speaker 2 (36:55):
That's the only thing they got right. The only way I'm getting into an Arby's is if I know I cannot make physical contact.
Speaker 1 (37:05):
Someone too good for the for the the meat for
the beat. The meats are not for you. But also
what's important is I think we need to meet everyone
who goes to the digital arbies and then connect them
with mental health personnel, because not a red flag for
a list, but I think you've got like therapists could
be met. This is your therapy in the metaverse? Yeah problem,
(37:26):
So yeah, they're not doing it in the fucking real world if they're in the metaverse Arby's.
as well that everyone's like the metaverse is happening, and
then they're like, oh, how did you get in there?
It's very simple: we have this expensive and uncomfortable stuff. Does it make you sick? Sometimes, yes. Does it physically hurt? Yes. But when you get in there, it sucks.
Speaker 4 (37:49):
Vision Pro dropped less than a year ago, and did they.
Speaker 1 (37:52):
Discontinue it, No? They just no, there was oh my god,
I can't believe that review.
Speaker 5 (37:58):
There was one of the Apple products.
Speaker 1 (38:01):
They discontinue something that that sounds like Apple eventually, like
they just give up.
Speaker 2 (38:06):
Check that.
Speaker 1 (38:07):
No, no, no. So the Vision Pro... I know people are very upset, with the fake news. People are very upset with me about my Vision Pro review. Very unfair. It was my second episode.
Speaker 2 (38:18):
There is still learning, but got something there is.
Speaker 4 (38:21):
Sorry.
Speaker 3 (38:22):
The most reputable outlet, Game Rant, yeah, says that Apple
has reportedly ceased production of the headset.
Speaker 2 (38:31):
Really well and that's the New York Times of ranting Edge.
Speaker 1 (38:35):
Congratulations, you have won one off first.
Speaker 3 (38:41):
Apparently Game Rant is citing an article on another website called TheInformation.com.
Speaker 1 (38:51):
The Information? More or less, it's real-ish. It's real, it's The Information.
Speaker 5 (38:56):
I would just climb up that very high page. They've
improved crazy, yeah, so be able to get around every
single one now.
Speaker 2 (39:03):
But also it's hard. It's hard now.
Speaker 1 (39:05):
I feel like it was kind of insane. That was more insane to me than crypto, and crypto is pretty insane. But it's like, oh, how am I gonna get in the metaverse? This very shitty fucking headset that sucks. Vision Pro was pretty good. I know, my review... I know people are upset that I liked the product. There was potential there, they just did not access it.
Speaker 2 (39:22):
Apple has the money for there to be potential there. But I still maintain that Steve... and I'm not a guy, as you know, Ed, from hearing me for hours, I'm not a guy who worships Steve Jobs. I don't think he would have let that product out yet.
Speaker 1 (39:35):
No, no, it needs like two or three years. But it had potential even then. Though, this is the biggest company in tech, they're really the only company capable of making consumer electronics that are actually good anymore. Like, everyone else is kind of just mostly mocking things up and maybe putting shit together. Yes, there are companies like Frame, there are others, but this is the biggest, and they're like, here is our best.
Speaker 2 (39:54):
Go at it.
Speaker 1 (39:55):
And when I tried to watch Dune on my Vision Pro on a flight back from London, I got the worst migraine of my life. Which was... I was really enjoying the film, and actually the giant screen was super cool, but I was in so much pain. I took it off and I was, like, sweating.
Speaker 5 (40:11):
Well, that was the extended reality of Arrakis.
Speaker 4 (40:16):
It was extending your reality.
Speaker 2 (40:18):
Yeah, like I said, that is how it feels
to read past the third book.
Speaker 1 (40:30):
Okay, but what's important was, I really watched too much of that movie and I had like a migraine for, like, a day
Speaker 5 (40:38):
Or two, like you see the Golden Path.
Speaker 4 (40:43):
The Pain, but I did not.
Speaker 1 (40:44):
I watched like the beginning of the film. I do not know the plot. That's gonna be you, that Robert on the first episode of Better Offline will be the Shai-Hulud of content. I'm like, ah, yeah, as we go.
Speaker 4 (40:57):
That has been proven completely true.
Speaker 2 (41:00):
Yeah, I live from a different era in which all
of my friends in high school none of us had sex.
We all had very strong opinions about it.
Speaker 1 (41:07):
Well, let me tell you, I too come from a
rich line of not fuckers and not having sex in
high school at all. Absolutely. My thing was EverQuest. Okay,
one might say I was even cooler than you were.
Speaker 2 (41:20):
Yeah, it was WoW for me.
Speaker 1 (41:22):
Oh no, I got into WoW later. It was like getting off of cocaine to do heroin. Yeah, yeah, it barely worked on me. I just gave up after a while, turned to drinking. Anyway, it's just great that this is the tech industry now. It's just the group hallucination where we all go, yeah, fuck it, this, as we sit here and talk. And I know I'm grinding this axe quite hard, but we're hearing a lot about AI in
(41:43):
this fucking conference. Everyone's stapling ChatGPT and Anthropic's Claude to everything, and it's like, okay, great, but we are sitting here and every major company is not making a single fucking dollar, actually losing money on this, and
I feel a bit insane. I feel a bit crazy every day, because every day I look out the window in tech and we're just like, our biggest thing is everyone burning money. I've now seen two stories in the
(42:06):
last week from major publications going, what if AI is a bubble? Which is probably the most worrying sign. Like, The Economist had one that was like, what if a bubble bursting was good? What if it was good that the bubble burst? Which is insane, because there is a David Leonhardt story in the New York Times saying that a housing bubble bursting in '06, '07
(42:29):
would have been a good idea too. And let me
tell you, I don't know history super well, but I
don't think the housing bubble bursting went well for anyone.
Speaker 5 (42:36):
No, So.
Speaker 3 (42:42):
You need to have the disruptor mindset right, And the problem.
The problem with the bubbles is that as they get
bigger and bigger, it stifles innovation in other areas because
we're all focusing on this bubble, right, So as soon
as it bursts, now, now you have so many more
opportunities to really disrupt who knows how many other industries.
Speaker 2 (43:05):
Well, let me do. During the housing bubble, we had
to innovate by like eating ninety nine cent burritos, and
now four dollars of those stretched through some people.
Speaker 3 (43:12):
And now we have Airbnbs taking over half of entire neighborhoods.
Speaker 4 (43:16):
Innovation so much better.
Speaker 1 (43:18):
That's good. Innovate sleeping. And people were really... they were innovating ways to top themselves. When I moved to New York in two thousand and eight, that was great. Fortunate people. That's the thing, you're not really going to have people doing themselves in on this one. But I don't think people realize the bubble bursting is going to be very bad, for two reasons. One, all the tech stocks are going to take kind of a haircut. But also I think CES is kind of a bad omen, the fact
(43:41):
that this is basically twenty twenty-four's CES again.
That kind of speaks to the larger problem I've talked about before, which is they don't seem to have a new thing. AI isn't new, and that's not working, and they're like, well, what if we put AI in a... in a... We can laugh at the kind of, like, AI-in-the-oven thing, but really, beneath the surface here, and I look forward to seeing the shuffle on
(44:01):
myself for this, there is this undercurrent of: what the fuck do we do? What? What do we do now?
Speaker 5 (44:06):
What?
Speaker 1 (44:06):
What?
Speaker 5 (44:07):
Well?
Speaker 1 (44:08):
How do we make money now?
Speaker 2 (44:08):
What?
Speaker 5 (44:09):
What?
Speaker 1 (44:09):
What the money? And please?
Speaker 2 (44:10):
I think Ed had a really good point, which is
that obviously what they see is AI agents being the future,
and those kept getting brought up and what's one of them?
Speaker 1 (44:18):
What did they say to me?
Speaker 2 (44:19):
What's interesting to me is that obviously there's a lot of potential value in being able to do something like say, hey, I want to plan a trip to this, this, and this location, sure, and having a thing that can book it. That's not a multi-billion-dollar industry, but there's utility there. Or being able to plan out, like, five or six different meals that you want to, you know,
(44:41):
repeat over the course of a week or two, and then, like, plan that out. There's some utility there.
But the way people talk about AI agents isn't in
that way. It's about another living thing that they like
bounce ideas off of and invite socially into their lives.
There was a lady at the uh the first panel
(45:02):
today, which was about, like, AI in entertainment, who was like, "and my agent, obviously, she just is growing every year, every day," and was, like, talking about it like a person, gendering her AI agent.
Speaker 1 (45:14):
Just to be clear, what was she talking about?
Speaker 2 (45:17):
Her AI agent?
Speaker 1 (45:17):
Okay, but does this thing exist? No?
Speaker 4 (45:19):
No, no, Me and my rabbit have a very intimate relationship.
Speaker 1 (45:23):
Sure, sure, real fuck up pets man, believe No.
Speaker 2 (45:26):
No, she didn't say which product she was using, but
there are a couple of them out right now. Just to be clear, Character.AI was one people
Speaker 1 (45:33):
talk about. That's not an agent. Oh my fucking god, do I have to do this? Okay, yeah: an AI agent...
Speaker 5 (45:38):
No?
Speaker 2 (45:38):
No, no, that's what I wonder, what she was referencing, because it might have been like her Copilot or some shit.
Speaker 1 (45:42):
Copilot is also not an agent. An AI agent is meant to be an autonomous AI that goes out and does shit, and can do things and then respond to things based on how people react.
Speaker 2 (45:53):
A big part of it is like what Rabbit was trying to sell with the R1 last year.
Speaker 1 (45:56):
Rabbit wasn't even doing that. They were just doing, like, the very early stage of agents, which is, okay, you're technically correct, which is that agents are meant to take actions for you. That's the official definition. But what everyone's talking about with agents, and this is driving me fucking insane because no one really wants to talk about the reality here, is: I will staple a few fucking LLMs to each other, and then enough LLMs
(46:18):
will talk to each other that things will just start happening, kind of like The Incredible Machine, if you ever played it. Just like a Rube Goldberg situation, he says, not really knowing that reference. Pee-wee's Playhouse? I don't know. Nevertheless, this series of actions then theoretically allows this thing to do something at the end. The big common one right now is sales agents. These things are insanely expensive, and
(46:40):
I have yet to hear of one that does more than send emails and answer emails. And I've yet to really see anyone explain how anything else is fucking possible. And the thing is, people will say "agent" about anything. Character.AI is just an LLM company.
Speaker 2 (46:55):
That like they got a kid to kill themselves.
Speaker 1 (46:58):
Insane. But it also just, like, has a Daenerys Targaryen bot that will also be your therapist. It's fucking insane, of course. So that's not an agent. ChatGPT is not an agent. They want to say, and I'm so glad you brought this up, because this agentic thing is their next bullshit. But I cannot be clear enough: if you think regular LLMs are expensive, imagine a bunch of LLMs playing dipshit tennis with each other. That is how
(47:19):
agents work.
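(A minimal sketch of the "dipshit tennis" pattern being described, purely illustrative: a "planner" prompt and a "worker" prompt bounce text back and forth, and every hop is another paid model call. call_llm here is a stand-in placeholder, not any real API, and all the prompts, names, and numbers are invented for the example.)

```python
# Sketch of a chained-LLM "agent": two prompts pass text back and forth until
# the planner claims it's done. The only point is that one task fans out into
# many model calls. Everything here is a hypothetical placeholder.
from typing import Callable

def run_agent(task: str,
              call_llm: Callable[[str], str],
              max_turns: int = 8) -> tuple[str, int]:
    """Bounce a task between a 'planner' prompt and a 'worker' prompt."""
    calls = 0
    result = ""
    for _ in range(max_turns):
        # Planner decides the next step (one paid model call).
        plan = call_llm(f"Task: {task}\nLast result: {result}\n"
                        "What should happen next? Reply DONE if finished.")
        calls += 1
        if "DONE" in plan:
            break
        # Worker carries the step out (another paid model call).
        result = call_llm(f"Carry out this step and report back: {plan}")
        calls += 1
    return result, calls

if __name__ == "__main__":
    # Stub "LLM" so the sketch runs without any service: it stalls for a couple
    # of rounds and then declares victory.
    script = iter(["draft the email", "email drafted", "send it", "email sent", "DONE"])
    answer, total_calls = run_agent("book one sales meeting",
                                    call_llm=lambda prompt: next(script, "DONE"))
    print(answer, f"-- {total_calls} model calls to send one email")
```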
Speaker 5 (47:20):
You know, I think, you know, a16z is really big on this. There's a manifesto, one of their many manifestos, where they say every child will have an AI tutor that is infinitely patient, infinitely compassionate, infinitely knowledgeable, infinitely helpful, infinitely... and then they say that every person will have an AI assistant slash coach slash
(47:40):
mentor slash trainer slash advisor.
Speaker 2 (47:44):
How can it be a mentor? It doesn't know how to do things.
Speaker 4 (47:47):
Slash therapist slash therapist. That's good, that's good.
Speaker 5 (47:49):
It's infinitely patient and infinitely compassionate, infinitely knowledgeable, and infinitely helpful.
Speaker 3 (47:54):
They just want a friend without having to do any of the work that makes, like, having a friendship.
Speaker 5 (47:59):
Yes, not only.
Speaker 1 (48:00):
Makes these people need therapy. But also, on top of that, if everyone keeps talking about this idea, and I think they've been doing it for years, this idea of AI being your assistant, why are there no AI assistants that work? Yes, you can make your phone do reminders now. Yeah, we're on, like, the sixteenth, seventeenth iPhone, and we can now
(48:24):
make Siri most of the time understand.
Speaker 4 (48:26):
Something like half the time for me.
Speaker 1 (48:28):
But yeah, the British Siri actually understands me, unlike most people.
Speaker 4 (48:32):
I have set a Scottish Siri. But maybe, maybe... I like, I like the way...
Speaker 2 (48:39):
There was one of the panels where they were like, is voice the new operating system?
Speaker 1 (48:44):
I am going to kill myself, I will kill on.
Speaker 2 (48:47):
One person on the panel was British and was like, well, usually they don't understand what I'm saying. So, like, all of the Americans were like, oh, absolutely, and the one person with an accent was like, probably not.
Speaker 1 (48:57):
Okay. That makes me so angry, because when fucking Alexa came out... we did this one already. We've done this one already. I have already done pitching for clients about voice being the future. Guess what? It was not the future, because it turns out that talking is annoying.
Speaker 5 (49:28):
Well, I've been seeing discussions about this. I saw someone online try to be like, Scarlett Johansson fumbled the bag. Yes, and it's like, whoa, what bag? Yeah. They were saying that if she had simply let OpenAI use her voice, she would have been the future of AI voice forever, like...
Speaker 3 (49:44):
Achieved immortality, be like the AI voice for generations.
Speaker 5 (49:48):
I'm like, yeah, just like you know what happened with Alexa, right,
I don't think that happened with Siri. I know the
voice actors, all of them, I know them personally.
Speaker 1 (49:56):
Gramble Chambers my favorite, like what a.
Speaker 3 (49:58):
Bad, like, business decision for an actress, because now you'll have, like, kids who grow up with, like, Alexa machines in their house, and then they'll watch a movie and be like, oh wow, the Alexa is in it. It's like, it's like a voice servant.
Speaker 4 (50:16):
It's Siri.
Speaker 2 (50:17):
Casting directors will be like, well, Scarlett Johansson? No, she's that— she's that lady who everybody has to talk to to get a dentist appointment. Nobody wants her in a movie. Like, it's a terrible idea to be that.
Speaker 1 (50:29):
You mean Lost in Translation?
Speaker 5 (50:31):
I've watched Her a little too many times. That's really one of the, one of the factors that keeps coming up here, this idea that voice is gonna be the transcendent thing.
Speaker 1 (50:43):
Yeah, and also that it can do more than it can, because I don't think people realize — and I know this due to my accent — how fucking bad voice is, even in the best case. It just does not work. Sometimes the things it hears are truly insane. And the more data you have in your phone, the more numbers and names you have, the more insane. The idea is, I
(51:03):
don't need to text someone I knew from North Carolina eleven years ago, and I certainly don't need to text them about the burger I'm ordering. Was I ordering a burger?
Speaker 4 (51:10):
No?
Speaker 1 (51:10):
I was trying to do a meeting-related thing. But did this fucking thing hear me? But also, it just angers me that I have to hear this. It's been like a decade. I thought we got away from this fucking angle. No— very foulmouthed there. I apologize to the—
Speaker 5 (51:23):
Well, at the very least, at least there's no crypto.
Speaker 2 (51:26):
To us, right. Oh, I'm sure we'll hit some I
haven't run into it yet.
Speaker 4 (51:33):
I'm sniffing it out.
Speaker 2 (51:34):
No, I did, 'cause one of the women on one of the panels worked for Republic, which is an app that allows everyone to do, like, venture capital investing with as little as twenty-five dollars — sick — and yeah, and also invest in crypto at the same time.
Speaker 1 (51:50):
Why not?
Speaker 2 (51:51):
And the one review of it said like, it's a
great service if you want to diversify, but it's extremely
risky and everyone loses money that's not diversified.
Speaker 3 (52:01):
I guess people are still reeling from the Hawk Tuah coin, so maybe that might be more on the down low.
Speaker 2 (52:07):
Uh here, I really I.
Speaker 1 (52:09):
Love that she is probably in trouble with the SEC. I love that the SEC is—
Speaker 5 (52:14):
Gonna go somewhere with no extradition, tweet a bit, and then get extradited.
Speaker 1 (52:21):
People lack the moxie of the classic scam artist.
Speaker 2 (52:25):
Yeah, look, you.
Speaker 1 (52:27):
Do that when you're in Costa Rica. You you fucking
He was a.
Speaker 2 (52:31):
Real piece of shit. But I miss McAfee.
Speaker 1 (52:33):
Now, if McAfee — if he had just hung in there another year or two. My favorite John McAfee thing was, if you looked at the champagne he was drinking—
Speaker 2 (52:45):
It's like the cheapest shit. He didn't have that much money. Like, it was a shitty yacht. They had terrible guns, but he—
Speaker 1 (52:54):
Was eating, like, like, like a Walgreens cheese.
Speaker 5 (52:57):
Yeah, hell yeah, he's drinking, like, Korbel. Libertarian.
Speaker 2 (53:01):
But, like— which is— no one, no one, so no one— The Dominicans took his guns. Wait, did they really steal— They did steal it. They stole his guns. He sailed into a foreign port with a boat full of illegal guns. They— wow. Any country, you can't do that.
Speaker 1 (53:19):
You're not allowed to do that. As usual, the fucking fun police, the woke mafia. Oh, I can't sail into another port with a bunch of guns?
Speaker 2 (53:29):
This is obviously just because the Dominican Republic's so corrupt. It's like, man, yeah, they would take your guns. You can't do that.
Speaker 4 (53:36):
But also this is just like one thing of a
functioning democracy.
Speaker 1 (53:39):
If a very strange man.
Speaker 2 (53:41):
On all sorts of substances, clearly on drugs somewhere basically
trafficking human beings.
Speaker 1 (53:46):
If a guy turns up, just like— yeah, yeah, he sounds like Beavis but looks like—
Speaker 5 (53:50):
He's dying. It's like, yeah, you can't do that. On the boat, yo, yo, I'm like, hey, I have all of these guns, like, just fucking lying around — they're taking your boat back.
Speaker 1 (53:59):
Yeah, like like like you've killed a bunch of NPCs
on there and they've dropped their loot, like.
Speaker 2 (54:03):
And he did, in fact. I mean, like, none of these modern-day grifters can compare. They don't live interesting, bold enough lives to have racked up that kind of body count.
Speaker 1 (54:12):
John McAfee, he was like a, like a Blacklist character. He sure was. Like, yeah, Raymond Reddington.
Speaker 2 (54:20):
A lot of people who were around him died — his nephew died.
Speaker 1 (54:26):
His nephew died.
Speaker 2 (54:28):
They had, like, a paragliding thing that they did in the desert. They called themselves Sky Gypsies, and they traveled around, like, drifting from little airport in the desert to little airport in the desert, and, like, a seventy-year-old man and his nephew both crashed and died, and then he fled the country for Costa Rica because
(54:50):
his whole family got angry at him.
Speaker 1 (54:54):
This is why he left. Yes, that's why he went to Costa Rica. That is, first of all, insane.
Speaker 2 (55:02):
Yeah.
Speaker 1 (55:02):
Second of all, none of the tech people are this interesting. Absolutely not. Sam Altman has never fled a country. No, never. He might though. No, he won't. That's the thing.
Speaker 5 (55:12):
It's the process is like Zuckerberg dressing like Stan you
know from the.
Speaker 1 (55:16):
I love it. I love that he dresses like Kevin Federline.
Speaker 3 (55:20):
Yeah, yeah, but even all of that's just a Peter Thiel, like, bit. Yeah, it wasn't even his idea.
Speaker 1 (55:26):
So this is one of the many consultants that have crawled up Mark Zuckerberg's asshole, now being like, what you should do is dress like you've, you've just done a ZM pig, but you're also a nineties rapper, and he's like, oh huh, yeah, will—
Speaker 5 (55:41):
We ever get him rapping? You know those pictures of him?
Speaker 1 (55:44):
I would love him to rap. I actually should try. We really do need a Howard Hughes era and a crazy crone era of tech. I wonder what Mark Zuckerberg is going to be like when he's, like, eighty — like an Al Davis style, just—
Speaker 2 (56:00):
El director being yeah, just like a.
Speaker 1 (56:02):
Lich, just floating with you. He won't quit. I think he may Al Davis Meta. And for the listeners who don't know the NFL, Al Davis was the owner of the Oakland Raiders and now the Las Vegas Raiders, and he held onto the team until he looked like he was physically dying. And I really do mean, like, bits were falling off of him on air. It was fucking insane. This is actually what I hope for Meta
(56:23):
more than anything. I actually don't think Meta lasts another ten years, but if it does last decades, I hope it's Mark Zuckerberg just going crazy. You know, like, we're not running photos on—
Speaker 2 (56:33):
Anymore. Nothing but— the website is nothing but AIs of your dead grandparents and, like, literally loved ones, yeah. Zuckerberg running it alone.
Speaker 1 (56:45):
You couldn't make your pop-pop come back, you couldn't bring back your beautiful wife.
Speaker 5 (56:51):
That's what he's gonna run for president on? Yeah, that's what—
Speaker 1 (56:56):
Yeah. Yeah, I can't of course as a resident.
Speaker 5 (56:59):
Just want to really, you can't vote once, but you
can vote twice.
Speaker 1 (57:02):
No. No, people need to stop saying things that make
it sound like I am committing a crime.
Speaker 2 (57:11):
Don't joke about the title of our Signal group in here.
Speaker 1 (57:14):
No, we will not be saying that, nor will we be saying it's blowing up like a Cybertruck either. Yeah, I'm sure that was a great post for you. Sending it to my lawyer being like, it's a bit.
Speaker 2 (57:27):
Look, he's already dead.
Speaker 1 (57:29):
It's a bit. Yeah, I was.
Speaker 4 (57:31):
I was planning a bit.
Speaker 3 (57:35):
To bring my gray hoodie and my green jacket and my neck gaiter and just walk—
Speaker 1 (57:45):
That's the thing, though, the show's got big enough now that people are responding to me with stuff like that — Luigi stuff, like, shoot the man.
Speaker 2 (57:53):
People people love posting about crimes.
Speaker 1 (57:56):
People love to tell me the crimes they wish I'd commit.
Speaker 4 (57:59):
Yeah, that's always true.
Speaker 1 (58:02):
Yeah, yeah, I am— wow. I am not hitting anyone. I'm terrified of violence, both it happening to me and me committing it.
Speaker 2 (58:09):
The only drugs I do are the ones I legally buy in stores on the Strip.
Speaker 4 (58:13):
Yeah, mushroom gummies. We're like, what—
Speaker 2 (58:17):
Kind of mushroom is in the gummy?
Speaker 1 (58:19):
Yes, it's another introduction to the Consumer Electronics Show and this beautiful show I'm putting on. My boss, Robert Evans, sat down at the table and then produced a mysterious bag. I think the fairest description is, these are mushrooms I bought at a corner store. I don't know. A mushroom vape. There is also a mushroom—
(58:41):
everyone has been letting me know about the mushroom vapery. I do not know what it is. What kind of mushrooms?
Speaker 2 (58:48):
None of them tell you what.
Speaker 1 (58:49):
I'm not doing them, so.
Speaker 4 (58:50):
We would never vape in this hotel suite.
Speaker 1 (58:52):
Yes, thank you — two hundred and fifty dollars I will not get back. Anyway. This is my boss, the mushroom man, and this is what the week is going to be like, because we refuse to be held by the norms of tech coverage, in the sense that we actually want to have a good time. And I believe — and I've done a lot of things about the tech media recently, and this is not the fault of the individual writers, but the outlets themselves — I believe we have all lost
(59:14):
the fun in tech. I don't think people enjoy their gizmos anywhere near as much. No, no. I want to enjoy my shit. I love my phone. It's got problems with it, but I love technology. It's great.
Speaker 3 (59:24):
The funnest thing at this show can't be the LG giant screens — exactly — and that's why I'm so, I'm so thankful that you've put this thing together, because, yeah, usually covering this is, like, kind of, like, brain-breaking, and I love having something to look forward to that's not just the big screen, and also drinking as many, like, as many cocktails at Showstoppers as I can, which we
(59:45):
will be doing tomorrow.
Speaker 2 (59:47):
Night here every year.
Speaker 1 (59:48):
And the people of Showstoppers have been very unfair to me,
very nasty. They said that I'm a public relations executive
and they will not have me there.
Speaker 4 (59:56):
Will they still not let you into shit? No, they want to make me pay, and I'm not paying.
Speaker 1 (01:00:02):
They should pay me.
Speaker 4 (01:00:04):
Don't worry.
Speaker 3 (01:00:05):
I will, I will — oh no — drink a Showstoppers drink for you.
Speaker 1 (01:00:09):
Maybe it's Pepcom. They didn't let me in either way — if it wasn't Showstoppers, it was Pepcom, I think. Nevertheless, many mean people of Pepcom, if you don't let me in your show, I will find a way in there in spirit. This is a threat, which is mostly a threat.
Speaker 5 (01:00:23):
They're gonna blow down the door.
Speaker 1 (01:00:24):
No, no, someone is going to be there that will tell me what happened, and then I will talk about it. I will not be doing any crimes. Crimes, thank you. It's going to be the entire goddamn week, isn't it? This is going to be my entire week: Robert and Gare and Ed now, and Ed, Ed has joined
(01:00:45):
in saying, Ed, why don't you do crime? And I will be saying, I won't. I've never committed one, which is great. But this week — I think it's going to be a special week, because I think over the course of the week we're really going to realize how fucked the tech industry is. I think that there is some life in the hardware side, but I think there's a genuine fucking hole in the consumer electronics side that is
(01:01:06):
just bleeding, and I think people need to wake the fuck up about it. Maybe the reason CES is fucking miserable is because there's nothing here that's cool like that. Maybe that was why I liked it in twenty eleven. Twenty eleven, there was still— I don't remember very much from that.
Speaker 2 (01:01:19):
Google had a slide. Maybe that was the Mobile World Congress that year. Either way, there was a slide at one point.
Speaker 1 (01:01:24):
But also there was stuff that actually got made that was interesting, that solved needs. And indeed, it might be that they've solved a lot of the needs already. But it's like, fucking, how? No. Like, tech—
Speaker 3 (01:01:32):
Was evolving so much back then. Like, like, there were, like, genuine, like, massive, massive changes and improvements in, like, the availability of having a supercomputer in your hand at all times. And I've had the same iPhone for four years and nothing's—
Speaker 1 (01:01:50):
But that's actually the problem.
Speaker 4 (01:01:51):
Nothing is better.
Speaker 1 (01:01:53):
Maybe we're at the end, Maybe at the end of time.
Have we considered this?
Speaker 2 (01:01:57):
Yeah, I mean I have a buddy named Francis.
Speaker 1 (01:01:59):
Francis, okay, explain the joke for those of us who
get it but need the audience to hold.
Speaker 2 (01:02:05):
I mean, he wrote a paper at the end of the Cold War that was largely misinterpreted, about the end of history.
Speaker 5 (01:02:12):
Right.
Speaker 2 (01:02:12):
It really wasn't saying what it sounds like.
Speaker 1 (01:02:14):
He was saying sounds like a thing I should have
read but did not.
Speaker 2 (01:02:18):
It's okay, no one actually read Francis Fukuyama. We all just joked about it, like—
Speaker 1 (01:02:22):
A real Shoshana Zuboff situation. I don't know who the fuck— Surveillance capitalism. She wrote a seven-hundred-page book. If you read the first hundred pages, you're like, God damn. I use it
Speaker 5 (01:02:32):
To keep my door open so my cats can get
in my room.
Speaker 1 (01:02:35):
It is a very long— It's a very long book, and I say this as someone who does very long podcasts. But the first hundred pages, you're like, damn, you're really onto something. And the next six hundred—
Speaker 2 (01:02:43):
You're like an editor would have been helpful.
Speaker 1 (01:02:46):
Oh, maybe, definitely. But also the larger point is that capitalism is simply sick, that we must just fix the obvious problems, because otherwise — the tech companies would run well if capitalism worked well. And the answer is, shut the fuck up. Anyway, I'm gonna have Cory Doctorow on to talk about that book at some point, because he has some fucking views.
Speaker 5 (01:03:05):
He wrote his own book about the book.
Speaker 1 (01:03:07):
Yeah, but it is interesting, and I kind of want to— I think I'm going to try and ask everyone what they were excited about at this show, because I need to find out, first of all, who's a liar, and then, second of all, I need to find out if someone truly believes that something is exciting here.
Speaker 4 (01:03:23):
Why.
Speaker 1 (01:03:24):
I don't remember a single thing I've ever seen at CES other than one year where I saw a sign for a semen analysis thing with AI. I was heading out at the time, and I was not in a great mood, because otherwise I would have gone in and been like, can I try it? What do you think?
Speaker 4 (01:03:41):
Nah?
Speaker 1 (01:03:41):
Like, what did you see? The AI says it's bad?
Speaker 5 (01:03:44):
Huh. I'm not, but I'm pretty excited about that folding laundry robot that you guys were talking about.
Speaker 1 (01:03:52):
It'd be really funny to go over there and be like, where's my fucking shirt? Two years ago. It's my fucking shirt, man. He never gave me that shit back. Like KG yelling, uh, where's my money.
Speaker 2 (01:04:02):
I am a sicko laptop pervert, so I always look forward to, like, Lenovo's weird twisting, folding laptops. I've got their folding laptop now, and I love it. I do like their sick little laptop.
Speaker 1 (01:04:12):
Weird little laptop, and I wish it was weirder, that fucking laptop.
Speaker 2 (01:04:16):
Look, give me a weird laptop. Put four screens on the side. Have it fold down to the size of a napkin.
Speaker 1 (01:04:21):
That's what I want from CES. I don't mind if you don't make it. I just want it to be fun, and, like, hey, here's something that's plausible and we are not building it. Yeah, that's so boring.
Speaker 2 (01:04:31):
I don't get angry at the folding one hundred inch television. Yeah,
I would never buy one, but I am like, that
is a TV that's folding. Yeah, it's pretty cool.
Speaker 1 (01:04:39):
And I could see that would be delayed due to
the fact that you're doing the thing unassociated.
Speaker 2 (01:04:43):
And no one wants mostly associated with.
Speaker 1 (01:04:45):
Clothes, and also there is no market of any kind. But even then, they still make them. You still get, like, the fifty-thousand-dollar LG TV that can run on a battery.
Speaker 2 (01:04:54):
It's like that three-million-dollar glasses-free 3D television they used to bring around, where it's like, this fucker works, you just gotta stand in one specific area.
Speaker 1 (01:05:02):
But this is basically a giant 3DS, but it doesn't work well. Last year, I think their biggest TV thing was, like, a transparent TV.
Speaker 4 (01:05:13):
Yeah, yeah, yeah, I love that.
Speaker 2 (01:05:14):
That was so stupid.
Speaker 1 (01:05:15):
I love that, because even, like, the tech journalists — they weren't trying it. They were just reading off the fucking press release.
Speaker 2 (01:05:21):
And it's like, I.
Speaker 1 (01:05:23):
Don't fucking know what. I don't fucking know why. We have no more horizons. No, there's no more land. The rot-com bubble — in July, I fucking said that there's nothing left, we're at the end, and I will not take it back. Because it really does feel like I'm narrating the end of the world at times, and this show does not fucking help. It doesn't give me hope. I don't get— I don't see anything at
(01:05:45):
CES where I'm like, finally. No.
Speaker 3 (01:05:47):
But I think it's so crucial, because the whole thing about AI right now, the thing everyone says, is like, it's only going to get better, right? This is, this is the worst it's going to get. And the thing that you've been so good on, like, specifically for your beat, is showing how things don't actually, like, infinitely improve. Like, Google search is such
(01:06:08):
a good example that, like, no, sometimes things actually can get worse. We can't assume everything will just get better.
Speaker 1 (01:06:14):
Well, it hasn't, though. I think that that's the real thing, and this show kind of shows it, in that nothing has got better. They don't show us stuff here where I'm like— other than TV stuff. TVs are cool. I love TVs, whatever, big, big computers, fast compute, whatever. But, like, of actual things we've seen at CES, I, first of all, cannot remember a single fucking thing I've seen.
(01:06:35):
I just, I go in, I look at them for hours, and then my brain just leaves and I leave, and my brain's like, it's gone. Okay, that's going where Spanish, French, German and Latin went.
Speaker 5 (01:06:44):
So who are the winners? It's it's gamers, gooners, investors.
Speaker 1 (01:06:48):
The thing is, I don't even think those always win, right?
Speaker 3 (01:06:52):
Three groups that always keep winning: gamers, gooners, and tech enthusiasts.
Speaker 1 (01:06:59):
But there's not even stuff for gooners anymore.
Speaker 2 (01:07:02):
Oh no, no, no, no, you know you know.
Speaker 4 (01:07:06):
A true gooner.
Speaker 2 (01:07:08):
They're always DTG, as the kids say.
Speaker 4 (01:07:11):
We have been reporting on it. I am on the cum beat every CES.
Speaker 1 (01:07:13):
Okay. Well, I'd love to introduce my listeners to this, especially the ones that came in through the Very Serious podcast and the Lina Khan interview. That's the ones — the people hearing this from the Lina Khan one.
Speaker 2 (01:07:23):
Lina Khan, come on, come, let's talk about the FTC.
Speaker 1 (01:07:29):
No. No, she said she'd come back on theoretically.
Speaker 2 (01:07:34):
You know what you're doing, Jesus Christ.
Speaker 3 (01:07:40):
No, but I you know, twenty twenty five is the
year that we are all that we are all going
to have sex with robots.
Speaker 4 (01:07:45):
I'm excited for some kind.
Speaker 2 (01:07:46):
Of insane champagne room sex.
Speaker 3 (01:07:54):
No, I'm waiting for some, like, some insane, like, AI-powered, like, masturbation tool, because I know, I know there was one the first year that they did it, but there's gonna be a new one on the floor this year.
Speaker 2 (01:08:07):
They only sent us the sheaths they're actually handing out.
Speaker 5 (01:08:18):
This.
Speaker 4 (01:08:20):
You have to forge this one your it's somebody.
Speaker 3 (01:08:25):
It's, like, the most sex-averse group of people walking the show floor, and you have these, you have these people enthusiastically trying to, like, give people lube, and you're like, these people are, these people are—
Speaker 2 (01:08:39):
Some of them are from countries where you're not allowed to come.
Speaker 1 (01:08:43):
Yeah, they're actually telling me I can't. Anyway, moving on from that one. But I'm so sorry to, like, the respectable, like, table. We got, like, awards this year from, like, real business publications. There's some posts, like, I can't—
Speaker 4 (01:08:57):
Wait to him this is like, this is good. This
is actually a pretty big part of how the.
Speaker 1 (01:09:03):
Fuck do we get gooning into the first episode of
this goddamn because.
Speaker 3 (01:09:07):
Because — what else are you going to do with all of those curved, all of these curved TVs?
Speaker 5 (01:09:14):
Because what do you think they do with them after they fold the laundry? You know, they put them, put them off the line.
Speaker 1 (01:09:23):
Mistake.
Speaker 3 (01:09:24):
This is also, like, one of the, one of the biggest uses of, like, all of, all of these LLMs.
Speaker 4 (01:09:31):
It's just people using them as sex bots.
Speaker 1 (01:09:33):
And actually, you know what, I don't mind what people do with that, but I will say, I don't think that they've got anything else other than that. I don't think there's any other stickiness that's—
Speaker 4 (01:09:45):
Really walked into that fucking hell.
Speaker 1 (01:09:49):
No, I actually conquim it sticky.
Speaker 2 (01:09:51):
I learned a lot socially by meeting other weird freaks
on World of Warcraft, right and doing role play sex
with them. That was the way you grew up as
a seventeen year old.
Speaker 1 (01:10:01):
What race were you? And what race were they?
Speaker 2 (01:10:04):
I mean, usually Night Elves, obviously, of course. Okay, it was two thousand and seven.
Speaker 1 (01:10:09):
I mean, you could have been a Night Elf. You could have been a— no, no, they were good. No, you just, you—
Speaker 2 (01:10:15):
Just tried things on the internet back then. Lou is
a person.
Speaker 1 (01:10:19):
But nevertheless, it is funny that that appears to be the one use case, because people are just, like— just, like, millions of people getting into erotica for the first time. Now — this is bartender Phil, he just walked in and that was the first thing he heard. So, very good stuff, Phil. All the normal stuff is happening here.
Speaker 2 (01:10:37):
We're talking about how AI is helping people come.
Speaker 1 (01:10:41):
I'm so, I'm so sorry. By the way, some people listen to this podcast ostensibly to learn about the Rot Economy and the tech industry's collapse, and now everyone's talking about their willies — and who hasn't.
Speaker 3 (01:10:55):
I mean, this is a core part of the industry though,
like really, like this is why we have a little
driven yah.
Speaker 1 (01:11:02):
Yeah. Actually, pornography has always driven tech, which is why they pulled the AVNs away, because then they'd have something to aspire to. And also, that's a profitable industry, right?
Speaker 5 (01:11:11):
Yeah, you gotta remind people there's no money actually being made.
Speaker 1 (01:11:14):
Yeah. Like, oh my god, this is the thing. I really, I really didn't think about how likely it would be that someone would bring up gooning, let alone on the first episode. But I mean, this is still more fun than I've had at any CES. Yeah, because you go into these shows and they are definitely sexless. Yeah, and there are definitely guys in the tech industry where, like, if they
(01:11:35):
had, like, good sex, they'd be really happy, Elon Musk being one of them. But it is also weird how just, like, joyless it is, putting aside horniness deliberately. It does just feel very fucking joyless. No one seems to— I think it's a problem in the tech media as well. No one seems excited about anything, no one is even looking forward to it. It's like a group burden that we all share every year. I just refuse to have
(01:11:57):
it that way anymore.
Speaker 2 (01:11:58):
Yeah, it's uh. I think it's good. I'm excited for
the rest of this. I think we probably should close
things up.
Speaker 1 (01:12:06):
Oh okay, don't worry. I'm I was just planning to
and the MC Robert.
Speaker 2 (01:12:10):
Now, I gotta show you guys some weird AI videos
after this.
Speaker 1 (01:12:13):
And I have to peek so little as we do our outro. Now, I'm going to talk you through the rest of the week. Robert and Gare will be joining us again on Friday for one of our two slots. But the way this is gonna work is, you're gonna get this episode — you're listening to this around twelve a.m. ET, I assume that this is the one thing you do with your day — and then maybe about eight to ten hours later you're gonna get the second episode. Then
(01:12:34):
we're gonna have David Roth of Defector, who's gonna be joining us. My two elves are going to be here to look on the floor. I will be going to the floor — yes, I will — but not until Wednesday, due to situations. We will have a gaggle of wonderful reporters as well: Jesse Farrar of Your Kickstarter Sucks. We will have a real-life priest, an actual Dominican monk. I've got many more fun things in store. Robert, Gare, where can
(01:12:56):
people find you?
Speaker 2 (01:12:58):
You can find our show, It Could Happen Here. We'll be talking about some of the great AI-generated movies that I watched today during my panels. I do not look forward to seeing— Oh, you're gonna. Oh, I'm really unhappy. And yeah, that's it. You can listen to us on It Could Happen Here. Social profiles — I am on the Blue Sky. Alright, okay, I don't want
(01:13:20):
to tell you my Twitter. Get off Twitter. Don't go on Twitter.
Speaker 1 (01:13:23):
Gare, where can people find you? I am not following you, now you're following me.
Speaker 3 (01:13:26):
Now you're not following me On Blue Sky, right, I
responded to a post game.
Speaker 4 (01:13:30):
But yes, I am.
Speaker 3 (01:13:31):
I am on Blue Sky at Hungry Bootie dot blue sky dot social.
Speaker 1 (01:13:37):
I'm gray, so welcome.
Speaker 5 (01:13:38):
People can find me on Twitter and on Blue Sky — I am a big black Jacobin. And then I'm on The Tech Bubble, the Substack, mostly; the address is tech bubble dot substack dot com.
Speaker 1 (01:13:55):
Very good, and you can of course find me at Zitron — and Gab, and otherwise? No. No, I'm— I am not on Gab.
Speaker 4 (01:14:02):
I am not on anymore because it's just Twitter.
Speaker 2 (01:14:05):
But it's a secret account.
Speaker 4 (01:14:06):
I mean.
Speaker 2 (01:14:09):
Some people I'm following you to do a shoot.
Speaker 1 (01:14:12):
Just— no, I'm not gonna tell you shit, because you can see the episode links, and also Zitron is on— who the fuck else has a dumbass name like this? This is, of course, the first episode of the Better Offline CES Extravaganza Saga, of course mastered by the wonderful Mattosowski, who is here producing us in real time.
Thank you so much for listening, very excited for this week.
(01:14:38):
Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Mattosowski. You can check out more of his music and audio projects at mattosowski dot com, M A T T O S O W S K I dot com. You can email me at easy at better offline dot com or visit better offline dot com to find more podcast links and
(01:14:59):
of course, my newsletter. I also really recommend you go to chat dot wheresyoured dot at to visit the discord, and go to r slash Better Offline to check out our Reddit. Thank you so much for listening. Better—
Speaker 4 (01:15:11):
Offline is a production of cool Zone Media.
Speaker 2 (01:15:13):
For more from Cool Zone Media, visit our website coolzonemedia dot com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.