Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media.
Speaker 2 (00:04):
We're still here. We're still in Las Vegas. I'm still Ed Zitron, and this is still Better Offline. This is the second episode of day five, our last two-parter,
(00:25):
and I was trying to think of something glib and kind of sardonic to say, but I'm just gonna be honest: I've had one of the best weeks of my life. I've just been really enjoying myself. We've had so many really, really great guests. I've got David J. Roth, of course, from Defector, hello, and Ed Ongweso Jr., hello,
(00:45):
and Kyle Chouinard of the Las Vegas Sun, and all week has just been awesome, people. Kyle, I talked over you, I apologize. Just saying hello. Yeah, that's enough. So, Kyle, you're at Ed's show... okay, getting ahead of you. No. So you're a general assignment reporter. Yeah. So what have you been covering at the show?
Speaker 3 (01:05):
Well, it's pretty unique for me because, you know, a lot of the media that comes here is from, you know, across the country, across the world, looking at whatever new tech is coming, and I get to cover it from a local angle: how it affects Vegas and, specifically, you know, what's shown here that could be implemented in the city in the next couple of years.
Speaker 2 (01:21):
So can you talk a little bit about that? I'm a local. Yeah, I'm out in Green Valley, Henderson. Well then... no, it's still Las Vegas. No, it's true, in a way.
Speaker 3 (01:30):
Yeah, so, yeah, I was talking to a couple of different companies this week. One of them was Otonomus Hotel, which... they call it the first AI-powered hotel.
Speaker 2 (01:40):
Okay, can you define any of that?
Speaker 4 (01:42):
Yeah?
Speaker 3 (01:43):
So, a lot of what that means is collecting a lot of user data and then using that to personalize the experience. It's also a hotel slash apartment, so there's gonna be some split between those. But for the hotel, it's, you know, like remembering your coffee order, what direction you like your windows... and then when you come... What direction I like? No, like if you
(02:05):
like it facing south or west or north?
Speaker 2 (02:06):
Oh, sorry, okay, that makes more sense. For a second there... I've been brutalized by CES long enough that I was just like, okay, man, they've got moving windows.
Speaker 3 (02:14):
Now, no, the windows... I wasn't told this, but I'm pretty sure they're stationary.
Speaker 4 (02:20):
Uh, saying somewhere else.
Speaker 3 (02:23):
So, yeah, there's a lot of pretty much just data scraping from the week you're there. I asked about, you know, data privacy and stuff like that, and they were all about, you know, whatever data they collect, the user is giving it to them.
Speaker 1 (02:34):
And do they sell it? I imagine no, they said they weren't.
Speaker 5 (02:39):
Well, really more regifting in this scenario.
Speaker 3 (02:43):
No, they said, you know, they put a lot of emphasis on keeping that data very secure. You know, I was talking with the Culinary Union...
Speaker 2 (02:51):
No, no, but sorry, I have to push back. Yeah, yeah. Secure is not the same as not sharing it. So, yeah, have they been remotely giving on that? I mean, if the answer is you don't know... Yeah, I'm not sure. No, no, but this is not a failing on you. This is them being like, thank god, thank god, we don't have
(03:12):
to say the thing, which is: we are selling this to The Points Guy and airlines and hotels. Because that's the thing, these hotels around here are like data warehouses. Oh yeah, the amount of shit they collect on you is crazy, not just through them. Actually, this is a question: do you know anything about data collection practices here? Because I don't either, so if you...
Speaker 3 (03:33):
I'm not super familiar with it. I mean, surveillance
is nothing new in Las Vegas. You know, it's not
rare to find hotels working with the government. That's pretty normal.
Speaker 2 (03:46):
In what ways do they work together?
Speaker 3 (03:47):
Oh, you know, the FBI, if they're looking for someone. I'm not...
Speaker 1 (03:52):
I honestly don't know the specifics.
Speaker 2 (03:54):
If there's even... because I love living here, but I'm also a weird, greasy freak, and I understand what Vegas is, which is: you walk in here... if you've not been to Vegas, if you haven't spent a lot of time here, there are cameras everywhere. Yeah, everywhere you go. Not in the rooms, I think, but in the hallways, in the casinos. You may think that the scariest place
(04:16):
cameras can look at you is a bank. It's actually a casino. They are watching, and you can make the glib "ooh, the eye in the sky" joke. No, for real, though, it's one of the safest places to be. Yeah, because they are watching you, and they're watching you because you could do stuff with their money, which is not good. But anyway, continue. So you've seen this AI-powered hotel. Yeah,
(04:39):
and it's...
Speaker 1 (04:39):
Opening in the next couple of months.
Speaker 2 (04:41):
Which one is it? Where is it?
Speaker 3 (04:43):
It's actually by Allegiant. I think it's, like, a mile away from Allegiant. It's not on the Strip.
Speaker 2 (04:47):
Oh, so it's one of the closer places.
Speaker 1 (04:49):
Oh, what? Yeah, walking to Allegiant's fun.
Speaker 2 (04:53):
I have season tickets. Man, I want to die.
Speaker 5 (04:56):
Can you walk it?
Speaker 3 (04:58):
You can walk from the MGM. Yeah, it's not the Delano anymore, it's the W, but I usually just park there and walk. Yeah. I went to the Syracuse game.
Speaker 6 (05:08):
So, yeah, this is a question that I've always sort of struggled with here, and, you know, in Los Angeles as well, as a New York person: I like the idea of being able to either take mass transit or walk to a thing, and yet, like, Las Vegas isn't big enough... like, theoretically, your phone will tell you that's a forty-five minute walk, and yet I feel like most of that is just fully impractical.
Speaker 4 (05:29):
Yeah.
Speaker 2 (05:30):
I mean, it wasn't that bad to get to, though.
Speaker 1 (05:32):
I was talking with... oh god, what's the CTA... not the CEO...
Speaker 2 (05:36):
The other guy? Oh, I do not know, don't worry.
Speaker 3 (05:39):
I was talking to him, and he was like... you know, I was asking for a bit of advice on CES, and he's like, every walk is longer than you think it is.
Speaker 2 (05:46):
That's actually great conference advice.
Speaker 4 (05:48):
Yeah.
Speaker 2 (05:48):
So, but I'm still kind of confused by this AI hotel. What else does it do other than allegedly remember my preferences? Because I have an idea: you could have some sort of database of sorts, like a place for data, and you could put the data in that base, and then you could simply remember which way my thing... What
(06:11):
does AI do with this bit?
Speaker 1 (06:13):
So a lot of it's with their app. So they
have this thing.
Speaker 2 (06:16):
Oh, it's KEE.
Speaker 3 (06:17):
It's called KEE, and they described it as, I have it here, a "twenty-four-seven butler in the palm of your hands."
Speaker 5 (06:28):
Sure, so it's just a lot of the requests that.
Speaker 3 (06:30):
You'd be making, not really having to go through a human, just saying it into your phone right from your room.
Speaker 2 (06:35):
This doesn't add functionality to the hotel, surely, because the butler brings you things.
Speaker 3 (06:40):
Yeah, so a lot of the functionality, I guess, is just getting to avoid humans. I mean, I asked them about that, like, hey, how many humans are going to be working here? And it's around thirty.
Speaker 1 (06:51):
Which, for a hotel, is not a lot. I mean... that's what they told me. I couldn't verify it myself.
Speaker 2 (06:57):
This isn't your company.
Speaker 5 (06:58):
Don't worry.
Speaker 2 (06:58):
I'm not mad at you.
Speaker 5 (06:59):
But there's like a.
Speaker 4 (07:01):
Full time and part time employees.
Speaker 1 (07:03):
That's what they say.
Speaker 3 (07:04):
For, like, Las Vegas, three hundred rooms is not that big.
Speaker 5 (07:07):
That's still pretty big.
Speaker 3 (07:08):
Yeah, like, that's ten rooms per worker. But a lot of it also is apartments, and you don't need as much staff.
Speaker 6 (07:14):
It is funny that that's basically where the AI thing... it goes from being like, it's an application on your phone, but it's AI, to suddenly... you get to where the rubber actually meets the road on all of this stuff, which is: fewer.
Speaker 5 (07:28):
I am going to stay there and I am.
Speaker 2 (07:30):
Going to do an episode, okay, and I'm going to
have a piss fit.
Speaker 4 (07:34):
You're going to disappear.
Speaker 2 (07:37):
Well, I will be disassociating. That's... so it's just frustrating, because it's like, theoretically, an AI hotel could work, in the sense that if they were to find user preferences, they could just kind of move around them. I'm not talking about generative AI. I'm talking about, theoretically, algorithms that are capable of knowing a
(07:58):
user's preference, but only in a much larger hotel system, like, I don't know, Marriott.
Speaker 3 (08:04):
Yeah, well, it's interesting you say that, because I asked them about that. I asked, you know, is there any interest from hotels in the area? And they said, you know, currently we're trying to get everything up and running and ready to go, but that there was a lot of interest from other hotels in kind of the system they're making. And right after I got my interview, I saw a representative from a Vegas hotel
(08:25):
come by and talk.
Speaker 2 (08:26):
Because this isn't entirely out of the realm of possibility. Thinking about Marriott, for example... I don't know why I'm talking up a publicly traded hotel firm, but, like, Marriott's is pretty decent, like, you can put your shit into it. And the frustration I have with the AI part is that, yeah, I've specified what kind of hotel pillow I like, all the things, very different from hotel to hotel, that I
(08:48):
like in a hotel stay. I'm just not sure what else this does, other than what they're suggesting, which is: what if we just had fewer people?
Speaker 6 (08:56):
Well, it's the same thing as the stuff that we were talking about yesterday with the smart homes. There's a lot of that with AI-related stuff, and it seems like some of it is a degree of convenience that doesn't just verge upon but goes fully into infantilization, or just... I mean, which is the place where you get infantilized. That's more of
(09:17):
what you'd expect, in some ways. It makes more sense for a hotel. Isn't that what a hotel's for?
Speaker 2 (09:22):
Right now?
Speaker 5 (09:22):
You're, like, removing variables from the equation that, like...
Speaker 2 (09:25):
It also feels like there are very obvious AI things a hotel could do, such as making check-in quicker, making sure the cleaning is done based on when... Like, I don't know, I ain't no tech doer, but I don't know... if I had an algorithm that would say, and I'm sure Vegas has these, where it's like, okay,
(09:45):
we have a hotel of X size and Y number of people come in on a Friday, so we can say that we need this much staff, but we have this one person.
Speaker 5 (09:54):
It definitely has that.
Speaker 3 (09:54):
Yeah, so this isn't that. No, this is about the experience for the guest. And, I mean, to be fair... a lot of hotels, and I talked to a professor, I talked to a lot of people about this, a lot of hotels have just a billion different legacy systems running every single function.
Speaker 2 (10:09):
Which feels like the thing to upgrade rather than this. Which is... and this is... that's kind of the point of what they're trying to do.
Speaker 3 (10:14):
It's one completely integrated system for the entire hotel.
Speaker 1 (10:18):
You know, everything is.
Speaker 2 (10:19):
They're gonna have so much fucking trouble selling that. So, yeah, sorry, there's just no way. It's why we have airlines on nineties computers. Like, you think the Grazie system at the Venetian... yeah, people know I use the Venetian a lot, they do... do you think they're going to upgrade an entire multifaceted system?
(10:39):
It seems unrealistic. That's the Venetian.
Speaker 1 (10:41):
Yeah, when I was talking to a professor from UNLV, great guy...
Speaker 2 (10:45):
You're a real journalist, also, to be very clear here.
Speaker 3 (10:49):
Great. I'm trying to be very, very accurate in my wording. One thing he told me is that that's one of the main reasons. I mean, hospitality in Vegas is kind of unique, but hospitality as an industry is not known for being on the cutting edge. It's known for, to put it kindly, being very stubborn and not really changing. And one of the reasons they're kind of stubborn and
(11:10):
don't really adapt to the times as quickly is because of that: there's a million systems interacting with each other, and when you change one... oh god, that took down everything.
Speaker 2 (11:19):
It's an equilibrium thing. You can't just mess with one
part of a hotel.
Speaker 3 (11:23):
And that's kind of why, for a new hotel, for them it's like, okay, let's not make a billion systems and then have to integrate them all later.
Speaker 1 (11:31):
Let's just get it right at the outset.
Speaker 2 (11:33):
They can't be the first Windows of hotels, though. They cannot.
Speaker 3 (11:37):
Yeah, I'm not sure if there are other companies doing that, but they were emphasizing that with it.
Speaker 2 (11:40):
I'm just imagining startups will do that. So what else
have you seen moving off of hotels?
Speaker 3 (11:44):
Yeah, I mean, I think one of the most interesting things I saw was from this company Sorenson, which I believe is based in Utah. They're in the West Hall. And so we were talking earlier about this being kind of like a decent use of AI, like there are...
Speaker 5 (11:57):
Still good uses of AI. No, I would love to hear it.
Speaker 3 (12:00):
And so it's a real-time translator that works specifically for, like, longer-form presentations. In a city like Las Vegas...
Speaker 1 (12:08):
That's obviously very important.
Speaker 3 (12:10):
You know, we have a convention authority. And the way it works is that once it's set up by the event, all you do is scan a QR code, and then you can have a real-time translation of whoever's speaking at the front, on your phone, in, I think it's, twenty-five languages. And one of the cooler parts about it is that they also trained it with different dialects.
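[Editor's note: the attendee-side flow Kyle describes — scan a QR code, pick a language, receive live translated captions on your phone — might look roughly like the sketch below. The URL scheme, endpoint, and parameters are invented for illustration; Sorenson's actual service is not documented here.]

```python
# A hedged sketch of a scan-a-QR-code live-caption flow, assuming a
# hypothetical service: the QR payload is a join URL carrying a session ID,
# and the client then polls a captions endpoint for new translated segments.
from urllib.parse import urlparse, parse_qs

def session_from_qr(qr_payload: str) -> dict:
    """Pull the session ID and requested language out of a scanned QR URL."""
    params = parse_qs(urlparse(qr_payload).query)
    return {
        "session": params.get("session", [""])[0],
        "lang": params.get("lang", ["en"])[0],  # default to English
    }

def caption_request(session: str, lang: str, after_seq: int = 0) -> str:
    """Build the (hypothetical) polling URL for caption segments newer than after_seq."""
    return f"https://translate.example.com/v1/{session}/captions?lang={lang}&after={after_seq}"

info = session_from_qr("https://translate.example.com/join?session=keynote42&lang=es")
print(caption_request(info["session"], info["lang"]))
```

The point of the sketch is that the event does the heavy setup server-side; the attendee's phone only needs a URL parser and a polling loop.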
Speaker 2 (12:29):
Okay, I was gonna say, so is this generative AI?
Speaker 1 (12:32):
It's a lot of training data, Robert.
Speaker 2 (12:36):
Is it a generative model?
Speaker 5 (12:37):
Is it?
Speaker 2 (12:37):
And you a technology?
Speaker 4 (12:38):
Is it?
Speaker 2 (12:38):
Basically?
Speaker 1 (12:39):
I believe it's a generative model, right.
Speaker 2 (12:41):
So my one concern with that, and I'm glad that you mentioned dialects: everyone that I've talked to about it... I can't speak any other languages, I can barely speak English, I thus have no experience with it, but everyone I talked to is like, there are these subtleties. Yeah. So how does it deal with that, even with extra dialects?
Speaker 1 (13:02):
So, like all good things...
Speaker 3 (13:05):
It's a very human solution to that, where they just have a lot of really well-trained people that, you know, pretty consistently are checking the models to make sure that they're working. They don't just set it to the side... they were really big on this... they don't just set it to the side and hope it works. They are pretty consistently checking it with a group of trained professionals.
Speaker 2 (13:24):
One of the reasons I love having you here is being able to respond with this, which is: Vegas is quite intolerant of bullshit. Weird, for this place. You can bring whatever you want to CES, yes, but it's like, oh, you want to sell to our beautiful casinos, with our beautiful slot machines that bring us money?
Speaker 5 (13:40):
We love our money.
Speaker 2 (13:42):
Thank you, thank you. You understand, we love our slots. It's why we don't have state income tax. Our beautiful slot machines, we love them, folks. But it's like, Vegas is quite intolerant of just shit that don't work, because shit that don't work is extremely unprofitable.
Speaker 1 (13:58):
Yeah, and a lot.
Speaker 3 (13:59):
When you're working in the hospitality industry, your main job
is to keep people happy.
Speaker 1 (14:04):
And when you.
Speaker 2 (14:04):
Want Vegas people are babies.
Speaker 3 (14:08):
And when you are running into tech issues... and this was, again, this professor. He's giving me all my lines.
Speaker 2 (14:15):
What was the professor's name.
Speaker 1 (14:16):
Oh, let me get it out.
Speaker 5 (14:17):
Don't worry about no, no, no, no.
Speaker 2 (14:19):
Look, this show can be quite cynical and shitty, and I say words that people don't like, and they get upset with me, and they email me every day, and they say...
Speaker 4 (14:29):
Ed, I hope you die.
Speaker 2 (14:30):
And I imagine the Cybertruck hitting you, Ed. I imagine the Ford F-150, Ed. I saw a Ford F-150 Raptor hit you; you bounced.
Speaker 1 (14:41):
His name is Mehmet Erdem.
Speaker 3 (14:43):
He is a... sorry, the chair of UNLV's Resorts, Gaming and Golf Management department.
Speaker 2 (14:48):
And I imagine a Prius. Anyway. But that's the thing. I want these people on here, because there is a thing I love about Vegas, where there is an honesty to its dishonesty. It's just, you can't just fling shit here, because it's a very working-class city. It's a pragmatic city in many, many different ways.
(15:09):
So it's kind of like... I'm more willing to humor the idea that they would have this translation thing. Yeah. Just because, putting aside all my feelings, Vegas would simply be like, nah, this fucking sucks, it's going to get between the customer and the slots.
Speaker 3 (15:22):
But I mean, when you're running a conference... I mean, especially conferences like CES, having real-time translation when, I think, every panel's in English...
Speaker 2 (15:29):
Yeah, Anglo... well, Anglophone.
Speaker 3 (15:33):
Anglo-centric, you know. I'm talking to people and there's a billion languages here. And you mentioned working class... another part of this is, for a story I'm publishing later, I talked to the Culinary Union about their tech protections.
Speaker 2 (15:47):
Cool. What did they say?
Speaker 1 (15:48):
So they have they've.
Speaker 3 (15:50):
Been working since twenty eighteen, probably longer actually, but they've
got protection since twenty eighteen, specifically regarding tech replacing workers
with that. When tech gets introduced to that would affect
someone's job, they have to and they negotiated this. They
get a six month notice and part of that there's
there's kind of two things that come out of that.
(16:11):
One, it gives them time to kind of work out the kinks. One example Ted Pappageorge, the Secretary-Treasurer, told me was there was this new system for housekeepers that basically was sending them all across different zones, all across different floors, and they're like, hey, this is... you're gonna
Speaker 1 (16:32):
Break the backs of the workers here.
Speaker 3 (16:34):
Yeah. And so with that time, they were able to get a fix, no problem. And what it also does: if a job gets eliminated, or if a position is eliminated... like, Vegas has... there's plenty of jobs, especially in the hospitality industry. So it gives people the time to find somewhere else within, you know, the industry,
(16:56):
stay with the union, and, you know, keep their pension. And there's also a pretty decent severance package if your job is eliminated by the tech.
Speaker 2 (17:05):
So fucking cool. A working union, man.
Speaker 5 (17:08):
No, I simply accept no substitute thing.
Speaker 2 (17:10):
Like, I live here and people are like, Ed, you live in a vending machine. And that's why I like it. But also, there are actual really strong unions. Yeah, there's a strike. I've been covering it, at Virgin. Yeah, yeah. So what's the strike? I care way more about that than the...
Speaker 3 (17:27):
Oh yeah, yeah, yeah. So they've been on strike for a little bit now. Regarding... who is this? This is Virgin Hotels, right, with the Culinary Union. The latest proposal, and this was from a little bit ago... well, the original proposal, I believe, was no raise for the first couple of years of the contract, and this was from Virgin. From Virgin. And then the secondary proposal,
(17:47):
at least the secondary proposal I heard about as a journalist, was a consistent thirty-cent-per-hour raise per year.
Speaker 2 (17:54):
Wasn't the argument from Virgin that they couldn't afford anything?
Speaker 1 (17:56):
Yeah?
Speaker 2 (17:56):
And how true was that?
Speaker 3 (17:59):
So it's hard, because they're a private company. Yeah. So, kind of. They used to have their casino run by Mohegan... you can look, like, the tribe, right? So you can look at that data, and their casino, or, I think it was the only one that was
(18:20):
I think losing money, which isn't great. So, again, they do have a point: Virgin is not a super successful hotel when it comes to its casino. But what the union keeps pointing at is, you know, these giant corporations... a lot of them are. There's the LiUNA pension fund, one of them in Canada,
(18:40):
that owns the hotel part of the ownership group. And then there's another company called Fengate. So, you know, the company points to, you know, we're an off-Strip property,
we can't give you Strip pay. And the union's pointing at their management saying, you definitely have the money. If you want to invest in Vegas, you have to invest
(19:00):
in the workers, and they've consistently said go back to Canada.
Speaker 2 (19:08):
Yes. So I just want to be clear: Kyle can't say this and is in no way offering any opinion on what I'm going to say, which is: solidarity. Now, fuck you, Virgin. Now, moving on: what else have you seen that's Vegas-related while you've been here?
Speaker 1 (19:23):
Yeah, So the other.
Speaker 3 (19:24):
Major Vegas company I was looking at was a robotics company that's been here for a while.
Speaker 1 (19:28):
That's Richtech.
Speaker 2 (19:29):
Okay, tell me about them.
Speaker 3 (19:31):
So they work on a couple of different things. So you have this... it's called ADAM. It's a bartender robot. And when I was at the booth at CES, it was kind of interesting, because I...
was kind of interesting because I.
Speaker 1 (19:42):
Didn't see demos of the robot making drinks.
Speaker 2 (19:47):
But it was a bartender.
Speaker 3 (19:48):
Yeah, it's like a bartender robot, so it has two arms, it can do everything. And the thing that surprised me was, like, the main thing that brought people over to the booth was when the robot was dancing to "APT." by Rosé and Bruno Mars, and I think...
Speaker 2 (20:02):
I haven't been familiar with a song since two thousand and seven.
Speaker 6 (20:07):
It is funny that they had the robot bartender not making drinks, but it was, like...
Speaker 1 (20:11):
It might have been at some point. When I was there, it wasn't.
it wasn't.
Speaker 6 (20:13):
It was consoling another robot, right?
Speaker 3 (20:22):
Going back to that... but the thing is, it was interesting, because it was drawing a crowd with its little dance to the music. And I was talking to someone with the company, and they were like, you know, the...
Speaker 1 (20:34):
The show of it, the spectacle.
Speaker 3 (20:36):
I asked, like, you know, at what point do we get past the spectacle and into it just being there?
Speaker 1 (20:40):
And he's like, spectacle's part
Speaker 2 (20:42):
Of the cell. This is Vegas quiet, spectacles part of
the cell.
Speaker 5 (20:49):
Yeah, that's also CES too, though, right? Like, it seems like the Vegas...
Speaker 2 (20:52):
Is honest about it.
Speaker 5 (20:53):
Yeah, yeah uh.
Speaker 3 (20:56):
And then they had another system that was a lot more utilitarian. And you also, by the way, can find some of their robots at all the Boyd Gaming properties... or not all of them.
Speaker 2 (21:05):
But what are the Boyd Gaming ones?
Speaker 6 (21:07):
That's, like, the off-Strip... that's Sam's Town and those ones. Yeah, somebody was telling me about... there was the... it's...
Speaker 3 (21:12):
At the Orleans Alante. I might be pronouncing that wrong. Sorry,
I'm new here.
Speaker 1 (21:17):
And Suncoast. Cool.
Speaker 3 (21:19):
So they'll have, like, delivery robots delivering food, and then this other robot, Skylark, and that's more like getting a robot to clean floors and make deliveries and stuff like that. And so, I mean, you know, like all things, one of the main things preventing it from mass adoption is price. What was it... they used to
(21:40):
have... a lot of the robots used to be, like, you just buy it outright. I think it was, like, one hundred and eight thousand dollars. But now they're working off more of, like, a subscription model.
Speaker 5 (21:51):
Baby much of that maybe.
Speaker 2 (21:55):
Do exactly.
Speaker 3 (21:56):
So you have the bartender robot, which is... I have the price here. Can it tend bar, clean... I mean, like, actually make drinks reliably?
Speaker 2 (22:06):
How does it accept the drink order?
Speaker 5 (22:09):
Yeah, I'm not sure how it's aware of the drinks.
Speaker 1 (22:11):
I'm not sure.
Speaker 4 (22:12):
I don't know... you make a little hand motion? But it also probably... does it also come with, like, an inventory
Speaker 3 (22:19):
system? Like, I'm not sure exactly how its inventory system works, but I know for, like, Tipsy Robot, which is kind of like the other one... that's actually at the Venetian.
Speaker 6 (22:28):
Yeah, there's one downstairs, and that has, like...
Speaker 3 (22:32):
That's, like, a point-of-sale system, where you just kind of tap on a tablet and it gives you the drink. So I imagine it will be something similar to that. I'm not positive, but I imagine it's probably a similar system.
Speaker 4 (22:43):
to the robot. Because it feels like it's probably not too hard for them to be like, okay, as long as we catalog and keep every single drink in a certain place, and maybe if you make a request that's outside the bounds of the inventory, then it says...
Speaker 5 (22:59):
Do you mean...? Yeah.
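[Editor's note: the guardrail described here — a fixed drink catalog with a "do you mean...?" fallback for anything outside it — can be sketched with fuzzy string matching. The catalog and thresholds below are invented for illustration; nothing in the episode describes how Richtech's or Tipsy Robot's systems actually work.]

```python
# A sketch of catalog-bounded order taking: exact matches get poured, near
# misses get a "did you mean" suggestion, and everything else is refused
# rather than attempted. Uses only the standard library's difflib.
import difflib

CATALOG = ["margarita", "old fashioned", "mojito", "negroni", "whiskey sour"]

def take_order(request: str) -> str:
    req = request.strip().lower()
    if req in CATALOG:
        return f"pouring: {req}"
    # Suggest the closest catalog entry if it is similar enough (cutoff is a guess).
    close = difflib.get_close_matches(req, CATALOG, n=1, cutoff=0.6)
    if close:
        return f"did you mean: {close[0]}?"
    return "sorry, that's not in the catalog"

print(take_order("mojito"))    # pouring: mojito
print(take_order("mojitto"))   # did you mean: mojito?
print(take_order("motor oil"))
```

The design point is the one Speaker 4 makes: keeping the robot inside a known inventory turns an open-ended request into a lookup problem, which is much easier to make reliable.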
Speaker 2 (23:02):
I feel bad about these, though, because there are two cities that have my heart, New York and Las Vegas. I got here in a weird way, and I will probably leave here in a weird way. But the thing is, the bartenders here are fascinating, fun to talk to. I don't want a robot replacing them, because, first of all, I don't believe
(23:22):
the robot will do as good a job. But also, that was not me coughing for any ironic reasons, I really was just coughing. The bartenders here rock. But also, like, the accumulated experience of witnessing Vegas is what makes a bartender. Vegas is all about people watching, yeah, but also people experiencing. Yeah.
Speaker 6 (23:42):
This kind of comes back to another one of the things that we keep bumping up against going down there, this idea that somehow the important thing... I understand it from a business perspective, but from any other perspective, the idea that you want to remove human interaction from every process and every transaction...
Speaker 1 (23:58):
The thing is... I keep saying this.
Speaker 3 (24:01):
I asked them about that, and both this company and, I think, Otonomus also said this: they don't want to replace humans. They want them to, like, work alongside them.
Speaker 4 (24:14):
They want to augment them.
Speaker 1 (24:16):
The phrase, the phrase I was told, was "cobot."
Speaker 5 (24:20):
That parallel... so it's cobot.
Speaker 3 (24:24):
So instead of it being a robot replacing you, it's
one you work with.
Speaker 2 (24:29):
That is loathsome. Well, I don't like the idea of, like... unless the robot is, like, doing the annoying things, you don't...
Speaker 4 (24:37):
Want a robot understudy.
Speaker 2 (24:40):
Yeah, what if?
Speaker 6 (24:41):
What if it just what if it just sat on
your arm and looked at you and blinked?
Speaker 5 (24:44):
Sometimes?
Speaker 1 (24:46):
Was that the pet?
Speaker 5 (24:48):
Yeah, we've heard.
Speaker 6 (24:49):
Good things weirdly like previous episodes, but yeah, they're kind
of like.
Speaker 5 (24:52):
Look, I don't I was not initially that into the idea, but.
Speaker 1 (24:56):
It has very, very large eyes. Like, I have my cat and I love my...
Speaker 6 (24:59):
Yeah, a lot of people love living things. But yeah, the cobot thing...
Speaker 2 (25:06):
I just... also, it sounds like something said by someone who has not worked a job in a while.
Speaker 6 (25:11):
It also sounds like the Hyperloop shit, where it basically takes two people to get two people into a car that then takes them someplace.
Speaker 5 (25:18):
But it's not.
Speaker 3 (25:18):
And then you had the ProPublica story that came out this week, which was about... what, about him doing something bad?
Speaker 5 (25:25):
Don't tell me if you did something... I want to...
Speaker 1 (25:27):
Do something against I mean no me.
Speaker 6 (25:31):
Oh yes, I rode it when I went over to the convention center yesterday, and it was fully the dumbest shit I've ever done in my life.
Speaker 5 (25:39):
I really loved it. I thought it was amazing.
Speaker 1 (25:41):
Yeah, I need to give this a proper read.
Speaker 3 (25:42):
But ProPublica published a piece about how Elon Musk's Boring Company is tunneling beneath Las Vegas with little oversight.
Speaker 5 (25:48):
Oh really, Well, I'm sure they're probably doing a good job.
Speaker 2 (25:51):
That's the weird thing, though. It's like, this city seemed more resistant to that kind of stuff. Are they just letting him dig tunnels because it gets them money?
Speaker 3 (26:00):
Well, I mean... I have not covered that enough to answer that.
Speaker 2 (26:04):
That's fine. No, sorry, I must be clear, I'm not holding you to account here. You didn't do the reporting. But it's just like, it feels like such an aberration, because one of the things that destroys people about Vegas is that everything is convenient at all times. Everything is fifteen minutes away. Exactly, thank god. Damn, it's so good. That's a real resident here, which is five months. Three years, baby,
(26:27):
and you know it better than anyone. And it's like, wow, I can have Diet Coke whenever I need to. The problem is, there are people who have other kinds of coke they can get in fifteen minutes, and then they do. Then there are other proclivities they can fuel in fifteen minutes.
Speaker 6 (26:42):
Is this what they mean by fifteen-minute cities, when you hear that phrase? Yes, yes.
Speaker 3 (26:50):
It's a fifteen minute city run by cars.
Speaker 5 (26:53):
Yeah. Actually that's really weird.
Speaker 2 (26:55):
I'm so glad you're on the last episode. This is valuable information about Vegas. The thing is, the result of every proclivity, Jesus Christ, being available at all times is that Vegas is just like, no, man, I understand, you got this new tech and you're very horny and you raised all this money, very exciting. However, you're between the beautiful slot machines and our beautiful customers, who are
(27:18):
anyone who is here. And they ain't going to Primm.
Speaker 1 (27:22):
So, yeah, Primm had a pretty bad... I believe they had pretty bad gaming numbers.
Speaker 2 (27:26):
And by the way, if they ever find a way to get people quicker to Primm: one-hundred-billion-dollar industry. I'm just telling people that. Primm is a city that is far further away than it looks.
Speaker 6 (27:37):
Oh, it's Nevada. You guys did one of those? That's cool.
Speaker 2 (27:40):
No, yes, in the sense that there is a strange
authority with money that has created the city.
Speaker 5 (27:49):
He's ambitious.
Speaker 2 (27:50):
It's strange, but I don't have enough Primm exp.
Some shrooms.
Speaker 4 (27:56):
And going to Robert, let's find Silicon Valley's El Dorado.
Speaker 2 (28:03):
But the larger point is that Vegas is intolerant of things,
not because of good or bad, but because of efficiency.
Speaker 5 (28:11):
Oh yeah, it's all bad efficiency.
Speaker 2 (28:12):
And that's the weird thing. CES exists in this very
inefficient way here and it's just Vegas is like, I'll take.
Speaker 3 (28:20):
A little well yeah, I mean it's it might be
inefficient for the tech industry, but it's a great money
maker for the city, which.
Speaker 2 (28:29):
Is a great description of Vegas. I mean, this may
destroy you, but it.
Speaker 3 (28:33):
May you know, it might not be great for you know,
there's AI being put into everything all that stuff, but
I mean, look, it's a very large conference that the
city makes good money off of. And we just had
a recent F1 report, or a recent November gaming
revenue report, and F1 not performing the way people
want it to. So I mean these conventions are in
(28:55):
I mean, they keep the city afloat, especially I mean
when people couldn't travel here in twenty twenty. I mean
I wasn't here for it. I was, then, you know, strange.
I mean it was also the depression here was it
was terrible.
Speaker 2 (29:07):
Like this is a working class city, and when you
think about CES, a lot of engineers and such,
Vegas is a place which is very working class. And
the effects of these conferences are quite pronounced.
Speaker 1 (29:19):
Yeah, and it's important, it's important for the city.
Speaker 2 (29:21):
A very meaningful ending for a third of the episode.
Mister David Roth, where can people find you?
Speaker 6 (29:29):
Defector dot com the website and I do the distraction
podcast there.
Speaker 2 (29:33):
Well, you're on... okay, well, just move past that. You
messed up the order there.
Speaker 5 (29:37):
Oh I did. I'm sorry.
Speaker 2 (29:38):
No, no, no, no, it's gone.
Speaker 5 (29:39):
I don't get any second shot at this? Working people,
take it over.
Speaker 1 (29:43):
I am still do it.
Speaker 2 (29:45):
Another cut it.
Speaker 1 (29:48):
I'm still on X believe it or not.
Speaker 2 (29:50):
Ah yes, the Everything app.
Speaker 6 (29:52):
Yeah, I'm on there too, but only, only for banking.
Don't use it for social network.
Speaker 2 (29:58):
I use it.
Speaker 1 (29:59):
I put my CAT scans in there. You see what
Grok tells me about it.
Speaker 3 (30:02):
It's like, l.
Speaker 5 (30:05):
You have disease.
Speaker 6 (30:07):
Your discs are very cheeky. You get the Elon butt
head last, when it uploads successfully you're like, cool, it's
processing. Anyway, Kyle, where can people actually find you?
Speaker 1 (30:20):
Kyle underscore Chouinard, C H O U I N A
R D.
Speaker 3 (30:23):
And then the Las Vegas Sun dot com, or just
LasVegasSun dot com. Edward Ongweso Jr.?
Speaker 4 (30:28):
I am on, uh, Twitter and Bluesky, the Foreign
Agent Registry of the, shacked up in, I won't say
it for which country, we can guess. For my newsletter
it's, uh, TheTechBubble dot substack dot com. And
for my podcast, This Machine Kills.
Speaker 2 (30:52):
You can find me at Where's Your Ed dot at
for my newsletter and the podcast is called Better Offline.
And you're thinking this seems way too honest for it.
What's he gonna do to me next? And the answer
is nothing. This is a clean break. We're about to
go to some advertisements and you're gonna listen to them
intently or else you are committing crimes against me. And
(31:14):
I know you love the crimes you do on this show.
You love to fill up the reddit something that will
be called Exhibit A through Z and I regret the
crime jokes, but I'm not gonna stop them anyway. The
following ads very legitimate companies or podcasts. You're gonna listen
to them, download them, and you won't feel any pain.
Buy their products. And we're back. So we have replaced
(31:46):
David Roth, using science, with Gare Davis. You are here
from It Could Happen Here. I am a Cool Zone
Media product. We're colleagues. How are you doing.
Speaker 5 (31:57):
I'm so tired. I've walked so many steps this week.
Speaker 2 (32:00):
I feel like I have new energy. That's the problem.
I'm going into the last day with more energy than
I have when I arrived.
Speaker 5 (32:06):
That's, that's, that's another CES miracle.
Speaker 2 (32:09):
It's, it's beautiful. Kyle Chouinard from the Las Vegas Sun
is here, and of course Edward Ongweso Jr. Hello. That's
really bad. Yeah, Kyle, you were at the same panel.
Tell me about this panel.
Speaker 5 (32:25):
So it was this panel done by a number of
tech companies. Adobe had a spokesperson there as well as
one of the DHS Science and Technology at representatives. It
was ostensibly about, like, deep fakes, AI generated, like, information,
uh, and disinformation and misinformation. And I've been to a
(32:46):
lot of these panels over the years. I went to
one last year at CES put on by Deloitte.
That was that was actually okay. And then I went
to a few of them at the RNC earlier last
Speaker 2 (32:58):
Year, and that's the Republican National Convention.
Speaker 5 (33:01):
Correct, correct, So you know it's a good good, uh
good panel for journalists to go to speaking of speaking
of disinformation. But yeah, I I uh, this was on
Thursday that we were we we walked into the LVCC,
went to this panel and it was uh one one
person in front of me did fall asleep?
Speaker 1 (33:22):
Yeah, I thought it was.
Speaker 2 (33:23):
What was the panel about?
Speaker 3 (33:25):
Also, it was about, uh, misinformation, disinformation and deep fakes.
Speaker 7 (33:28):
Uh.
Speaker 3 (33:28):
And I think if you didn't know much about it,
it probably could be a little useful. But for, I
think, us as the audience, maybe not as much.
Speaker 5 (33:37):
Yeah, yeah, I mean it was. It was talking about
you know, various various tools to identify, uh, like AI
generated or or or deep faked like information or or
you know like you know, pictures, videos, and you know
how a whole bunch of the previous like visual tactics.
Right, a few years ago it was pretty easy to spot
an AI image. It's maybe slightly harder now. It depends
(33:59):
on the model. But how you know, how with with
with how fast that kind of that is developing, especially
for stills. It's it's it's going beyond like visual detection.
You have to actually create tools to detect this, and
those tools can also be prone to false positives and
false negatives. And so there's this other technology that a
whole bunch of companies, like Microsoft, are making a big
(34:21):
push for as well, as well as Adobe. It's
kind of like a built-in system for when you
generate AI content that that like clearly identifies it as
such in the metadata. And that that's something I heard
a lot about last year, and they talked about it here.
I think they called it provenance.
Speaker 1 (34:38):
Provenance was the word of the day.
Speaker 5 (34:40):
Yes, that was the, that was the word that they,
that they used, was these, uh, these provenance systems, as
opposed to, like, uh, detection systems, which is like, you
know, post hoc: we will use, we will use that
against, you know, AI or suspected AI content, uh, that
was, you know, maybe not generated with this built-in information.
Speaker 8 (35:03):
Yeah.
Speaker 2 (35:04):
So how was the panel though? Was it useful? Like
did it make profound statements, or was this more CES slop?
Speaker 3 (35:13):
I mean, I wouldn't call it CES slop, because
there's the oh yeah, the ones where you just sit
there for half an hour and go like a, well
that was half an hour I could be doing something else. Yeah,
I mean it was good for you know, learning the basics.
And they talked about, you know how like you mentioned,
it was a lot easier to detect things a while ago.
(35:33):
One thing I wish they, and this is not really
related to your question, but one thing I wish they did talk
about was just how text-based a lot of this
AI misinformation is. Like, it's a lot of the misinformation that
gets published.
Speaker 1 (35:48):
I mean, if you go on X, the Everything
Speaker 2 (35:51):
App, as I do for my banking, of course.
Speaker 3 (35:56):
I mean I I go into I can't even read
the comments anymore because I have to scroll through a
billion blue checks, and so many of them are
just like uh, obviously AI generated based off analyzing the
image and then writing the most basic comment possible.
Speaker 1 (36:10):
So I wish there was a little more focus on that, if.
Speaker 5 (36:13):
You want to no, sure, I mean like that is
that is a massive section of it. And this is
something even I asked a question about at the Republican
National Convention at Microsoft panel being like you have all
these tools for like AI images, right, you know, images
of politicians in debaucherous acts, you know, all, all
these sorts of things. But they also advertised like AI's
(36:33):
ability to make, specifically, like, user-specific political press releases,
basically. Like, you know, a campaign can send an email
based on, like, a voter's profile that can be tailored
using AI to specific voters, and, like, what
(36:54):
could go wrong with that?
Speaker 2 (36:56):
And also do you mean creating data to specifically push
vote in one direction?
Speaker 5 (37:00):
Well yeah, yes, but like but you know, like you
if you're working for a campaign and you want to
target specific voters, you can analyze their social media presence,
you know, whatever kind of information about them is in
certain data sets that can be bought, and, and make
you know, an AI written press release specifically for them.
And this is something I also like asked questions about,
like you're talking about you know these like uh these
(37:22):
AI AI metadata watermarks for images, but what about for text?
Like how will I be able to know if an
email I'm getting from a politician was written by a
person or written by a robot They're like, well you can't.
We simply aren't gonna worry about that.
Speaker 3 (37:37):
And like, I mean I walked up to y'all afterwards,
and I was like, I guess the lesson is that
we're all screwed.
Speaker 1 (37:44):
Yeah, we all, we all are screwed. I mean that
was the main takeaway.
Speaker 5 (37:47):
That was what the DHS was saying, which I was like, maybe,
oh god, that was the main takeaway I had.
Speaker 3 (37:53):
Was everyone pretty much saying like this is going to
get worse, and it's an arms race, and like all
the stuff is an
Speaker 5 (37:59):
arms race, and it is, and like, you know, they, they
gestured towards you know, quote unquote bad guys or like
you know, foreign state actors specifically Iran, does you know
a lot of work on this? Russia? But and you know,
I think in some ways the focus on those two
might kind of shift as the new administration focuses more
on China. But specifically for, like,
(38:20):
for, like, just disinformation using AI tools. Iran and Russia,
like, the past year, have been, like, the main players
targeting US voters.
Speaker 2 (38:29):
It is so wild. We just have I'm this is
a real dumb guy statement. Do we just have other
countries who are just fucking with the US citizenry,
and it's just like a thing that happens.
Speaker 5 (38:39):
Oh yeah, it's a huge it's like a it's a
it's a huge project.
Speaker 2 (38:43):
Is it considered an act of war?
Speaker 5 (38:45):
Yeah, I mean it's it's it's an extension of like
Cold War stuff, right, like it's.
Speaker 2 (38:49):
So we just have wars that are like not really
yeah yeah, yeah, pretty much. I mean, I mean, I'm
also describing how colonial Britain worked, by paying people not
to, like, teach people to write. So I realize I'm
like the Bill Belichick of But it's just very frustrating
to hear this stuff and not really know what to
(39:10):
do about it, because It Could Happen Here does amazing
work on disinformation. What do I do, as, like, a person?
Speaker 5 (39:16):
Okay no, but no, no, okay, you know, it is
a huge problem, and there really isn't much to do,
realistically. Like, we have tried to make fact checking work
the past few years, and I don't think Americans are
that much better at identifying false or genuine information. Like
fact checking has kind of failed as a as like
(39:37):
a large as a large project.
Speaker 2 (39:39):
I almost feel like what they actually need is vibe checkers.
I mean, they need someone who's like, why do I
feel bad looking at this? What I'm saying is everyone
needs therapy.
Speaker 4 (39:50):
Jesus Christ, astrologers and the state.
Speaker 5 (39:54):
State mandated therapy is the solution here.
Speaker 2 (39:56):
Yeah, definitely a state mandated vibe checker. Is this real?
Speaker 5 (40:02):
I would apply to be a vibe checker. What do you think
about this? It's not.
Speaker 9 (40:11):
Big.
Speaker 10 (40:13):
Don't read the end of Jujutsu Kaisen, it's
a very disappointing manga, very unfair. The end of Demon
Slayer, far better. Seven Deadly Sins, similar, terrible ending. It took
out the US superhero.
Speaker 4 (40:28):
JoJo's Bizarre Adventure is holding up?
Speaker 5 (40:31):
It was normal. I didn't know, look, I think you were weebs.
That is interesting. I am, I appreciate other cultures. Oh,
I don't give a shit.
Speaker 2 (40:39):
I'm a huge weeb.
Speaker 5 (40:41):
Do you know what I'm wearing?
Speaker 2 (40:43):
I was wondering if that was a reference. You look
like you might. Explain this one, Gare, because I don't know.
Speaker 5 (40:48):
I don't guess.
Speaker 2 (40:49):
No, like Anti Anti was it that one?
Speaker 1 (40:53):
No, there's no, Chainsaw Man, right? No?
Speaker 5 (40:55):
No, Oh, I thought you were one of those agents.
Speaker 2 (40:57):
It's one of the club ones, Ouran High School, yes,
Host Club, one, yes. Just to be clear, is the
best dressed of any of us and also fucking rocking
like an actual manga-related outfit and nailing it.
Speaker 5 (41:13):
I spent a lot of money on this. As you
should have, it looks great. I, I found, I found
a very nice blue blazer that I defaced by putting
on the orange patch.
Speaker 4 (41:23):
This is good. This is good. I've been admiring your fits.
I feel like back home, I'm usually, I'm usually the
best dressed, and dressed best.
Speaker 5 (41:32):
Your pants are fantastic.
Speaker 4 (41:33):
I appreciate it.
Speaker 2 (41:34):
I appreciate it, but no one's saying anything about me.
Speaker 4 (41:38):
Good personally. I think you should have gone with the
aviator shades.
Speaker 5 (41:42):
I think you should have gone with the shade I
would love. But I was talking about this with with
my boss Robert Evans last night. We should bring the
menswear guy here one year, have him
walk around.
Speaker 2 (41:58):
CES. You have anticipated. He's also not cheap, no, he
wants, no.
Speaker 5 (42:07):
Derek Guy is a legend.
Speaker 2 (42:09):
I love him, and he also knows what he's worth,
and should, he should. It's a pro labor podcast.
Speaker 5 (42:14):
Is that his name, Derek? Derek Guy, which is so fucking
Speaker 2 (42:18):
Funny to be like the menswear guy, and your second
name is guy.
Speaker 5 (42:21):
Very very cool. I would I would love for him
to walk around the show floor. I think it could
do some real psychic damage.
Speaker 4 (42:30):
He does not rate my fit. I never want him to
put his eyes on me.
Speaker 2 (42:36):
I say this with my not quite parasocial but here's
my Derek Guy story. So I lost a lot of
weight this year. And this is not a me trying
to actually conjure up people saying anything. I'm fine. What
I'm saying is I lost all the weight and I
bought a bunch of clothes and I took a picture
and I'm like, I still feel like shit. So I
emailed Derek Guy, as one normally does in tears, and
(42:58):
I was like, why do I feel bad still? Because
I was like, you know what, if he doesn't respond,
he doesn't respond, but if he does, he can help
me deal with something like an emotional thing where I'm like,
I feel better about my body, but I don't like
what's on it. And he was like, you have no
aesthetic and he explained the concept of aesthetics. One of you.
Speaker 4 (43:13):
He responded? Yeah. There, he's like the Chomsky of fashion.
Speaker 5 (43:21):
For the.
Speaker 2 (43:23):
He's still responding, well, this is kind of what I
was addressing with Gare, which is he isn't there to
eighty six people. He's there to try and explain what
looks and feels good. And he talks about clothing as
social language. And he said, you have no aesthetic, you're
wearing like trainers with like a threadbare belt. My shit was busted.
I look terrible because of the clothes, not because of me.
(43:45):
I would actually love him to do that, but also
walk around being like, this ship looks fucking good because
I feel like and with this show, especially if you're
just a hater, you're loving nothing, you're just vacuous.
Speaker 5 (44:00):
Occasionally you see a very well dressed Asian businessman who fucking rocks,
and then you see a lot of a lot of Asian,
a lot of Asian businessmen in very ill fitting suits.
But occasionally you'll see one guy who has that shit on.
Speaker 4 (44:14):
Yes, I did see someone who had that shit on,
and they were, like, looking at other people's fits. We
would discuss. That's
Speaker 2 (44:20):
What I've been doing all week, baby fashion ped.
Speaker 4 (44:24):
I really I should have gone up to him and
talked to him.
Speaker 2 (44:27):
Yeah, you have notes about the convention, rapp, Why this
is my actual job. Okay, yeah, I forgot why we
have we just you know, yeah, at the YAP index
is over five thousand right now, Japanese, can please bring
us back to why we're actually well?
Speaker 5 (44:45):
So, I guess one thing I was spending almost all
of yesterday doing, as we've been intundated with these AI products,
is is learning about all of the AI products targeting
your offspring, your kids who are being raised now with
AI the same way my generation was raised with social media.
Speaker 2 (45:03):
Nice son of mine.
Speaker 4 (45:04):
This is like CoComelon shit where it's like put
this in front of your kid, or it's like the vice,
like consumer products for the anxious parent.
Speaker 5 (45:11):
You know, both these things kind of, these, these things
can kind of overlap. The first thing I did yesterday
morning was attend a panel called Raising AI Kids Responsibly.
Great title because this could either mean you're making an
AI child that you get to raise, or it's about
how do you raise kids in the world of AI?
And now it was the latter. I kind of wanted
(45:33):
the former.
Speaker 2 (45:34):
I hate all of it.
Speaker 5 (45:36):
It was bad, Uh it was. It was in some
ways it was bad and you know, a little bit cringe.
But there's also some interesting, you know, stuff said here.
There were two products that were, that were displayed. One
of them was from this company called Readyland, which I
believe partners with Amazon. It's basically like, it's an AI
story book that interacts with an Alexa machine. Now, one
(45:59):
thing about them that I think is actually really good.
They don't they don't generate any new AI content. It's
just using AI to stitch together basically kind of a
choose your own adventure book, but for kids to to
have like a physical book that they read with with
the Alexa machine, that then can let them, like, talk to
characters, change the story in different directions. But all the
(46:19):
content is pre baked. It just it just gets it
just gets assembled in different ways.
Speaker 2 (46:23):
Just one idea.
Speaker 7 (46:25):
As the father of a son, the son of a mother,
a brother of a sister, you can do this thing
when you're reading to your child where you can think
of something else.
Speaker 5 (46:36):
I understand. And this is what my parents did to
me, and my imagination. This is what I was thinking about,
Like in the panel, I'm like, yes, this is this
is cool. You know, it's safer than a whole bunch
of the other other stuff I'm seeing. But how this
kind of steals away the joy of reading to your kids.
Speaker 4 (46:56):
Watching my little brother growing up, you just don't
realize how bad this shit is for a child.
Speaker 2 (47:01):
So I avoid talking about my son in general. I
won't name him because I believe he should have his
own destiny. But one of the most wonderful parts of
being a father is having my son come and talk
to me about something he just thought up, and it
will be something Minecraft related, and he will explain something
I did not know about Minecraft, and he will, like,
(47:24):
he will explain something in such detail that I've never
even considered in my life, and it's something quite simple.
But it's because he's been allowed to go off in
these directions with no prompting, with no.
Speaker 5 (47:35):
Yeah, no totally, I mean my bit, Yeah, it's beautiful.
Speaker 2 (47:39):
Yeah, And it's like the idea of depriving children of
this makes me so.
Speaker 5 (47:43):
Fucking actually, especially when it's targeting like five year olds,
which is like where it gets more upsetting to me.
Speaker 2 (47:49):
And taking the fucking parenting thing of having an imagination
about what your child could be is so fucking sickening.
Speaker 5 (47:56):
I'm getting angry this is now. Unfortunately, like I actually
felt relatively better about this product. So it's basically kind
of just, like, a visual novel, you know, like those,
uh, those visual novel games, but you have physical books
you're reading. Now, compared to the other product
called Poe the AI Teddy Bear, which I also saw,
(48:22):
so much worse. It is what it sounds like.
It's a teddy bear that comes with an app where
you can put in certain like parameters for like I
want the story starring this character with this archetype of
civillain in this setting, and it'll generate an AI story
for your child, generating new content. So unlike Storyland,
or unlike Readyland, Poe the AI Bear
(48:43):
is actually, is actually generating live content, unreviewed by
moderators. It's just straight to your child. It's fifty
dollars on Amazon. You can order this thing right now
and talk to it.
Speaker 2 (48:54):
I'm putting my tongue in my cheek. It's what I
do when I'm pissed off.
Speaker 5 (48:59):
Side is gonna hang out? You're low key tweaking right,
I'm like names places he did talk about. He's like,
you know, like chat GPT does does have guardrails for content.
But but those guardrails you know, don't always reliably work,
(49:21):
but they're better than nothing, and and and content moderation
is an issue that we're working on. I'm like, yeah,
but your product has hit the market, like, you are selling it. We're
Speaker 1 (49:31):
working on moderation isn't a great answer.
Speaker 5 (49:33):
No, not a good answer. And this is something like
even even the other guy with like the AI storybook mentioned,
He's like, it's pretty easy to make like your AI
kids toy not say swear words, or even, even, even,
like, talk about, like, sex or drugs. But one
thing that's even harder to moderate is, like, what if
it, what if it says, like, inaccurate or, or actually,
(49:53):
like, like, dangerous information, you know, like, what if it,
what if it goes in a really weird direction and
starts, and starts, like, talking about... That's generative models
exactly, generative models exactly.
Speaker 2 (50:03):
They don't even understand how the training data truly interacts
with the system totally, they don't know how this shit
truly works. The only useful thing in quantum computing related
to AI right now is the fact that they can
actually have models that can discern what the training data does.
So the idea of my child interacting with all of
these I realize now like nothing actually makes me angry
(50:27):
other than harms to my child, at which point I
might actually go Falling Down mode.
Speaker 5 (50:32):
So it's such a good movie too.
Speaker 2 (50:34):
It's such a great movie.
Speaker 5 (50:35):
That's a good movie.
Speaker 4 (50:37):
Now.
Speaker 5 (50:38):
The other, the last part, the last thing about
this panel is that it, it actually opened with
this person from the company IDEO, IDEO Play Lab Partnerships,
who was the first company to partner with Sesame Workshop
to make to make apps for kids. So I'm like,
this is interesting. Like, Sesame Workshop usually I consider, I
consider to be pretty, pretty thoughtful in how they produce
(50:58):
media for kids, and like, if they're choosing to partner
with this company, maybe I'll listen to what they say.
And they didn't have anything to sell. They just had
data that they've been collecting on how gen Z thinks
about AI. This thing's becoming increasingly invasive in our lives,
Like how do we think about it and what do
we really want out of it? And some of the
way this lady presented stuff was a little bit odd.
(51:20):
She kind of presented all of all of the data
findings as like shocking surprises, which I think may have
been tailored for the CES audience.
Speaker 2 (51:26):
Sure, so I feel like more attention to the details
isn't bad.
Speaker 5 (51:31):
Sure, and like this is this is what she said
her uh, her data like showed and and you can
you can look you can look this stuff up on
their website. The main question she asked is what if
the tech savvy generation gen Z isn't buying what we're
selling anymore?
Speaker 2 (51:48):
Fucking hell, imagine the customer doesn't want the thing you're building.
Speaker 5 (51:52):
Uh, she said, Like, you know, gen Z is typically
seen as, like, early adopters, early users, and
they usually are, but they also come with the most
amount of informed opinions about how bad the tech feels,
how cringe it is to use, and how it affects
their sense of humanity.
Speaker 1 (52:07):
So so the issue is that they know what's up.
Speaker 5 (52:09):
They know what's up, which, you know, if you're
in business, is like either a hurdle to overcome
or some insights to help you, you know, maybe maybe
pivot or change in a completely different direction. She identified,
like, the key areas of tension around AI for gen
Z as, one, creative expression.
Speaker 2 (52:27):
Uh.
Speaker 5 (52:27):
You know, it's its ability to you know, have us
feel proud of the art that we make and how
it affects human relationships. And she she brought up a
few questions or like examples of of the types of
stuff that she's that she's like you know, asking asking people, uh,
as a part of this data collections, Like let's say
you've let's say let's say you've had a friend breakup?
(52:49):
Would you rather an AI tool kind of like like
like counsel you through that process, you know, like like
you know, like bounce ideas off and try to try
to like move on or figure out like what happened?
Or do you want like a temporary friend replacement? Do
you want the AI to become a friend for you instead?
Would you rather would you rather that be your friend
(53:10):
for the time being? And like no, we we actually
don't want AI friends. That's that's not what we want.
Which is, a lot of stuff at CES this year,
it feels so much of AI this year is like
it is like about replacing human friendship, so to preserve
human friendship.
Speaker 2 (53:27):
Right, Sorry, it's fucked up because the idea of tech
at least ten years ago was that it would bring us
closer together and allow for deeper connections. Now it's like, buddy, like,
yeah about that, you want to connect with the computer.
Speaker 4 (53:42):
I mean, this is what happens when I mean all
these you know, this is what happens when everyone starts
reading Marcus Aurelius a little too closely.
Speaker 2 (53:49):
So true, I don't know that reference. I am an idiot.
People need to stop quoting books to me.
Speaker 4 (53:59):
I think that you don't know.
Speaker 2 (54:04):
Yes, yes, Ed. Yes. Actually, Russell Crowe said that,
one of the funniest things on Twitter, which is he
responded to one of the CNN reporters, I believe, at
the time, and he was just like, blocked, blogger, which
is one of the funniest things to say. Anyway, let's
just ignore my forty.
Speaker 5 (54:23):
But this was the same thing that she was talking about.
It is like, you know this thing where that what
we're all seeing is like these AI products designed to
replace the role of human friendship. And another thing she
presented as like this like surprising the problem of meeting people. No, no,
it's, it's, it's actually helping you, like,
overcome the fact that you, like, lost a friend. It's
(54:44):
trying to be like, hey, it's okay. If you lost
a friend, this AI can be your friend instead. Yeah, yes, I.
Speaker 3 (54:50):
Mean the solution to, like, this loneliness epidemic, especially what young
people are going through, I can't imagine, is more robots,
you know.
Speaker 5 (54:57):
And yeah, one thing I keep thinking that's driving alienation.
Speaker 4 (55:01):
One of these guys who did, who does, like, these
companion bot, chatbot sites, I think Character AI, you're like, oh, okay, well,
you know, you seem in interviews to be earnest about
like people are lonely. Maybe we can't replace, but we
can offer a salve that helps people get back to
the point where they have human friendships. Again, what does
he believe about people? And it's just like let's hang
(55:23):
homeless people and you know, let's murder the poor like
these like this sort of it's not a coincidence that
someone building that sort of firm has such deep pathology.
It's so, it's such the,
Speaker 2 (55:34):
Top as well, because the idea of an AI you
can bounce ideas off of, is not inherently a rib idea.
We sit there, we think about shit, and the idea
of having a log of it. I general a great
deal, and I think many people listening to this do.
There's an idea of looking at this where you're like, yeah,
what if your journal was a person.
Speaker 5 (55:53):
It can help. It gets worse.
Speaker 1 (55:55):
Oh yeah.
Speaker 5 (55:56):
The next thing she chose as an example is like,
what if there's an AI that's trained on your preferences,
trained on your dating preferences, what you like aesthetically,
and what if that could instead go on your first
dates for you? What what? What if?
Speaker 1 (56:12):
What if, this is literally a Black Mirror episode.
Speaker 5 (56:14):
What what if this could like handle like icebreaker questions
and like and like get over like hard life experiences
easier for you.
Speaker 4 (56:21):
One of my favorite little perverse things was, this guy once
said, one time said, he doesn't like sex. What he
would like, he's just a Slovenian philosopher-psychoanalyst.
Speaker 1 (56:32):
That's a lot he's you know, he means, yeah, yeah, exactly.
Speaker 4 (56:36):
And he was talking about how the ideal sexual encounter
for him is two people taking their sex toys and
those sex toys playing with each other.
Speaker 5 (56:44):
Exactly, and this is this is exactly what that's disgusting,
and the ideal, the.
Speaker 2 (56:50):
Ideal perversion, the pure perversion of that is two people
using those sex toys together to get each other off.
Speaker 5 (56:56):
No, that's great, that's wonderful.
Speaker 4 (56:58):
Good perversions. Pro sex podcast, Christ. The, the, the one
where you send an avatar.
Speaker 5 (57:04):
That's different, completely autonomous from, from you, and evil. And
it was like this is this is why what she
was saying was so odd because like she presented this
as like a surprising revelation that that gen Z would
rather live life themselves than have an AI live their
life for them, and like, like that was what she
(57:27):
was trying to say, but it was so odd having
that presented like some like surprising like exclusive fact that
you can only get through, like, data research. And,
and she, she talked about how, like, gen
Z sees value in having, like, bad dates, and like
and like and like actually and actually like and actually
like overcoming like challenges in life and like that's that's
(57:48):
actually a core part of being human and we don't
want that process like uh, like like smoothed over with
like tech can't solve the hardship of life. The hardship
is part of life. That's what makes life work living and.
Speaker 2 (58:00):
Like, yeah, how old are you?
Speaker 5 (58:03):
I'm in my early twenties.
Speaker 2 (58:05):
Sorry, I didn't mean to. So, gen Z, yeah,
how do you feel about these assumptions?
Speaker 5 (58:13):
I mean, like, again, these are questions
that she's, like, posing to, you know,
focus groups of, like, gen Z, but
they're really pandering.
Speaker 2 (58:25):
Is it accurate?
Speaker 5 (58:26):
Well no, because they assume such a base level of
stupidity that they're kind of offensive. Like, even
the framing, the fact that I would be asked, would
you rather have an AI go on dates for you?
Speaker 4 (58:36):
Like?
Speaker 5 (58:37):
Why why would you ever ask me that? That's that?
That's fucking stupid?
Speaker 2 (58:41):
Some of the fun parts about romance, wow, said the
word correctly, they're always saying this about me with how
words go, is fucking up, aspects of yourself that
get banged off someone, and you fail and you learn
about yourself, exactly.
Speaker 5 (58:57):
And that's what she was saying, is like,
that's actually what people want. People don't
actually want AI to, like, live your life.
And she specifically provided pushback against this idea
in the tech industry, where, like, the smoothest
possible path is the best one, right? You
(59:18):
want to, like, optimize every part of life,
and what if that optimization actually isn't the point?
What if this idea in the
tech industry, that we have to optimize and smooth over
every hardship, misses the entire point of living? And you
have all these tech bros being like, oh yeah, huh,
I guess so, maybe we shouldn't smooth over all
(59:40):
the problems.
Speaker 3 (59:41):
It's a very, and maybe I'm getting this
reference wrong, it's a very, like, Patrick Bateman
thing of, like, I am optimizing everything to a tee.
Keep it, keep it going?
Speaker 4 (59:52):
I mean, no, specifically in the book, you
know, that is the sort of train of thought
he has. He's kind of empty, engaging
with life at the very superficial level, craving something else.
Speaker 5 (01:00:08):
Like, the final point she had, and again,
this is all kind of pandering, like you're
saying, gen Z, but this
is true across a lot of people, but she said
gen Z doesn't trust AI to understand the nuance
of their lives, and, I mean, of course. But yes,
it was an odd panel, because on one hand you
(01:00:29):
have these, like, AI toys for kids, and then
you have this woman with this company that works on
this research, and she's like, yeah, actually, gen Z
doesn't want AI to run their lives, and you're like, yeah,
who could have thought?
Speaker 2 (01:00:44):
So, as we approach the final seconds of this third
of Better Offline, yeah, where can people find you?
Speaker 5 (01:00:53):
Well, I am on X, the Everything
App, at Hungrybowtie, as well as Bluesky, at
hungrybowtie dot social. And It Could
Happen Here, the podcast with me, Robert Evans, and
a few of our other colleagues, where we cover sometimes tech,
but, you know, politics, culture, just disinformation, all that kind
(01:01:14):
of fun stuff that greases the wheels of our society.
Speaker 2 (01:01:17):
Yummy. Kyle, you can be found on X, the Everything
App?
Speaker 3 (01:01:24):
I've got Kyle underscore Chouinard, and then my writing's on
lasvegassun dot com.
Speaker 2 (01:01:29):
Oh yeah, and Edward Ongweso Junior.
Speaker 4 (01:01:31):
You can find me in the real world. I live at
sixteen hundred Pennsylvania Avenue Northwest in Washington, D.C.,
zip code two zero five zero zero.
Speaker 5 (01:01:40):
That's a nice neighborhood.
Speaker 4 (01:01:41):
It's it's kind of nice. It's got a great view
of everything.
Speaker 5 (01:01:45):
Nice.
Speaker 4 (01:01:47):
What else? I'm at Big Black Jacobin on Twitter
and Bluesky, and This Machine Kills is my podcast,
and TheTechBubble dot Substack dot com is my newsletter.
Speaker 2 (01:01:58):
You are approaching the final third of the final two
part episode of the Consumer Electronics Show, and I just
want to say something to you, which I'll elongate at
the end of this. I'm so grateful for you giving
me your patience. With this, I will say, I need
you to listen to the ads. I don't actually know
what happens after I'm done talking. I never do. Frankly,
(01:02:19):
I barely understand what I'm doing when I am talking.
Because Matt Osowski over here hears, Jesus Christ, I can't even say it,
here, we're keeping it in. Matt Osowski hears me mess up my name,
mess up the names of companies, or just mess up
a word. But you know, I keep reading, because podcasting
is in my blood, and that's who I am, and
that's my identity. It's very normal and healthy. Please listen
(01:02:42):
to the ads. Please download the podcast. Please buy the products.
If they don't give me another contract, it's going to
be very bad and my therapist is going to be mad.
(01:03:05):
So we enter the end of the truly unified end
of the Better Offline podcast. This incredible week, and I'm
not even gonna do any like sardonic shit. I'm just
so happy. I'm surrounded by people who are super game
to make podcasts too. Like I'm like genuinely like near
tears of how happy I am with everyone being game
for this. Everyone has been so amazing, all the journalists
(01:03:26):
who have joined us. It's just been the best fucking
week of a week that is generally quite depressing. Also
feels like a good point to admit two things. One,
I watched Neon Genesis Evangelion when I was a teenager, and
I did not understand the subtext. Oh yeah, so the
whole time, for most of my life, I was watching this
(01:03:47):
show and being like, damn, these robots are fucking cool, and
then, like, some weird shit happened, I guess, and I'm like, okay,
but okay, the robot's back. Okay, there's like some stuff
from space and there's a big head, kind of fucking weird,
I guess. Anyway. Welcome to Better Offline. I'm Ed Zitron.
We are joined by everyone. We're joined by Robert Evans
of Cool Zone Media. That's right. All right, David J. Roth,
(01:04:09):
who is gonna have to grab the mic, hello, of
Defector. Edward Ongweso Junior, of course, will now have another mic.
Kyle Chouinard of the Las Vegas Sun, hello, and Garrison
Davis of It Could Happen Here, Cool Zone Media, and
associated properties.
Speaker 5 (01:04:26):
Hello, Hello, Hello.
Speaker 2 (01:04:28):
We're at the end of CES. We have one more
positive, masculine day coming after this, but this is
really the close out, as all of us just slop
ourselves into the remaining quarters of this convention center. I
could not have had more fun if I tried, But
next year I'll try. Robert, how has the show been?
How would you summarize the show?
Speaker 5 (01:04:47):
It was great?
Speaker 8 (01:04:48):
I mean, obviously AI is here to stay. This is
the worst it's ever gonna be. Everything's thank you, only
gonna get better and thank God. You know, there's this
little kid that I helped take care of and it
takes a lot of time, as you know, as a father,
Ed, a lot of time and a lot of effort
to raise a child, and I'm just excited that TCL
(01:05:10):
has a solution to that problem.
Speaker 5 (01:05:13):
We can do.
Speaker 8 (01:05:13):
It's what I've always dreamed of, which is that you keep,
like, a hamster feeder and a water, like,
you just hook up a hose to the room, and you
lock the child in with a robot until they're eighteen
or twenty, in the room, exactly.
Speaker 5 (01:05:24):
Yeah, it's perfect.
Speaker 8 (01:05:25):
You know, as long as you get like one of
those lights that gives you some sun kind of effect,
you know, they won't die probably, like and that's ideal.
Speaker 5 (01:05:33):
Have you introduced this product?
Speaker 2 (01:05:35):
I'll introduce it. Oh wait, no, no, we have not.
Just straight into the disparagement.
Speaker 5 (01:05:44):
No, I'll move on.
Speaker 6 (01:05:45):
No, I mean, this is, we had talked
about how, like, cute little guys are part of the
thing, but the TCL is the most, I looked at this also,
and they have a whole little video of, like,
a towheaded child interacting with this thing. Yes, and
the kid's parents are kind of standing off to
(01:06:06):
the side being like, super, listen, do you want to,
like, go to the movies or whatever?
Speaker 5 (01:06:11):
But it's the most.
Speaker 6 (01:06:12):
It's probably the cutest of the cute little guys, but
then also easily the most sinister, the most evil. Yeah,
it's the most sinister.
Speaker 5 (01:06:19):
Well, and you stray towheaded.
Speaker 8 (01:06:21):
I didn't disagree with Garrison, so I'm wondering what
you thought, because I felt like they cast that child
because it looked like he had leukemia, definitely.
Speaker 5 (01:06:31):
Well, I don't want to say anything bad about the kid, right,
I don't know. Yeah, yeah, yes he did.
Speaker 2 (01:06:38):
Well.
Speaker 6 (01:06:38):
What he looked like was honestly the scenario you described,
which is basically like we're growing a child the way
that people grow mushrooms.
Speaker 5 (01:06:44):
Yes, in their house. Yes, it looks like.
Speaker 8 (01:06:53):
Yeah, huge bags under his eyes. Never seen the kiss
of the sun or his mom.
Speaker 4 (01:06:58):
You do not know what vitamin C is, little boy.
But don't worry, or D as well.
Speaker 6 (01:07:03):
First robot specifically for children that look the way the
actor Brad Dourif looks right now, designed for that.
Speaker 5 (01:07:10):
I don't know who that is.
Speaker 2 (01:07:12):
I am the host of this fucking thing. You need
to, okay, so we've also got Phil, our bartender. We
should say hello. Thank you, Phil, very good. That's enough,
thank you. I really don't know how to direct this
at this point, because I invited like seven people in. Okay,
well with us, let's start on an easy point. We'll
(01:07:35):
just go around. How's everyone feeling at this point of
CES? We are at the end. Let's start with Gare,
because you are probably, like, the second least likely to
say something legally actionable, and I realize that is
now a challenge.
Speaker 5 (01:07:51):
You haven't had all the conversations I've
had with Gare. What am I doing?
Speaker 4 (01:08:01):
I don't know.
Speaker 5 (01:08:01):
I feel fine. I was able to find
some cool stuff despite having to sort through lots of slop.
There's always one or two gems at CES that make
all of this slop sorting worth it. So I was happy to find
a few of those in Eureka Park. I also had
my, my family, no, absolutely not, we're not
(01:08:22):
gonna do free publicity. Agreed, that's my job and it
costs money anyway.
Speaker 2 (01:08:27):
But no.
Speaker 5 (01:08:27):
I was also able to have a little bit of fun.
As soon as I realized that this show is just
gonna be last year's show again, I kind of relaxed,
being like, I can kind of just,
like, fuck off. So I talked
to a flying car company. It was real. I was
(01:08:48):
waiting to interview someone and instead one of their like
pr guys walk up to me is like, hey, can
we interview you about your thoughts on this flying car.
I'm like, absolutely. I wish that happened to me.
It was, the phrase CES miracle gets used a lot,
that really was one.
Speaker 2 (01:09:10):
What did you do it, though?
Speaker 5 (01:09:10):
Oh, I did. And so I talked a lot
about like my concerns around safety for these flying cars.
I predicted it had twenty minutes in the air.
It was actually a huge drone, and I
got it exactly right.
Speaker 2 (01:09:21):
It was.
Speaker 4 (01:09:22):
It was a.
Speaker 5 (01:09:24):
Twenty minutes up in the air. It launches from something
that looks like a Cybertruck. It launches out of,
like, the trunk of the Cybertruck. You can get
twenty minutes in the air, and then you're gonna crash,
or, my god, you will have some kind
of AI-assisted landing. And I asked them, like, yeah,
so I'm crashing, what's your plan for that? Like,
(01:09:44):
what if you're in a populated area? And
they said, no, well, we'll have guardrail software
to make sure it doesn't land in populated areas.
I'm like, we have some of the papers from Knight Capital.
I'm launching this thing right in the middle of midtown Manhattan. Oh,
I'm gonna hover for twenty minutes. Which brings
me to the second thing that they asked me about
(01:10:05):
was like, like, what's you know, what's maybe some of
your concerns or like, what's the first thing you think of?
I'm like, well, a few weeks ago, a few weeks ago,
a Deloitte consultant drove a car into like into fifteen
people in a in a terrorist attack. Like same day,
someone used a Cybertruck to make a bomb. What
(01:10:27):
if some rich guy loses his mind and flies this
thing into a building and they did not like that?
Speaker 2 (01:10:34):
Why.
Speaker 5 (01:10:36):
I don't know what kind of PR training these guys had.
I don't know if they were prepared for this.
Speaker 6 (01:10:42):
Is, like, good enough PR training that they were like,
this person seems nice, let's ask them what they think.
Speaker 2 (01:10:46):
Turning to my turn to my class, like just killed
you killed, like just gun in your mind, just don't.
Don't also diet, like it's the best way you could
go about this.
Speaker 8 (01:10:55):
But to get into commercial aviation and not have
an answer to the nine eleven question is just.
Speaker 5 (01:11:02):
That's in the past. So I was able to relax
and have fun, you know, with moments like that. You
had fun, you know, once it became clear it
was just like, you know, AI software was the
king of this year once again, a whole bunch of
things. Like all of the university projects in Eureka Park,
which sometimes has a really cool new thing, right now
(01:11:23):
all of that creativity is just being channeled
into AI software, and that's in some ways disappointing
to see. Some of the software is good,
like, it works, it can solve problems, but it's also,
again, part of the slop. There's
also a lot of slop software.
Speaker 2 (01:11:38):
I feel like we're in the slop society. I feel
like we're at this point where everyone here, I think,
is experiencing some form of mental damage from being here
too long. Oh yeah, but also, to be clear,
this is at least the one commonality I have with Robert, which
is Robert is the only person I've seen exhibit the
(01:11:59):
same thing of, like, the Joker feeling, where you're just
like, it gets worse and I get better.
I'm just fucking suffering, and like the more I suffer,
the stronger I get, and the more intrigued.
Speaker 5 (01:12:10):
I get with the more pain I get.
Speaker 6 (01:12:11):
He's also the only person that I saw that I
recognized on the floor of the South Hall of the
convention center, where it was just, like, cell phone cases,
like not technology, but just like Chinese stuff designed to
be sold on Amazon.
Speaker 2 (01:12:24):
That's where Robert was at his best.
Speaker 5 (01:12:26):
And he was he was like locked in.
Speaker 6 (01:12:28):
I was like hey, and he got like some of
the way past me was like, do you fucking need
something or whatever, like, oh man, it's from yesterday.
Speaker 5 (01:12:38):
Table that we were at. Oh my god.
Speaker 8 (01:12:40):
In my defense, there were really huge cell phones there,
like massive, like, literally an inch and a half
Speaker 5 (01:12:46):
Thick rugged cell phones.
Speaker 8 (01:12:47):
Okay, And to look at them, I had to touch them.
We're not reporting on them, I have nothing to say
about them.
Speaker 5 (01:12:52):
I had to feel them.
Speaker 2 (01:12:53):
Are we establishing that Robert Evans accidentally big-times people?
Is that what we're saying?
Speaker 5 (01:12:57):
I did not feel big-timed. I was just like,
damn this guy, I will.
Speaker 2 (01:13:02):
Give the most derisive view of I did not know
who Robert Evans was before August twenty twenty three, a
fact I am regularly reminded of by everyone else. They
mention Robert Evans, and, oh my god, the Behind the
Bastards guy, and I was like, Eh, this is a
fucking guy with the weird avatar. Oh you want to
do podcasts, but the.
Speaker 5 (01:13:23):
Fucking ass what you're gonna do about it?
Speaker 2 (01:13:25):
Man, I need money. It turns out I'm already a corporation.
Thank you, no negative statements right now, all of you.
I actually need this. But nevertheless, it is really fun
being here with Robert because Robert again is one of
the only people who experiences CS in the way I do,
which is the Metallica song Frantic, where you're just like,
it's not great but you're here, Oh yeah, and it
(01:13:47):
kind of bangs, but not for the reasons everyone else
feels. Totally. St. Anger fans, line up, all three of us.
But it's also really fun being here this time because
last year I was but a babe and very nervous.
Now I'm just like, oh no, you wade into it
with like your pants off, like you're just like I
will see everything.
Speaker 8 (01:14:06):
And speaking of Metallica, I heard a pretty good Master
of Puppets cover at the MGM Grand in the top
three covers of Master of Puppets I've heard on that
exact same stage at the MGM Grand.
Speaker 2 (01:14:16):
No longer interested in the tech.
Speaker 5 (01:14:17):
So where was this? It was right near the skee ball.
Speaker 2 (01:14:21):
Okay, well I know where we're going. Dinner is canceled, fellas,
That's where I'm going. But it is interesting because talking
to the various reporters here about CES and why we
do this, and no one can answer that question, by
the way. It's just like, we're all fucking here
every January. But it has been interesting getting, of
(01:14:43):
our four reporters, like Kyle, and, from It Could Happen Here,
Gare, and Ed and Dave, in like, truly, I
don't want to say objective, but like, fresh looks. And
then Robert and Gare of course, who are very much
used to it. Robert is a separate creature. I mean that
with more love than I could ever put into my voice.
But it's interesting to get this view and then bring
(01:15:05):
reporters in and talk, and they're like, yeah, well, we're
here and we saw stuff. But there's also an ephemera
that is kind of hard to cover in objective journalism.
Speaker 5 (01:15:16):
Kyle totally.
Speaker 3 (01:15:17):
Yeah, it's I thought that my main takeaway from the
week was the politics of it.
Speaker 1 (01:15:25):
I mean from I was at Panasonic's.
Speaker 3 (01:15:29):
Keynote and immediately you had people from the Consumer Tech
Association talking about tariffs. I mean I walked by people
wearing Department of Government efficiency shirts.
Speaker 1 (01:15:41):
Really yeah, which was really interesting.
Speaker 2 (01:15:43):
Did you hit them? No?
Speaker 1 (01:15:44):
Right, did not.
Speaker 2 (01:15:46):
I'll take care of it. No, no, don't crime so bad.
Speaker 3 (01:15:50):
But yeah, I mean, tariffs were just such a thing,
I think, as seen in articles I wrote about it.
They were just kind of looming over the entire event,
and it seems like, I mean, Trump coming in pretty
soon is just kind of hanging over everyone's
heads on top of all the tech. So, I mean,
is that what, who's, who's Donald Trump?
Speaker 5 (01:16:10):
Trump?
Speaker 2 (01:16:11):
No? Sorry, I'm drinking a bit. Yeah, I mean I'm
trying to think of the political feeling.
Speaker 3 (01:16:20):
Yeah, there's a political feeling there, and at the
same time CES is happening, you're seeing this shift from
Meta, just, I mean, especially with their
new policies regarding comments about trans people
and others.
Speaker 5 (01:16:36):
Yeah, what was it, something like you're allowed to say
don't repeat.
Speaker 1 (01:16:40):
Yeah, no I'm not.
Speaker 2 (01:16:42):
But it's basically, you can just insult trans people, you
can insult, without any issue, Jewish people.
Speaker 5 (01:16:47):
You can refer to women as property too, and,
I mean, at this point, I'm just like, well,
I think it actually is worth mentioning, specifically,
the types of insults around mental illness that you're
now allowed to call queer people but not
other types of people. That actually is worth mentioning.
It's a huge change.
Speaker 2 (01:17:10):
It's just an interesting show, because it desperately wants to
be apolitical, amoral, totally bereft of these feelings,
but it still shows it in the things they show,
the collection of data, the kind of surveillance aspect. And
it sucks, because it feels like this show could be
better, because on the fringes of the conversations we've had,
(01:17:30):
of like, yeah, everyone's creating solutions for problems that they
haven't even come up with yet, you get these things,
the skin care product, the cane for blind people that
can tell you what's coming up, really useful things, the
translation stuff exactly, like, yeah, actual translation for a conference,
so you could go to a conference in Taiwan, and
(01:17:52):
actually Computex, one of the most important conferences in the world.
Speaker 5 (01:17:55):
Really, some really impressive, like, tech to assist people who
are disabled. Yeah, I thought accessibility tech was, like, my
main highlight. Yeah, a lot of good stuff. That's what Eureka
was best at this show.
Speaker 2 (01:18:06):
Yeah, it's weird that Eureka.
Speaker 8 (01:18:07):
The pavilion actually had some good shit in it for that,
and it had I would say the one like consistent
like form factor for consumer tech that seems to be
getting a lot better every year, and that reminds me
of like how phones and tablets felt ten years ago.
Is glasses tell me like they're getting smaller, they're getting
more capable. I'm seeing like glasses that are designed for navigation,
(01:18:30):
that are designed for recording. I did see to what
you were saying about politics. A lot of glasses with
that were specifically marketed based on their ability to aid
TikTok that I'm like in a week that might not be.
Speaker 5 (01:18:41):
Such a good business. Well, yeah, yeah, I had a
full conversation with someone speaking Chinese. Yeah yeah, yeah, absolutely
the same.
Speaker 2 (01:18:49):
King That is objectively very cool.
Speaker 5 (01:18:53):
It's great.
Speaker 2 (01:18:53):
And was it a conversation with nuance? Like, it was?
Speaker 5 (01:18:56):
Yeah, more than I'd have been able to do, you
know, on my own. Before, we talked about making specific
dinner plans, what we do for work, a real conversation too.
Speaker 2 (01:19:08):
This is the thing that I, and I've said
this many times in episodes, I'm not anti-tech. I
wish tech was able to do the things they promised,
but I love technology. The only reason I know literally
every person I'm looking at in this room is through posting.
And I'm not even kidding. And this is the sickening
thing that upsets my father. No, my dad's very proud.
(01:19:30):
My therapist, kind of. Like my multiple lawyers. Anyway, the
idea that one can do that, that the internet and
tech is fully capable of making these wonderful connections, like, genuine.
I laugh about the Robert Evans thing, he found me
through a podcast called Western Kabuki with my friend Caleb,
Wilson June, and other people who are wonderful, Alex as well,
like, the great podcasts and these all-digital things. Tech
(01:19:50):
is fully capable. As much as I can be cynical
and angry about this stuff, the reason I'm fucking angry
is it's fully capable of helping people. And we've all
experienced that.
Speaker 6 (01:20:01):
That was the bit that I sort of was struck
by again as like sort of somebody who hasn't covered
this stuff as much and has had sort of, you know,
the same sort of experience like a normal middle aged
person's experience of tech. I was blown away by the
capacities of the things that were there. I mean, like
I didn't realize that it was like you could have
a conversation with someone speaking another language.
Speaker 5 (01:20:20):
Like I didn't see anything that that cool.
Speaker 6 (01:20:22):
But the accessibility tech stuff for me too, Like I
found not just really impressive but really like heartening to
see that. The bit that was strange and we obviously,
like we have talked about this in previous episodes and
stuff is the sort of contortions that are required in
order to make that marketable in a way that like
(01:20:42):
it's not like it's not a niche market to be
like an old person or to be disabled like one
way or another. That's like most people will experience that
at some point in their life. Yeah, and yet, like, the
way that you have to sort of this is something
that really struck me from yesterday that we talked about,
you know, enough that I probably shouldn't bring it
back up again that like you still have to come
up with some sort of like industrial application to it,
(01:21:04):
or you have to say AI or you have to
do this stuff that in order to get people to
invest in this stuff, which is expensive to develop and
expensive to produce, you have to say the words that
the money.
Speaker 5 (01:21:14):
People want to hear.
Speaker 6 (01:21:15):
Which is also, like, another of the sort
of political aspects here, that you've got this,
in many ways, a sort of,
I'm tempted not to use this word, but I will say
that it is like a good-hearted intention. It feels like,
especially with the disability tech, that is actually aimed
at using this new human capability to make people's lives better.
(01:21:37):
And yet you still have to fucking pitch it to
sociopaths if you want to get the thing made.
Speaker 2 (01:21:41):
And I say this as someone who runs a fairly
successful podcast. I have fifty one thousand subscribers. I have
a successful Better Offline. I've had emails about this. I have
dyspraxia, which is a coordination disability. It limits
my, I'm wearing zip-up boots, which look, I'll say, banging,
but I can't tie my shoes. And this is an
embarrassing thing about my life. I fucking hate tying things.
(01:22:03):
I physically can't do it. And you explain it to
people and they laugh, which is really good with something
that you're very upset about, people love being laughed at
for that. I don't think people realize the capacity
for technology to bridge the gap between your own body's,
I don't want to say failure, but inability to fully
complete an action. And I think people are, myself included
(01:22:26):
at times, exactly, but technology is one of the greatest,
what was that you said again? Jesus Christ. Assistive aid.
Thank you, assistive aids. And I nearly fucked it up again. Nevertheless,
technology for me as a person has been something that's
allowed me to bridge with so many of you. I
will tear up on the fucking show, people. I love
(01:22:49):
people that I've been able to experience through their own
writing and connections to them. But most of CES isn't
fucking this. It's about bridging gaps between my people and
money people to another money person so they can sell
nothing to nobody.
Speaker 5 (01:23:05):
Totally.
Speaker 8 (01:23:05):
Yeah, I mean that's one of the things that's disheartening
is I was at like the Samsung booth, which is massive.
You know, it's the size of a very large house,
like a mansion, and every square foot is tens of
thousands or hundreds of thousands of dollars in terms of
what it costs to rent it. And, like,
these displays they've got are very advanced. A lot
of effort is just going into making the presentation as
(01:23:26):
advanced as possible. And what I'm seeing on display there
is like, well, we've got a fridge that lets you
know when your milk is off, and at the cost
of you can never ever have a guy over to
repair your fridge, never, never again.
Speaker 5 (01:23:38):
And you compare that to Eureka Park.
Speaker 8 (01:23:41):
Today we saw a company who had a small booth
that was maybe, like, four or five feet wide,
called Naqi, N-A-Q-I, and their attempt was to
develop a way to allow people to control computers
and interface with their machine, with their phone, in a
way that is similar to how Neuralink works, but without
any sort of surgery.
Speaker 5 (01:24:00):
So it's an earpiece you wear in your head.
Speaker 2 (01:24:03):
Oh yeah, we kind of made fun of this, but
if this is real, awesome.
Speaker 5 (01:24:08):
It seemed to like we like the use case.
Speaker 4 (01:24:12):
We like the use case for people who are not
able bodied. But what happened when we came up to
the booth, the first pitch we got was for retail,
and then the interesting thing was like, no, what's
real is you can use this if you are someone
who's not really able to use your arms or to
(01:24:33):
move the rest of your body. You can slightly tilt
it, and that makes more sense than, like, being able
to do a sort of retail cashier,
secondary labor productivity aim.
Speaker 8 (01:24:44):
Yeah, I mean, my mind
was very much geared towards quadriplegics, but not just them,
because it will work as
long as you have that muscular control, like, above your neck,
because it reads micro-gestures of
your face. And in the live demo they did, you
could see the signal coming into the phone when he
would make micro-gestures, and it would,
(01:25:05):
you know, control the phone. So it seems like
it works. I can also see, like,
a retail app, for, like, you've got your smart glasses
on while you're biking.
Speaker 2 (01:25:13):
Fucking works because we talked about it, it seemed to.
Speaker 5 (01:25:16):
But it's like, but it seemed I don't have the
ability to like sound though they did.
Speaker 4 (01:25:20):
A conceptual yeah that so this is the thing because
I think also there were three or four of them
where it's.
Speaker 5 (01:25:26):
Like, yeah, there were there were variations.
Speaker 8 (01:25:28):
There was the one that was like almost like a
retainer that you put in and it gave you the
same It was a similar sort of idea.
Speaker 5 (01:25:35):
Yeah you use your sorry sorry, yeah you used your
tongue for it.
Speaker 6 (01:25:38):
We were struck by that because it was a guy
standing completely still with no expression on his face and
a bunch of people standing around him recording him on
their phones, like, this is like you've entered the Dada
Speaker 5 (01:25:48):
section of Eureka Park here.
Speaker 8 (01:25:50):
But all of those booths together were like a
tenth of Samsung's booth, and all of them were people
utilizing significant ingenuity to attempt to solve problems for real
human beings who were suffering as opposed to the Samsung booth,
which was this massive edifice of capital attempting to solve
the problem of like, well, what if your milk goes bad?
Speaker 4 (01:26:10):
Yeah, I know, and, you know, that's the thing,
I think. This inventive stuff
was the most interesting. I have a few friends with
degenerative diseases, and, you know, some of the
stuff that gets made and offered to be able to
give finer control over tasks that you need to do,
(01:26:31):
especially when you have sudden jumps in what you're, you know,
able to do or not do, very impressive. But
then, kind of similar to what you were just saying,
sometimes you literally can't get that made unless
you conjure up some sort of secondary application,
which is a shame and a problem with how tech
innovation proceeds, right? Because.
Speaker 2 (01:26:51):
You must show growth. You must be like this is
how this is going.
Speaker 4 (01:26:54):
There should be no reason why you have
to consider anything other than that immediate and
urgent use case, which is, like, someone losing the ability
to communicate with the outside world, outside of regulatory avoidance.
Speaker 9 (01:27:10):
So from the earlier conversation we've had several times regarding
regulatory power and authorized medical devices: all of these
products, I've sounded negative, like you're trying to
skirt and avoid these requirements. However, the agencies that certify
(01:27:31):
them do want to help people. They want people to
go through this process. So these are the things that
the FDA loves to see. And it's hopeful.
Speaker 1 (01:27:43):
Yeah, I mean, I guess you're talking about capital.
Speaker 3 (01:27:46):
I mean, the main worry is that if there's no
growth opportunity for a product, then there's no investment,
and that just kills products that have one, you know,
single use purpose.
Speaker 1 (01:27:59):
Uh and there that's all there is to it, because
that's all there needs to be to it.
Speaker 3 (01:28:03):
And I think that's one thing that struck me as well,
is everything needs to have bells and whistles that aren't
usually necessary.
Speaker 2 (01:28:09):
So one of the themes I've discussed on this show
is the Rot Economy, the idea that everything must grow
at all costs. And some of you, at the very beginning,
very unfair, were saying that he's just angry at nothing.
I'm angry at everything.
Speaker 5 (01:28:21):
I love it when you do your Donald Trump voice.
I can tell now I turn my head, but it's.
Speaker 2 (01:28:29):
I hope it's been obvious how pernicious this problem is,
because there are companies doing really useful things, and I
talk about my dyspraxia because, whatever platform
I have, I want people to realize: if you fucking
have this and someone makes fun of you, give me
their email. I will personally make them regret being online.
This is a personal thing. I was bullied, I will
(01:28:50):
bully back, very unfair to them. But the point is
there are real solutions to real problems, real things being
fixed today by companies that are actually raising money. And then
you've got the Samsung milk thing that uses generative AI
to assume when your milk will go bad, as opposed
to looking into the fucking thing, and it'll banter with you
a little bit, and the Ballie bot goes, "Whoops,
(01:29:13):
your milk has expired."
Speaker 8 (01:29:15):
I did see a product that I would say was
my best in show, in terms of products that you
see an ad for in the first three minutes of
a zombie movie and then it causes the outbreak, and,
like, that's the end of the world scenario. It was
called Veradox. It's a product that generates a
mist that sweeps over your fruits and vegetables to stop.
Speaker 5 (01:29:41):
Them spoiling, using plasma. Plasma. And here's the thing,
they explained this to me.
Speaker 8 (01:29:48):
They said it extends the shelf life. Like, you
can put this little box up and it'll get
the mist over groceries in your kitchen, or you can
use it in, like, in a grocery store to
get all of the produce, and it'll extend the shelf
life by thirty three percent. And that's a real thing,
that's massive, right? Like, I have no ability to
vet this based on what I know. Like, I have
absolutely no ability to vet it.
Speaker 5 (01:30:08):
I'm gonna huff that mist. I'm not, I'm gonna huff
that. I did.
Speaker 8 (01:30:12):
Buddy, didn't get high. So, tasteless as far as I'm...
so, quick question, anyone.
Speaker 6 (01:30:17):
It's like, if the thing that extends the shelf life of your vegetables
leaves you glowing, youthful, yeah, and then also, like, there's
like a seventy percent chance you grow a tail. Hey, hey, hey,
whoa. Yeah, you're trying to not sell... Some people are...
that's, like, the most innocuous, like, that actually sounds good,
like it's similar to something you would see, and they just.
Speaker 5 (01:30:37):
Called it, like, Doom Slayer, like, but it's for
your cucumbers, like, we just, we thought it sounded cool, volunteer.
Speaker 2 (01:30:48):
But also, two things. Well, let's just focus on one,
which is, what the fuck is the mist? What's it made of?
Speaker 8 (01:30:57):
The way they explained it is that it's a mist,
and it kills before you actually get, like, molds starting
to form. It kills them, so it extends the period
of time. The molds, it's so far just the mold,
they said, because I asked if it was dangerous and
they said no. I have no ability to vet the
Veradox people at this moment. I'm not trying to
(01:31:17):
shit on that. Maybe this will massively improve the world, okay?
It just seemed like a product that caused the apocalypse.
Like, okay, that was when somebody said we.
Speaker 5 (01:31:27):
Make it at Veradox. This is like, oh, you're
going to kill everyone. I love.
Speaker 4 (01:31:32):
Okay, great, taking a wet market, that's spread from a bat.
Speaker 2 (01:31:38):
So many Nine Inch Nails fans here, like, a Year
Zero. I'm thinking of, this is the big hand from
the Year Zero.
Speaker 8 (01:31:45):
Yeah again, this scene right now could be the start
of the Apocalypse movie, you know.
Speaker 5 (01:31:49):
Okay, one of us who's alive in three weeks, like,
is thinking back to this as they're fighting off Veradox.
Using it again to complain about the end of.
Speaker 2 (01:31:59):
Jujutsu Kaisen, the manga. I'm going to just really
put this in them.
Speaker 4 (01:32:05):
How did you want Jujutsu Kaisen to end?
Speaker 5 (01:32:07):
Okay, well, did you want to get into it? Do you
want to take a break?
Speaker 2 (01:32:13):
Well, first of all, the people of the Jujutsu
High school thing, they were very unfair to mister Ryomen
Sukuna. They spend a lot of time building
up abilities that do not manifest into... and these
real people, are these... and I'm being very... you
made this podcast happen. Consequently, the ending of Jujutsu
(01:32:39):
Kaisen involves a bit where Ryomen, mister Ryomen
Sukuna, mister Satoru Gojo, he's very unfairly treated. They show
him winning a battle and then at the end of
the battle he's dead. He's treated so unfair, and then
it's just a simple dream sequence, which is very unfair.
Speaker 5 (01:32:58):
Rigged. Unfair for mister Ryomen Sukuna. Now I'm thinking, now
I'm thinking, I have a different Trump impression. I am
so sorry, I can see.
Speaker 2 (01:33:18):
I'm sorry that this is how David Roth feels about
the Mets. So I'll move us back to the podcast
as we wrap this bad boy up.
Speaker 5 (01:33:28):
I have no ability to vet it based on the conversation.
Speaker 2 (01:33:32):
I have you never will. I genuinely want to thank
everyone else in this room because we're doing another episode tomorrow.
But this show has been conceptually one of the more
insane things I have done, and I must give real
credit to Sophie Lichterman, who is one of the hardest
working and also most patient people in history.
Speaker 5 (01:33:52):
Oh yeah, and so oh yeah, she.
Speaker 2 (01:33:56):
Is the only person who can really move Robert, oh yeah,
who is, come on, one of the single most talented people
I've met in my fucking life. And my, come on, there
we go. And Robert turned to me a year ago
and said, you seem to be more pissed off at
these people than anyone. And he was wrong, only because
(01:34:18):
I was yet to get pissed off enough. Robert has
been insanely supportive of me in a creative way that
no one else has, and honestly, everyone in this room
has. Gare turned to me and said, you seem to
just be a series of grievances, and Gare was completely correct.
Speaker 5 (01:34:36):
It has been so fun hearing you get progressively more
angry over the course of a year. It's it's been
a real treasure and I mean it sincerely.
Speaker 2 (01:34:46):
I say this as someone who is a peer of yours.
The work you make seems to get better every episode,
and you're incredible at it. Thank you so much.
Speaker 8 (01:34:52):
You're getting none of this, Robert, You're just very good around.
I say this as someone who is profiting off of you.
Are you in a stanton?
Speaker 5 (01:34:59):
Should we should we make what that is like just
hard medication. I'm just I'm just worried, dude.
Speaker 2 (01:35:04):
I am, like, the healthiest I've ever been. Like, my
VO2 max is, like, doing well, like, and
my dog's just really happy. They don't know what I did, ever since.
Speaker 5 (01:35:12):
I got a Veradox. And that's another CES miracle,
which leads, which leads me to the, which leads
me to the last thing that I'll say. The
first thing we all said on episode one is, this
feels like the CES from last year, and we're not
(01:35:34):
the only ones to think so. Last night I was
at the Chandelier in the Cosmo. That is the place.
Now we can write off the drinks, and thank you.
There we go. And Robert, I was sitting at the
bar and Robert was upstairs waiting for a table, and
we got a table, hard to do, and I said,
it's another CES miracle. There was two people
(01:35:56):
sitting next to me who turned, was like, what CES?
Or, there was two people sitting next to me who
turned and said, what CES are you going to? This,
this has been terrible. And I had a great,
if brief, conversation with these people, who were two exhibitors.
Now, I was a bad journalist, I was, I was
(01:36:18):
too drunk. I did not, I did not learn which
company they were from. But there were two people who
were exhibiting at CES, and, like, this feels just like
last year's CES. I was like, yes, this is what
me and all my friends have been saying. Like, it is,
it has been so disappointing. And it's not just us,
like, you know, somewhat tech-critical journalists. It's people who
(01:36:40):
actually, like, go to CES to present who are saying
the exact same thing. Like, yeah, this is, this is
last year's CES, but worse, because there's nothing new. Yeah.
And, and this is, like, I did not feed them this.
They turned to me and said this, like, unprompted. And
they're like, what kind of miracles are you seeing? Like, no, no, no, no,
(01:37:00):
no, no, that was sarcastic.
Speaker 8 (01:37:03):
It's remarkable to see. And David, I'm gonna guess your knees
are about as bad as mine. But starting out in
twenty ten, like, right after, kind of, the iPad came out,
there was so much excitement every year, of, like, I'd
never seen a thing that could do this. Every year,
I'd never seen a thing that could do, like... and
that would be every, like, room I walked
into. Every, like, thirty minutes, I would see a thing
that was like, I didn't know technology could do that
(01:37:25):
until this exact second.
Speaker 5 (01:37:27):
And that's just not CES anymore.
Speaker 6 (01:37:31):
Yeah, I mean it's like even as like a soft
touch you can be impressed by, like a cool new
TV screen. It's not the first TV screen, you know,
not to brag, but there is like, yeah, there's certainly
that sense. It's interesting that like the people that are
showing here were also kind of like what am I doing?
Speaker 5 (01:37:45):
Like, I feel what am I doing here?
Speaker 2 (01:37:47):
Yeah. I think, and what Better Offline has tried
to do with this entire show, is: CES is a
combination of people who don't want to be here and
do want to be here. And a lot of the
people that want to be here are the people that
already live here to sell services to those who are visiting.
But the fundamental problem with CES is that the show
itself doesn't seem to be serving the use case of
(01:38:10):
making things happen in the future. It's like, how can
we make the present continue for longer? Yes. And the
thing is, something I've tried to do with this, and
I should be clear about what this is: everything you're
hearing this week is something I came up with about
five weeks ago. I wanted to do this. I booked
this months ago. Robert then booked, much later, It Could
(01:38:33):
Happen Here, and Robert, please get it.
Speaker 5 (01:38:35):
I was hungover at all. He is.
Speaker 2 (01:38:38):
Actually, Sophie is so much stronger willed, and that's very true,
willing to respond. But Robert has been an incredible mentor,
and I'm going to be sentimental and you're just gonna
have to fucking suck it up. All right? We're an
award-nominated podcast. I can do whatever the hell I want.
But what I think this show needs going forward is
(01:38:59):
more independent voices. And it means bringing in people like
Kyle from the Las Vegas Sun and allowing them to speak.
Because it isn't so much that these journalists can't
say what they want; it's that formats demand things in
certain ways, and the way to talk about tech is
no longer as flat as "the product exists" or "tech
company sucks." I can do both, but the thing
(01:39:22):
is, having these different voices, having these people talk about
their experience of CES, is on some level an explanation
of how the tech industry feels. They're in this thing
where you have this dichotomy between this vast milieu of
different things that are like, hey, what if this happened,
you'd then give me money. What would that be like? Well,
(01:39:43):
I'd have the money, and then you'd have the product,
and did something good happen?
Speaker 6 (01:39:47):
Oh?
Speaker 2 (01:39:48):
I don't care. And then you have these people like,
I'm gonna fucking solve, people with eye twitches, people with
twitching in their eye, which turns out to be a
huge industry of people who are genuinely fucking suffering, people
with way bigger problems, like mobility problems, having that problem
solved in real time. But the people that get talked
about are, I have the most massive television. And no, I
(01:40:10):
actually take that back.
Speaker 5 (01:40:12):
It's not even the big TV.
Speaker 2 (01:40:14):
I want a larger TV. I want like a two
hundred and fifty inch fucker. I want to watch the
Raiders lose.
Speaker 5 (01:40:22):
Well that was so.
Speaker 8 (01:40:23):
I had a moment at the Samsung booth, and they
were showing off one of those transparent TVs, and
there was a lady whose whole... Another one? Another one.
Well, I'm probably the same. Was... no, that was LG.
Well, I'm probably the same was No, that was LG.
Speaker 2 (01:40:32):
Last year it was LG.
Speaker 5 (01:40:34):
Yeah, I think it was LG. Yeah, yeah, I it
all blurs together.
Speaker 8 (01:40:37):
There was a lady behind a transparent TV whose whole
job was, when the TV went transparent, to wave her hand behind it,
and I was like.
Speaker 5 (01:40:42):
That's that's your job.
Speaker 8 (01:40:44):
And it was this recognition as they're, like, walking.
Garrison and I sat through a guy who played
us AI-generated scat and tried to convince us that, did we,
we know a lot, they needed musicians, even. Great.
Speaker 5 (01:41:00):
I would know No.
Speaker 8 (01:41:02):
The next time someone played that, is a situation, it was,
it is very much a Falling Down situation, yes. But
the realization that, like, there is a chunk of guys
running big tech who will see holding your hand behind
the transparent TV as a thing that human beings should do,
but not making music, and, like, that upsets me.
Speaker 2 (01:41:25):
It's just, it's frustrating, because entering into this, and the
format that I created in a Google doc three weeks ago,
and some of, some people even read it. Uh, the
goal was to try and pull out how the show
affects people, and indeed what the implications of this show do.
(01:41:46):
I'm now going to return to the sentimental bit I
was diverted by. I want to thank every single fucking
person who listened to this but also joined me on this.
Gare Davis, one of the single, insanely young, and in
a non-specific way, I realize, the single most talented
person to come into anything associated with tech. Insanely young,
(01:42:08):
but also insanely prescient and aware of the social
issues, but also the context of basically everything they look at.
Robert Evans, the single most focused but disorganized person I've
ever met, but also someone who cares so deeply and
has such an innate talent at finding talent and empowering
(01:42:31):
those voices. Without Robert Evans, I would not have done this.
And I did try and, like, wave him off when
he offered the podcast to me. I was like, yeah, mate,
show me a fucking budget, doing the actual
jack-off gesture in front of the computer. Robert
and Sophie Lichterman, who will never get enough compliments. And,
by the way, universal law with Better Offline: if you
don't love Sophie, I will fucking kill you. Not literally,
(01:42:54):
but I will think about it very fucking aggressively. Robert
has actually had faith in me that most people haven't,
and the result is a fucking successful tech podcast that
does things more successfully than most of the tech podcasts
out there. Otherwise, Robert supports Gare, who has been so
incredible and will do better work than I will ever do.
(01:43:16):
But you know what? That's what doing good shit is:
knowing the people who do things well.
Speaker 8 (01:43:20):
What I've been really excited about this year is getting
to meet David and Edward, and now, this is
the first time we're in a room together.
Speaker 1 (01:43:27):
But it's not we talked yesterday.
Speaker 5 (01:43:28):
Oh shit, I was so drunk. So sorry, Kyle, during
the panel. During the disinformation panel. Oh no, no,
I was.
Speaker 8 (01:43:36):
I was just just high on mushrooms. Oh okay, and
I've got some creative still in my part.
Speaker 4 (01:43:44):
I'm okay.
Speaker 2 (01:43:45):
I was trying to do like a sincere moment. Now
let Robert finish his fucking thing.
Speaker 8 (01:43:56):
I was just saying, like, it's been very exciting to
get to meet these folks, some of whom I had
been reading for a while, or some of whom I'm
excited to start reading, and get to, like, make these
connections. Because in a CES that is so anti-human,
it's nice to make connections to people.
Speaker 2 (01:44:09):
And I'm about to harpoon you with sincerity. Robert and
Sophie have been the single, and Gare as well, have
been the most supportive creatives I've ever fucking worked
with in my life. I have had issues with believing
in myself and believing what I can do. I have talent, whatever,
like, who fucking cares? Email me if you're mad, wherever you're
listening to this, little pig. But the thing is, these
(01:44:32):
people believed when, even a year ago, I didn't think
I could fucking do this. And now look at me,
I'm a fucking ultra ponce, and it rocks. And I
believe what Cool Zone Media does is the future of
fucking creativity. The idea that a big corporation can give
someone multiple seasons to work out their audience and work
out what they're building. You look at Behind the
(01:44:54):
Bastards, It Could Happen Here, Sixteenth Minute, Hood Politics, Cool People Who
Did Cool Stuff. There is so much cool shit that
comes out of the idea of, damn, what if you
give people more time to build something than seven minutes?
What if you didn't rush them to make something good?
Better Offline at the beginning was fucking rough, but we worked
(01:45:15):
it out, and you people seem to like it, and
the people in this room are fucking adored. But don't
We're not done. Phil Broughton over there, Phil Broughton,
health physicist. He's picked me up two or three times,
and I've been, like, shitting my pants. Not literally, I've
never shit my pants around you. Let's not talk about
that further. But the truth is, Phil has been here
for multiple CESes tending bar. Grab the mic, you motherfucker. Tending
(01:45:40):
bar for people and ultimately doing the thing of asking
them why they are here. I think that is the
most valuable thing you can do at the Consumer Electronics Show,
asking people the reason they are there and finding out
what it is they actually fucking care about. And you
do that so well while also serving various bourbons. And
as they ran out, he replaced them.
Speaker 9 (01:46:01):
And I like this one, by the way. One of
my, that would be a pour of premium, uh, cask
strength rye. That was a goddamn journey that made fury
for weeks, finding it, Phil.
Speaker 2 (01:46:14):
I love you, I love all of you, seriously. Thank you.
Speaker 9 (01:46:18):
One of the things that made me happy is for
everyone that I picked up at an elevator. By the
time we'd hit the twenty eighth floor, I knew what
I was serving you.
Speaker 2 (01:46:27):
Thank you so much, man. David Roth. David Roth is
someone I brought here.
Speaker 5 (01:46:32):
No, take the fucking mic. You grab the goddamn mic.
Speaker 2 (01:46:35):
I will shove it in your mouth. We all know
how good I am, Ed, which is why it's necessary
to remind people. David Roth is, is actually my
favorite writer, and he is one of the single most
empathetic people who understands why people enjoy stuff, and his
(01:46:55):
sports writing and cultural writing is genuinely influential over everything
I've ever done. And I'm absolutely going self-indulgent. There's
nothing a single fucking person in this room could do about it.
Mattosowski is just sitting there being like, yeah, vacuuming up.
But David Roth being here is like watching the movies
with eBAM, and it's, genuinely, I'm getting a little emotional.
Speaker 5 (01:47:15):
But thank you, David, and thanks for having me. For real,
I feel the same.
Speaker 2 (01:47:20):
Edward Ongweso Junior. You are an undiscovered talent, and anyone
fortunate enough to listen to this should know they should
hire you immediately, because when you finally take off, when
you finally get big enough, you are the single most
capable. Grab the microphone.
Speaker 4 (01:47:35):
Motherfucker, hello, I said, when I get my gun.
Speaker 2 (01:47:40):
Okay, let's walk back the gun.
Speaker 4 (01:47:43):
Okay, let's cut that out on the words, my.
Speaker 5 (01:47:47):
Matt, this is still keep it in.
Speaker 2 (01:47:49):
But just to be clear, the gun was a metaphor
right for my potential.
Speaker 5 (01:47:58):
I'm really trying. Sorry, no more, no more.
Speaker 2 (01:48:03):
I know. You're not done, Ed. There are, there really
isn't, there's not a writer that can do the kind
of labor reporting you do and has the understanding of the
human experience that you have. You are just at
the beginning, and I can't wait to bring you back
on this show and have you do more. I love
all of you so much. But also, your potential is barely...
(01:48:25):
"Your potential is barely getting started" is some
shit you'd see on a CES banner.
Speaker 5 (01:48:29):
Though, this is the worst.
Speaker 4 (01:48:33):
This is the worst I'll ever be.
Speaker 5 (01:48:36):
That's right.
Speaker 2 (01:48:37):
That's right, brother. That's why I'd put it on Hinge.
I actually might put that on Hinge. Any, any
singles want to email me? Like, that's what's.
Speaker 5 (01:48:48):
On, darling. I know I've vomited in your car, but
let me assure you this is the worst I'm ever
gonna be.
Speaker 2 (01:48:57):
Oh my god. You may think I just meant you,
and that's true. You came on this show completely, you
were actually well prepared. You had a laptop out, which
fucking rocks. You actually, like, took this seriously, much like
I did with my laptop, that I don't have,
and the preparation I definitely did. Here's the thing. On
(01:49:19):
an instinctual level, when your first thought is labor, that
says a lot about you as a person. It's incredible
that you immediately jumped to the hospitality workers in this
city that are affected by the various conferences, and you thought,
how the fuck does this affect them? Because when it
comes to Better Offline and what this fucking show actually means,
it's what the actual effects of technology are on people.
(01:49:41):
So thank you for joining us, of course, and please
keep doing your shit. You're going to be back on,
because we live in the same city, we do, and
I will be finding you.
Speaker 8 (01:49:48):
God. No, God. Somewhat more actionable threats in these episodes
than I'd expected.
Speaker 2 (01:49:54):
That's not a threat, it's a promise.
Speaker 5 (01:49:56):
Oh, thank you.
Speaker 2 (01:49:59):
I would thank me. And then, of course, to Mattosowski.
Mattosowski is our producer, and he is the silent party
who has been sitting here patiently the entire show, working
through, texting me time signals. When I text him, I
have burped, he edits that out, because it's happened three times,
I believe, Matt. And I bring up Matt Hughes here,
(01:50:23):
my word editor, who has been so patient with me,
and Megan Wernstromo over at Penguin, who has also been patient
reading the dross. I assume she says it's good. But
back to Mattosowski. Matt has been here since the beginning,
when I was really not fucking confident about this show
and I was genuinely experiencing, like, real anxiety and worry.
(01:50:44):
Matt encouraged me, worked with me, and edited. Matt, love
you, man. Thank you so much for your hard work
this entire week. Could you come over to the microphone
and just say hello? Hello. Mattosowski is a protected party.
If you ever wrong Mattosowski, I will destroy you, actually,
everyone in this room. But really, like, if you fuck
with Mattosowski, I will actually fucking peck out your eyes,
(01:51:05):
in a podcast voice, like a big bird. If you
play Final Fantasy Fifteen, specifically the Zu bird. Very big bird.
You've been listening to the Better Offline CES experience.
I am so grateful that you've been here, and tomorrow
you'll get a final wrap-up episode. But this is
really, like, the wrap night. I am actually really grateful
(01:51:28):
for everyone who came in. Everyone's really fucking showed up
and just had an incredible show, for a show that
is so regularly so fucking miserable and so lifeless, about
people that people are imagining rather than real people. And
everyone's been here and just shown an incredible fucking show.
I'm so grateful to everyone for being here. Thank you.
And I'm now going to pass the microphone around, starting
(01:51:52):
with Zai, who's been here. Zai, please join us. Thank
you for... hi. Hello. You've been wonderful taking photos. Thank you.
And then we'll now go to Robert Evans. Thank you, Robert.
Speaker 8 (01:52:02):
I just want everyone to know that, you know, as
the Veradox mist takes your loved ones, Ed,
Ed cares about you.
Speaker 4 (01:52:10):
Phil.
Speaker 9 (01:52:13):
I want to thank everyone. It's been a pleasure to
be here and to serve, to serve and be of service.
Speaker 5 (01:52:19):
David Roth. Thanks very much. It was a good time and
I feel like I learned a lot.
Speaker 2 (01:52:24):
Mister Wrongways.
Speaker 4 (01:52:27):
Had a great time, and I'm not going to make
any more legally actionable threats.
Speaker 2 (01:52:31):
Thanks.
Speaker 5 (01:52:35):
Yeah. Kyle, you can just say... Yeah, this was, this
was my first CES.
Speaker 3 (01:52:39):
Ever, and I really enjoyed figuring out what was real
and what wasn't and support local journalism.
Speaker 5 (01:52:47):
Yeah. Well, and thank you, Ed, for putting this whole
space together. This has been a fantastic escape away from
the show floor. I spent five hours in the Venetian today.
Coming up here was a wonderful reprieve, and you really,
you really put this all together. So thank you so much,
and your hard work over the last year on Better
(01:53:07):
Offline has been, has been lovely, has been lovely to watch.
Thank you so much. And thank you so much to
your listeners. If you've got to this point, I, I'm
so sorry, but also, thank you for your patience. I
hope I've successfully encapsulated CES. I've given you the various
juxtapositions of the show. Tomorrow you'll have a wonderful
(01:53:30):
positive masculinity day, as the remaining crew, me, Mattosowski, Phil Broughton,
and Edward Ongweso Junior, kind of, like, smooth.
Speaker 2 (01:53:37):
Ourselves out, real smooth-like. But today's the last day
of the show. Email me, ez, or ease-ed if
you're one of those people, at Better Offline dot com.
Please let me know what you thought. A lot of
you have emailed, like, we had, we didn't have enough
women on this. Next year we're going to fucking
(01:53:58):
correct that. We luckily got the victorious Karissa Bell and of
course Cherlynn Low, who really balanced that out. We'll do
a better job last, last time. Wow, the concept of
time is fucked up. Nevertheless, we'll do a better job
next year. And I love your feedback, and I actually
read all the emails. Will I respond to them? It
gets increasingly harder each, each week, which is a good sign,
(01:54:20):
I guess. Either way, I'm very grateful for all of you.
A lot of people have given, a lot of people
have given me faith, given my, given their faith, Jesus Christ,
in the show in the last year. It's only going
to get better. But next CES is going to
be weirder. Gare has ideas, I have ideas, and it's
(01:54:42):
just going to be sharper. It's just going to be weirder.
You'll have another episode. There's going to be more Veradox
too. Palpatine will be back.
Speaker 1 (01:54:51):
CES twenty twenty four, Part three.
Speaker 2 (01:54:53):
I'm actually going to try and contact the e mcdummot.
But anyway, thank you so much for listening. I know
it's a lot of audio, and, and I'm really
grateful for everyone who contacts. I love everyone who listens.
I genuinely am so grateful. I am not really good
at hiding any of, like, the me in this, and
a lot of podcasts are very performative. I've, Phil can
(01:55:15):
speak to this more than anyone, I'm not really good
at, like, pretending. There is no off switch. I genuinely
love you all. Thank you for listening. Thank you for
listening to Better Offline. The editor and composer of the
(01:55:36):
Better Offline theme song is Mattosowski. You can check out
more of his music and audio projects at Mattosowski dot com,
M A T T O S O W S K I
dot com. You can email me at ez at Better
Offline dot com, or visit Better Offline dot com to
find more podcast links and, of course, my newsletter. I
also really recommend you go to chat dot wheresyoured
(01:55:58):
dot at to visit the Discord, and go to r
slash BetterOffline to check out our Reddit. Thank you
so much for listening.
Speaker 1 (01:56:05):
Better Offline is a production of cool Zone Media.
Speaker 2 (01:56:08):
For more from cool Zone Media, visit our website Coolzonemedia
dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.