Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Also media.
Speaker 2 (00:04):
Forty years ago, the old gods decided to create a podcast.
Are capable of withstanding twenty hours of podcasting in the
space of five days. I am at Zeitron and this
is better offline CS coverage. We are large and in
(00:27):
charge in the Palazzo Hotel in Las Vegas, beautiful Las Vegas, Nevada,
with an open bar, Tacos and the strongest bench of
tech commentator talent in the known universe. As every We're
bringing you coverage of the world's biggest tech conference. It's
day two, Episode two, and our first contestants have just arrived.
Joining me for the first block is the amazing, victorious
song of the Verge. Hello, And we've got the wonderful
(00:47):
stand up comedian host of the Factually podcast, Adam Connover.
Speaker 3 (00:50):
Hello, and of course.
Speaker 2 (00:52):
The legendary broadcast of My Dad's Gonna be So Happy.
Rory Kecklin Jones, who writes the Always On newsletter and
the mo is on the Movers and Shakers podcast.
Speaker 4 (00:59):
Thank you for your sorry, Hello, so pleased to please
your dad.
Speaker 2 (01:02):
Oh, He's gonna He's gonna be elated, So Victoria. You
describe your beat as the Curse Technology Beat and I
think this is a great place to begin.
Speaker 5 (01:12):
Not for my mental health, but sure.
Speaker 2 (01:15):
Yes, but for the full of yucks. Perhaps what have
you seen? Just just start anywhere?
Speaker 5 (01:21):
Okay, So with my editors, you know, we're talking trend
pieces right now, and they're like, what's your what's your
trend this year? And I was like, bodily fluids. We
are talking piss semen, uh, blood urine, bodily fluids, because
you know, my actual beat is wearable in health technology, right,
(01:41):
so you know, I'm you know, I'm surprisingly light under
unwearable today.
Speaker 2 (01:46):
How many people got today?
Speaker 5 (01:47):
I only got two? The other day I was racking.
Speaker 2 (01:49):
Four like a parallax style things.
Speaker 5 (01:52):
Yeah, actually, actually no, I have AI You you sent
me these?
Speaker 2 (01:57):
Yes, yes, don't name them insuls.
Speaker 5 (02:01):
The AI generated insuls. My feet seas still hurt, So
there we go. H But yeah, so you know, longevity
quote unquote longevity tech is in right now. Wellness is bullshit,
as we know, but a lot of my beat deals
with wellness, and therefore the snake oil perpetuated by wellness
(02:22):
influencers in the world, and I get to have the
lovely job of going, what bullshit are they going to
make me? Wear this year. So I've worn uh, I've
worn in led face mask that makes Jason Vorhees look funny. Okay,
but you know it's not the Jason Warheas type led mask.
(02:42):
It's you'll see it tomorrow on Leverage Dot. There's that
I closed the loop on the taint zapper and you know.
Speaker 2 (02:52):
I'm gonna stop you then, Okay, what is that?
Speaker 1 (02:56):
Okay?
Speaker 5 (02:57):
So we have to go back six years because this
is a year. This is a story that is six
years in the reporting and the making. So CES twenty twenty,
right before the world closes down because of COVID. I'm
at Cees suffering, I have a cold or whatever. I'm
at CS Unveiled, which is the inaugural Nights showcase of
vendors and whatnot. I'm walking along a long what am
(03:17):
I going to cover? What am I going to cover?
And I see a table. It's one of those booths
that's very empty and sad, right, and on the table
there is a mannequin and it's just this mid torso
section of a male mannequin with Kendall smooth giblets, and
it's on its back so its tant is exposed to
(03:38):
a roving crowd of tech journalists and on on the
goos of this, of this, a mannequin is a band aid, right,
And on this sparse table, I really have to paint
the picture. Here there's a laptop, and on the laptop
there is a slideshow presentation. And I want you to
think of a black and white picture of a couple
(04:00):
in bed and they're disgruntled. The woman is on the
bed and she's sitting here with her arms folded and
just like a look of disgust on our face. And
the man is at the edge of the bed with
his head and his hands and he's so sad. And
the text on this slide show is premature ejaculation is
the number one male dysfunction. So I take a picture.
(04:22):
I was like his moto at the time. So I
take a picture of this and I send it to
our slack channel, and my editor goes, bitch, you must
write this, and I go, okay, so I'll just describe it.
I wrote it. A week later, the CEO Lovely Man, Jeff.
He reaches out to me and he goes, I because
I had some burning questions about this taint banding. I
was like, is it gonna rip off the hair off
(04:44):
these men's gooches, Like, is what are we doing here?
And it's like the proposed it was a concept wearable.
So the proposal was that to help men with premature ejaculation,
it was gonna zop you, okay, to you know, delay orgism.
And so I was like, I have so many questions.
I asked many penis half others that I knew. I
(05:05):
was like, would you would you try this? Would you do?
And they're all like, bitch no, And so you know,
a week later he reaches out to me. I was like,
I want to answer some of your questions. No, it's
not going to rip the hair off. We've done a
lot of research into the type of adhesive views. And
I was like, okay, I'll write that follow up. The
next year, I see them again and they have the
most They have this gift that shows how it works
(05:27):
and it's like a if you were on the floor
and you were panning up, it's like the view of
the gooch and just it's the most cursed gift I've
ever seen in my life. Right, it's in our social
video if you want to see it. But yeah, so
that was there and we were talking and he's like,
we've discovered something mysterious in our testing. And I was like, oh, okay, cool,
(05:50):
what sure? So I met up with this guy this
year and it's now a product. They are FDA cleared.
Nice And the mysterious discovery that they discovered was that
not only can it help men with premature ejaculation delay
orgasm mm hmmm, the electrical signal contracts the prostate and
(06:14):
makes ejaculation more forceful.
Speaker 2 (06:17):
So how did they measure.
Speaker 5 (06:20):
That it was discovered? It was discovered in test.
Speaker 3 (06:25):
You can tell when you do it how forceful it is.
Speaker 2 (06:28):
Well, I've never had sex, so yeah. No.
Speaker 6 (06:31):
Sometimes sometimes you're in a good mood and you hit
the wall and and people it must have experienced this.
Speaker 5 (06:38):
Yeah, I actually you know that year after, when I
was doing the follow up, I talked to a user
who had used it. You wanted to be anonymous, and
he was like, it's saving my marriage. It's great, it
doesn't hurt, it feels great.
Speaker 2 (06:49):
I'm being kind of cynical, but it seems like it's
actually possibly use.
Speaker 5 (06:55):
But you know, there's there's like a lot of meaning
between FDA proved, FDA clear isn't listed, it has to
do with the type of medical class device it is
and whatnot. But but, but but I did get to
try it myself on my forearm, as I do not
possess the actual body part necessary for real life testing,
so I put it on my forearm. I got to
(07:16):
try it out. It costs three hundred dollars for the
starter kit it has.
Speaker 2 (07:21):
Does it require a subscription?
Speaker 5 (07:22):
No, And there's no AI involved, so you know, but
you know you can. You can customize the zapp pattern,
so you can you can. There's a strength level of
one to one hundred. I could feel the strength at one,
which I was yeah, on my arm, tained which I
was told that makes me a unicorn because most people
(07:43):
can't actually feel it at one. Like he was saying
that depending on your body, uh, how hairy you are,
how much body fat you have, you might need to
like go up to twenty five to feel something nice.
Speaker 2 (07:56):
That's that's yeah, that's good. I'm trying, like I'm trying
not to like just laugh at the idea of bums
just like just like wooo bum poo, but like it
seems like an actual, like weirdly useful thing, like well, you.
Speaker 5 (08:07):
Know, a lot of men don't want to talk about
premature ejaculates.
Speaker 2 (08:10):
I don't want to talk about sex at all. Like
that's like that's a universal thing with the Fellas, Like
this came up last year with Shad and like men
just like you start talking about sex and like no,
I couldn't possibly.
Speaker 5 (08:19):
Yeah yeah, And you know, the their founder is a
really good sport because Jimmy Kimmel did roast him on
a monologue in twenty twenty two about it, and he's like,
that's fine. I just want people talking about it, and
I want men who suffer from this to actually do
something for you. Jeff, Yeah, Jeff is like he's.
Speaker 6 (08:34):
A cool d And the FDA clearance would mean this
it works, like they've tested it or is it just
not going to kill you?
Speaker 5 (08:43):
It means that it's safe basically. So they did have
to test it on rabbits, So.
Speaker 4 (08:49):
You mean they didn't have a double blind placebo control trial.
Speaker 2 (08:52):
I love the one that doesn't.
Speaker 6 (08:56):
Think you think you've been shocked, but if you have
an actually been so just it just naturally hot.
Speaker 5 (09:02):
It feels more like a vibration, to be quite honest,
It's not like a type of feeling. But what FDA
clearance means is like there's different types of medical devices.
There's Class one, which is like a tongue depressor, it's
not really doing anything. Class two is like moderate risk.
And I believe this would probably be in class two
because they require five ten clearance and basically they have
to go through testing for safety, efficacy, privacy.
Speaker 2 (09:26):
And this one. This is all of it. It's past them.
Speaker 5 (09:29):
Yes, it passed that. It took a long time because
this is a six year period. I want to say,
like they had to go through two rounds of funding,
like FDA clearance, millions of dollars, millions of dollars.
Speaker 2 (09:41):
Right now, the Titan Zeppr is the most functional product
we've discussed on the show. Like I'm deadly serious, Like
we spent the last two outs spend like, yeah, this
doesn't work. This doesn't because it's not. The Titan Zuppr
is real and it's real.
Speaker 5 (09:51):
And you know, so part of the thing was is
that you know, they had done their initial round of
gathering data, which took them a year and a half
to gather all all this data, and then the FDA
was like, yeah, but we're gonna need you to do
animal testing so you're gonna have to put this taint
zapperro ond bunnies.
Speaker 2 (10:08):
And just to be clear, that's not the tits.
Speaker 5 (10:11):
No, it's not the bunny taints. They're putting it on
the bunnies because bunnies are furry and some taints are furrery.
So you need to make sure that when you put
this adhesive on on on skin that when you take
it off, you're not causing rashes, irritation, ripping things out.
So you know, they he told me that, you know,
they'd have to put the patch on the bunny every
(10:32):
single day and monitor it for a few hours and
you know, and yeah.
Speaker 2 (10:37):
So Rory, moving on, moving on from the tank for now,
I'm just a dog.
Speaker 4 (10:41):
I've wandered off the streets and it's an old BBC man.
Speaker 5 (10:45):
Oh, it's actually not called a taint zapper. That's just
what I've called it, called more m O R.
Speaker 2 (10:52):
That's a horrible That is one of the worst names
they could have come up with.
Speaker 6 (10:57):
It's called more but it sounds like one thing you
don't get is more. You don't like there.
Speaker 3 (11:03):
You don't get more more velocity.
Speaker 5 (11:05):
More, more velocity, more lasting power. The company behind it
is called marai inc and maori and Latin means delay,
so shocking.
Speaker 7 (11:15):
Call it longer longer long yeah, with no e Yeah.
Speaker 6 (11:21):
Because then it's like you're going longer and your dick
is longer. Yeah, men like longer.
Speaker 2 (11:26):
That's and that's the that's men like longer. But Rory,
what have you seen at the show? Anything unrelated to bums?
Speaker 4 (11:36):
I'm just that I've not got any human detritus to
bring before you. I mean, I've seen something incredibly wholesome.
Speaker 2 (11:47):
Please please tell me.
Speaker 4 (11:49):
I went to the Lego press conference. I used to
come here a year after year for the BBC and
it was a nightmare because of the time difference, and
we had to make a store on a Sunday and
a Monday to go out on the Tuesday eight hours
behind and be desperate. And I learned a good lesson there,
(12:11):
which is never go to a CS press conference because
they're terrible. Because every single press conference features a CEO
who thinks he's Steve Jobs unveiling the iPhone and he's
not Steve Jobs and he hasn't got an iPhone. And
I know this because my very first CES was my
(12:33):
best ever because it had a historic event in which
in the BBC, having paid an awful lot for a
lot of us to come over, led by me, I
said to them, listen, We're going to take a day
out and leave Ces and go to think called Matt World,
because there's a room if something BIG's going to happen.
And so I was there when Steve Jobs did unveil
(12:55):
the iPhone, which was better than any event that I've
ever seen it fifteen.
Speaker 6 (13:02):
When he goes, when he goes, are you getting it?
That's my favorite part.
Speaker 3 (13:06):
Yeah, you get it?
Speaker 6 (13:07):
Were you getting it in that moment? Were you like, wait,
it's one device?
Speaker 2 (13:12):
Well.
Speaker 4 (13:12):
I was kind of shocked in a kind of very
British way by the fact that there were people standing
up and cheering. Were journalists we don't cheer for. There
was not the requisite level of cynical old hackery, but
even I got carried away. I knew something was big
when my desk in London rang people who were not
(13:36):
remotely interested in technology and said, you've got to get
your hands on that phone, and I said, you are deluded.
Apple is like North Korea, but less less open. They
will never let me near a phone that's not coming
out for six months. Are you dreaming? And then I
(13:57):
remembered that we've been promised an interview with Phil Schiller,
a tubby little marketing guy who did not quite have
Steve Job's charisma.
Speaker 2 (14:07):
And I had said, yeah, yeah.
Speaker 4 (14:08):
Yeah, we'll do that, thinking no, we won't, and I
changed my mind and we headed back to the Moscowe centerency.
We would like that interview with mister mister Schiller. After all,
he appears, I see you don't happen to have the phone,
and he said, oh yeah, handed it to me. I
did my piece to camera and did an interview with
(14:33):
Phil Schiller, which we didn't use, and I had my
shot on the on the main news and the following weekend,
a tech columnist in The Venerable Observant newspaper described me
as looking like a French medieval peasant holding a piece
of the one true cross us, which was because God
(14:54):
damn Christ, I had actually got the whole of the
damn thing. So, which is a long way around saying
CS has been sort of twenty years of disappointment for me, mainly,
but this year I decided, against my better insects, I
would go to one press conference. Yeah, and that was
(15:15):
the Lego one, right, and the contrast between that experience
and sitting through I didn't bother to go, but watch
the live stream of the Nvidia press conference was quite instructive.
Speaker 2 (15:29):
We hurst ourselves with that on the first day. It
was truly truly boring.
Speaker 4 (15:34):
Yeah, but where is the great thing about the Lego
one was we got a little freebie, a little package
of bricks on every seat. It was short, it was
forty five minutes. It was the toy industry and the
media industry because Disney and lucasfilms were there and Star
(15:58):
Wars characters telling the industry. Actually we know a bit
about storytelling, and which I'm afraid dear mister Jensen Wining
he obviously did not.
Speaker 2 (16:09):
He doesn't and it's weird. Other than these shiny jackets,
he doesn't really have much going on. And mostly I
was already the listeners already heard me say this, but
it's like it was mostly him just repeating stuff from
three to six months ago. It felt kind of washed,
and I mean it was quite enjoyable.
Speaker 4 (16:22):
When the tech when he's describing the most advanced, exciting
technology in the world and his PowerPoint failed.
Speaker 2 (16:30):
Someone's getting shot.
Speaker 4 (16:31):
Well, somebody, a whole team is going to be replaced
by AI within seconds.
Speaker 5 (16:37):
I do respect when it's a live text presentation now though,
because you know now they just flies to the Steve
Jobs Theater for ALP. Watch a filmed presentation in a theater, like.
Speaker 4 (16:49):
No sense of peril.
Speaker 2 (16:50):
Yeah, that's you know what in video could use a
little peril in that. There should have been some trap
doors or something. It was. It was very bizarre. It
was like two hours of a man clearly trying to
feign excitement and he's just like, yeah, Blackwell might come
out at some point. Yeah, it's in full full manufacturer,
and just the audience going yeah. The reaction was really sad.
(17:13):
It was it was, it was. It was.
Speaker 4 (17:15):
It was like a you know, a failing politician at
party conference setting up the que lines. Applaud, applaud and yeah.
Speaker 2 (17:24):
Yeah yeah it's thank god. Okay, it's good to good
to know other people heard that. But let's let's change,
because Adam, what have you seen? You've just got here,
So what wonders if you bold?
Speaker 6 (17:35):
You know, I'm sort of drawn to everything that's not
AI related.
Speaker 3 (17:41):
First of all, I was.
Speaker 2 (17:42):
Just walking your goal the section.
Speaker 6 (17:47):
Well, what I found was h Strangely, I saw like
five or six products related to teeth.
Speaker 3 (17:53):
Have you guys noticed a teeth trend? The teeth Every
everywhere I'm looking.
Speaker 4 (17:58):
There's usually it's an AI toothbrush.
Speaker 6 (18:00):
The first thing I saw was I was all the
AI you know, blurred out, and I just my vision
honed in on something called the y brush. Yeah, it's
a toothbrush, but what if the end of the toothbrush
was the same shape as all of your teeth, So
you just put it over all of your teeth at once,
and it brushes all your teeth simultaneously, And you know
(18:24):
what as the owner of an electric toothbrush, and it
has a timer and I'd do it for two minutes
and it's like thirty seconds per quadrant and it's so
boring and I have to listen to it to like
two minutes of a podcast. I was like, you know what,
ten seconds to brush all my teeth? Yes, someone buy
me this shit at Brookstone for Christmas.
Speaker 3 (18:43):
Thank you. This is what cees is for. This is
what it's supposed to be. Is dumb garbage that says
is going to save you a minute and a half. Right,
that's nice.
Speaker 2 (18:52):
I like that, did you see anything that was useless?
I was just describe the rest of the cees. I
guess do what else did you do? You see more
teeth related things?
Speaker 3 (18:59):
Yeah, look, I even got a teeth I got. I
got a free night guard. So I.
Speaker 6 (19:07):
Have a loose plastic bags and beautiful I've taken out
of my hoodie pocket.
Speaker 3 (19:13):
I've produced it.
Speaker 6 (19:15):
Yeah, so there was a smart night guard that I wear,
a nightguard for tooth grinding. And it's a night guard
that will much like the taint zapper when you grind
your teeth, it'll buzz in the middle of the night
to stop you from grinding your teeth. Which reminds me
of when I was a kid and I used to
wet the bed and my parents would put the pistol
arm did it? And you know the pistol arm. No,
(19:37):
this is from the early nineties. This has been very
high tech. It's a little so you know, if you
used to wet the bed, it was just common. You know,
it happens to a lot of people. Ed like, you know,
I'm like older than you.
Speaker 3 (19:50):
Should be on wedding the bed and and so.
Speaker 4 (19:52):
Once again, when you get to a age, well you
would be interested in this product.
Speaker 6 (19:58):
It's it's a pad like with election roads out that
you put into your underpants and then when it to
text moisture, it's connected to a little speaker that is
like attached to your shirt.
Speaker 2 (20:07):
And so this is a so this is sorry, I'm lost,
it's the piste along.
Speaker 6 (20:14):
Now I'm describing something that was inflicted on me as
a child that frankly, was almost like a form of
abuse to stop me from doing something that just.
Speaker 2 (20:21):
Naturally stops, like something that kids do accidentally.
Speaker 3 (20:25):
Yeah, like a very common thing. But it's a shame
alarm right now, like.
Speaker 4 (20:29):
A short color for dogs, is that right?
Speaker 3 (20:31):
Yeah?
Speaker 6 (20:32):
Yeah, this is the same thing for grinding your teeth.
And actually it's not a bad idea because I do
I can't.
Speaker 2 (20:38):
Yeah, I can actually see the use of that because
I grind my teeth sometimes as well, and I'm I
find myself accidentally doing it.
Speaker 6 (20:44):
Yeah, and I wear a night guard, and you know,
sometimes I'm like, am I grinding my teeth? Do I
need to keep wearing the night guard? Or I switched
to sleeping on my side? Is it still happening?
Speaker 3 (20:54):
You know that? I was like, all right, that's kind
of nice.
Speaker 6 (20:56):
And then they they gave me a they molded a
nightguard to my teeth, and they gave me a free nightguard.
And this sort of like loose bag. You might use
the bag produce at the supermarket.
Speaker 2 (21:05):
This looks like what a criminal would hand drugs to someone.
And it's just.
Speaker 5 (21:09):
Like going to central seat over my teeth.
Speaker 2 (21:14):
On your headline, I sober, it says Steph Cory finally
finishes eating mouth God. But for the three basketball listeners, no,
that's that's good.
Speaker 6 (21:22):
There's a jerkoff machine. You saill a jerk off machine
right in the Eureka.
Speaker 3 (21:27):
It's called the Handy.
Speaker 5 (21:29):
Yeah, that's good, and.
Speaker 6 (21:30):
The handy too. They've got both of them in the
same what's the sequel? Like, it looked like that's basically
the same thing.
Speaker 5 (21:36):
Just shakes heart.
Speaker 8 (21:38):
Yeah.
Speaker 5 (21:39):
I saw that thing two years ago and I was like, okay, cool,
that was that's funny looking. And then it's back here
again and I'm like, bill more the same.
Speaker 6 (21:47):
It's it's like, got all this copy where it's like,
don't worry everybody looks twice like, yep, it really is.
Speaker 3 (21:53):
It's really hamming it up, you know, the only.
Speaker 2 (21:56):
Way to do that is be super serious about yeah,
just like you're like, excuse me, said something funny about this.
Speaker 3 (22:02):
Yeah, see that would be better.
Speaker 2 (22:03):
That would be so much more interesting. It's like a
huh yeah, we went there.
Speaker 6 (22:07):
Yeah, it's very much the Cards against Humanity. They're like
just for nerds.
Speaker 3 (22:12):
To be able to I recognize this as comedy.
Speaker 5 (22:16):
You know, and be able to let I mean CS
as a show is extremely prudish and does not fuck
like the Rue.
Speaker 3 (22:23):
Yeah, well, the jerk Off Machine is not a fucking machine.
I'll tell you that.
Speaker 5 (22:26):
There was a time where there was like a like
back in twenty nineteen, twenty twenty, there was like a
controversy because they had given a vibrator, uh vibrator. It
was like a micro robotic kind of lingus dildo type
device and it was the Laura de Carlo and they
had given them a innovation award and they're like no, no, no, no,
(22:47):
we're gonna take the innovation award away. And then everyone's like.
Speaker 2 (22:49):
Why did they take it away? Because it was imbrober
uh so they could give it to just something that didn't.
Speaker 5 (22:55):
Exist, Like yeah, they were just like this is not
this is not robotics. And everyone was like yeah, it's robotically.
Speaker 6 (23:04):
Like they gave it and people complained and then they
took it away again.
Speaker 2 (23:07):
That's so that the most cowardly fuckless ship I've heard
in my.
Speaker 5 (23:11):
Then the next year, the next year, they're like, no,
we falk get all the sex tech companies to come
back and we'll have a sex tech pavilion. And they
had a whole camper. They had like a build your
own dildo experience, which I did and you know I
was reporting on it. And now it's just like, where's
(23:32):
the sex tech companies. It's one hand job device thing,
and it's like, okay, I would.
Speaker 3 (23:36):
Have expected build adild to be a big hit.
Speaker 6 (23:39):
You know, it was that year they could, you know,
have have it be a kiosk in the mall. You
know you go with your for your friend's birthday.
Speaker 2 (23:47):
Not smaller, smaller, smaller, you got any Yeah, we got
a special one out back right, not small than that. No.
I like this though, because David Roth sadly was only
here yesterday because the sex conversation we had towards the
end of Lost the Yes, just watching him like, no,
(24:10):
he's a lovely guy. I'm sure he's had sex. I'm
not going to comment further on that before I can
get comment from him, which would be really funny to
text him. Actually lost the Yes, I got in trouble
for texting my family group chat what my dad's horny
level was. Sorry, dad, you can now hear this anyway?
Your son's doing you proud? Yeah, it's it's it's actually
(24:30):
kind of nice to hear this useful stuff.
Speaker 1 (24:32):
But I guess it usually is.
Speaker 5 (24:34):
It's just takes six years of you know, development.
Speaker 2 (24:37):
Of bum buch.
Speaker 5 (24:39):
Of taint research is just.
Speaker 3 (24:41):
Like coming back here again and again.
Speaker 4 (24:44):
Yeah.
Speaker 5 (24:45):
Yeah, And it takes a long time to get some
of the stuff actually made. It's it's not like a
you know, a lot of the I say, a lot
of the tech I see here on my Beata's vaporware,
because it's just like where where is it?
Speaker 1 (24:57):
Where is it?
Speaker 5 (24:58):
Sometimes it takes six years to get approval, you know,
and bunny testing to make sure it's safe enough for
human use and that there's a scientific reason why you
might want to actually use this shit.
Speaker 2 (25:09):
Yeah. So yeah, it's funny like the non AI stuff
is like for real people doing things like fucking.
Speaker 5 (25:15):
Well, you know, they are looking into AI for the
tant Zappera as well.
Speaker 2 (25:20):
So I just I if you can as a listener,
if you're hearing this and you see something at CS
with AI and it works, please let me know because
so far, I'm oh, four hundred I think it is.
Speaker 6 (25:31):
I had a really interesting conversation with a woman and
it was like the sort of conversation where I was like,
I wish I had had a camera I was recording
it or something, because it was one of the first
boots I went up to and I'm like, I flew
here today.
Speaker 3 (25:44):
I didn't get enough sleep.
Speaker 6 (25:45):
I'm like kind of zonked out, so I just walk
up to this thing and it's like AI fast food
ordering thing. Yeah, well the most obvious possible thing. But
the guy demonstrates it and he's like, I have a
coffee please, do you want to make it a calm?
Speaker 3 (26:00):
Yes, I want to make it a combo? Could I
have a croissant?
Speaker 6 (26:03):
And it goes five croissants and then do you want
to make it a combo? He says yes, and then
it makes one of the croissants a combo, and he
says could I remove the other four croissants and it
goes yes. See it can remove stuff you know, and
they're like, what do you think? And I was like,
I mean, it seems worse than ordering. If I were
to confront this rather than a person like I wouldn't
like it.
Speaker 2 (26:23):
I'd be actively annoyed.
Speaker 3 (26:24):
Yeah. And you know they're like, well, you know, sounds bad.
Speaker 2 (26:27):
I'm like, yeah, yeah, but it sounds bad, but also
it is.
Speaker 3 (26:31):
Yeah, it needs a bit of wo But what I
asked the woman was, I was like, she was very
nice to engage in this conversation.
Speaker 6 (26:37):
I was like, because she was saying, you know this,
people want a good customer experience, and a person doesn't
always give you a good customer experience.
Speaker 5 (26:47):
For some robot doesn't, so.
Speaker 2 (26:49):
I give you a bad one every time.
Speaker 3 (26:50):
Yeah.
Speaker 6 (26:51):
She was like, I checked it at the hotel today
and the guy was rude and and you know, the
AI can never be rude. It'll always be in a
good mood. And I said, Okay, even if that's true,
if you you know, did the travel I did here today, right,
You got in a way mow. Then you get to
the airport, there's no person to check in. It's just
a kiosk. There's no TSA agent, there's no person to
(27:13):
talk to you in security. When you go get your coffee,
it's just a you know, a thing you talk to
and then it dispenses it. On the plane there's no
one to talk to. You get to the hotel, there's
no person there, right, and it just gives you a key.
Is that a world that you would like to live in?
Speaker 2 (27:28):
Yeah?
Speaker 5 (27:28):
Japan they do that in Japan.
Speaker 3 (27:32):
Okay, we do that there.
Speaker 2 (27:34):
Yeah, but it's cool there.
Speaker 4 (27:36):
I don't know. Yeah, as a as a UK citizen
coming through the American border, I'd quite like to you know,
global entry is have a rope book.
Speaker 6 (27:47):
There's certainly some things, but like she goes, she goes. Well,
you know, I like talking to people, but most people don't.
I like it, So, you know, I would maybe prefer
to have a person. I'm like, I think most people
do like other people.
Speaker 5 (28:01):
To your point, I was at pepcom last night and
we saw this thing called ag.
Speaker 2 (28:05):
I bought and I've heard about that.
Speaker 5 (28:07):
So this this this giant bot and then there's like
a smaller version next to it. So just imagine this
that the smaller version is like doing taichi and break
dancing like a psycho bot. It's like quite smoothly too,
very disconcerting. And the big bot is supposed to be
shaking tech reporters hands because the purpose of this bot
is to greet people at museums and that stuff. And
(28:27):
it's like it's supposed you're supposed to walk up to
it and it has like a doesn't have a face,
it just has a screen and it's like, my name
is Luca, say hi to me or whatever, and you're
supposed to say Hi, Luca, and it's supposed to put
out its hand, and you're supposed to put your hand
out and shake the robot's hand. And so my colleague
Jen Twey, she's our smart home reviewer, and she's like
on a robot mission this particular cees. She puts out
(28:51):
her I'm like, Jen, you know you're doing the robots.
Let me get a video of you shaking this robot's hands.
This robot was like incapable of shaking her hands, and
the one time it does, it pulls her hand down
and she's like Jesus Christ, and then it just starts
flirting with her and.
Speaker 2 (29:07):
We're like, oh, have you heard about the more?
Speaker 1 (29:12):
Yeah?
Speaker 5 (29:12):
But like the funniest thing about this show is that
this is the largest tech show in the in the US,
it's the big thing of the year. No one can
get fucking Bluetooth or Wi Fi correct at this show,
and so all the demos fucking fail all the time.
Speaker 2 (29:25):
I just don't want to touch any more. I don't
want want to go places. I don't want to talk.
I like seeing people.
Speaker 4 (29:31):
You don't want to touch your robots.
Speaker 2 (29:32):
I don't want to touch anyone. I don't want to
shake hands with someone. I want to go to the museum.
Speaker 5 (29:37):
Just the robots.
Speaker 6 (29:38):
But to the point that I was asking the presenter
I was talking to like you you.
Speaker 2 (29:43):
I'm finally talking to me, I quite like And I
think if I just walk through a series of machines,
that was like when I went to Korea, there was
a lot of machines, but like still the occasional person.
If it was just to your description, Adam, just a
series of screens that I interacted, but that would fucking suck.
I'd feel a loan in a way that I quite
like being solitary, But that would just be like I
(30:03):
feel like I'm going through tubes.
Speaker 3 (30:05):
Yeah.
Speaker 6 (30:06):
I think what it's gonna be the future that they're
gonna have I describe, you know, when I go to
the Alaska terminal at Lax, they've specifically done this where
like there's no check in counter, it's mostly kiosks and
there's like two people roaming around. So it's the self
checkout of airlines. And whenever I fly Alaska, there's always
some fucking problem with my ticket. I don't know what
it is with their system, and so every time I'm like,
(30:28):
where's the person? And that's what it's gonna be everywhere
And this as I'm talking to this woman, she goes, well,
I want people to have the choice. Maybe they want
to use the automated system. I'm like, I don't think
we'll have a choice, and she goes, yeah, we won't
have a choice because it's gonna be so much cheaper
to automate everything they're gonna do, like.
Speaker 5 (30:45):
The robo things. You call your bank and you just
want to talk to someone, they're like, press one for
the thing that you don't need to do. I am
press see for the second thing you don't need to do,
and you're just on the phone. You're pressing zero.
Speaker 2 (30:55):
Give me I would take that, and we're going to
rotate to the next thirteen second. But I just want
to say as a British person, every fucking single American
voice activated thing sucks for me. It's just like, okay,
give me a reminder for seven pm calling your mother, calling,
(31:16):
calling mum, sending her your last link, which was the
teen sapper. All right, we're gonna rotate through the next ad.
I fully endorse and if you don't like it, email
me at god at gov dot biz. Welcome Berg, Welcome Berg,
(31:41):
Welcome Berg, Zitron, We're back. It's the Consumer Electronics Show,
better offline and joining me in the room is the
victorious song of the Verge, Robert Evans of Behind the Bastards.
Speaker 9 (31:50):
Yeah, I gotta do this real quick. Sorry, Yeah, I
just like on air was the best way.
Speaker 2 (31:55):
I've decided not to do that because I've drunk so
many diet cokes already and I'm I've got like three
more in me. And of course Adam kind of a
wonderful stand up on the host of the Factory podcast.
Speaker 6 (32:05):
You know, I stayed at your apartment once and I
was frankly concerned at how much diet coke was in
that apartment and no coffee maker. You're one of those
no are you die coke? In the morning, pervert.
Speaker 2 (32:15):
I'm a diet coke whenever. Perfect. Oh man, it was
ten thirty pm and I went to bed half an
hour later and I drank one of those motherfuckers with
my sleeping meds. Yeah, I'm a they're gonna.
Speaker 5 (32:29):
Say, so, caffeine doesn't affect you.
Speaker 2 (32:31):
It just it's fine. Yeah, Like I have like ADHD,
and I take medicae.
Speaker 9 (32:35):
Off prescription at all, but I.
Speaker 2 (32:36):
Also have concerto. Like they give me that, but the
doctor gives me that because I actually unsupposingly have ADHD.
You haven't worked that out from the tens of thousands words.
So Robert, you got the sea of intuit to say something.
Speaker 9 (32:50):
The CMO into it. Yeah, So I started my day.
All I did really today was panels. I was in
like six different fucking panels, and the first one was
about uh one, I should probably have had this.
Speaker 2 (33:06):
It's okay, intuit intu. It's wonderful as well because they
have this new chat GPT integration. They've got it.
Speaker 9 (33:13):
That's exactly what I was talking about. So I was
in this uh this panel, which is like a leadership
roundtable sorry on like technology and advertising. Although the actual
panel did not seem to have much to do with that,
and the CMO of Intuit was there and talked about
(33:35):
the integration that they've been doing with chat GPT and
talked about how like they're partnering with open ai so
that you have when you're doing your taxes, you have
access to like a chat bot basically to help do
and you can have an agent. The idea is an
agent will at some point be able to handle most
of this task. But yeah, well that's kind of the problem,
(33:55):
and like so there's there's a couple of issues with this.
When I first started, like, I had been reading about
the into it integration because they're putting like one hundred
million dollars into open ai million, not bl the trip.
I said that clearly. And when that first happened, I
read an article by a security researcher who was pointing
out that they are giving into it is giving read
(34:18):
and write access to an LLM jesus of your tax data,
of any of the tax data that's integrated into that,
which is a problem from a security point of view.
One of the things this guy points out is that
there's a kind of attack that you can execute against
(34:38):
AI models that is there's a kind of attack that
you can execute against AI. Yeah, yeah, prompt injection.
Speaker 2 (34:49):
And that's when you do you go to a website
which tells the LM to do something.
Speaker 9 (34:52):
Right, a website or one of the things that this
guy proposes. If you want to put in a fraudulent invoice,
you could have an invoice that has that's not like
visible to the naked eye, but that the LLLM would
read and that text basically executes an exploit. And there's
not a like you can't defend against these really like
(35:13):
there's not a functional way to stop this. The point
he was making is that it's not a question of
whether or not there will be a data breach of
people's tax data because they're integrating it with this chatbot.
But when And the question that this researcher had that
I thought was a really good question was who is
then responsible? This is Chris Black published this in a
(35:34):
website called a raptus, But who is responsible if your
data all gets leaked because of this integration? Is into
It responsible? Is open AI responsible? Is whatever third party
you're using that uses into it software responsible? So I
asked the Thomas renize, who's the CMO of into it
that right when they had time for questions, and I said,
(35:56):
I basically brought up what's in this article? I said,
given the fact that this is in an ability that
attacks like this will be carried out and data will
be leaked, who do you like who is responsible? Like
who should take legal responsibility in that instance? And like
what safeguards are you building to try and deal with
the fact that this is going to happen. And the
(36:18):
first answer he gave is that we're talking about it
with open AI, which which I viewed as a non answer.
So I came up to him after the panel and
kind of corning up a little bit and asked, like,
you have to have more You're the CMO. Yeah right,
you're like the chief marketing officer of this entire company.
Security is a pretty big marketing huge standpoint, right, Like
I would think you would know something about this, And
(36:41):
he eventually got angry at me and said, like, I
don't have any of that information. That's not my job
basically to care about this sort of thing. I've got
a recording of it we'll play at some point, but
that that's literally all that I got from him, was like,
we're talking about it with open AI, And I don't
really know A.
Speaker 5 (36:58):
Browsers right, because yeah, hair browsers doing the same thing.
They're so so, so so vulnerable to the prompt jack.
Speaker 2 (37:05):
I really like that. Also, they trying to pretend A
eyebrowsers are useful. They're not.
Speaker 5 (37:09):
I tested five of them. Me, I had to test
five fucking AI browsers. All I wanted was the pair
of fucking new balances, and it couldn't get me. I
had to get my mother in law to get me
the new balances, Like what are you doing?
Speaker 3 (37:23):
So what are you doing on the AI browser? Try
to get the new.
Speaker 5 (37:27):
So you know, they're like gent like AI. You can
ask it to find you the fucking new balances or whatnot,
and it's supposed to help you research this stuff. And
all these tech CEOs are like, well, the future of
computing is that you will not do anything on the computer.
You'll talk to It's kind of like Her, the movie Her,
where you just talk to the computer and it goes
(37:49):
here you go. So all I wanted was a pair
of fucking new balances. But I take my shoes very seriously, right,
and I'm like, Okay, I want a pair of new balances?
What these are the prep like the parameters that I want?
How am I going to get them? And my query
kept getting longer and longer and longer and longer and
longer because they can't into it things. So I was like,
my foot size is this, This is the stuff that
(38:10):
I have to deal with. This is the color I want,
this is the style that I want. These are the
scenarios that I want to wear them, and these are
the prices that I want to have. So it's like
a fucking three paragraph prompt that I have to give,
and I'm asking all five of these different stupid AI
browsers to just get me a pair of goddamn new balances,
and it's just like that, I can't do it.
Speaker 2 (38:31):
Oh So I like browsing the Internet, not as much
as I used to due to all the obstacles, but
looking for a pair of shoes, looking for something in
a store, that's kind of you know, you fuck around,
you learn some stuff. It's kind of fun.
Speaker 5 (38:42):
Like, yeah, no, I did ruin once again, cursed tech beat.
I did ruin my mental health just trying to bigger.
Speaker 2 (38:49):
Pair by three hundred word essay.
Speaker 6 (38:51):
Yeah, I mean the part of The enjoyment of purchasing
something is to know that you got the thing that
you want. You put the work in, you found the
like I'm trying to I've been trying to buy the
perfect set of towels for a little while and I'm
and I was like, I was like, no, I don't
want to get the Onsen towels that like the fake
Japanese towels. I want to find some real Japanese towel.
(39:12):
And so I found out about like a variety of
towel that's like only made in one town in Japan,
trying to find the right vendor. And I know when
I get the towels, I'm going to be like these,
I went through something to get these right. And that's
how people you want to feel about your new balances.
Speaker 3 (39:26):
These are so good.
Speaker 6 (39:28):
You want to I got the ones I put a
little bit of sweat equity in. You want to be
like I got some shoes. The computer bought me some shoes.
Speaker 5 (39:35):
Also, the computer doesn't listen. I said, I already have
the perfect running shoe, right, I thought I found that
I was stylistic. I want walking shoes. And what did
it do? It gave me the fucking running shoes over
and over and over again. I said I didn't want
the running shoes. I wanted the walking shoes.
Speaker 3 (39:55):
Right.
Speaker 2 (39:56):
This is kind of like every one of these AI
browsers I've read about. It's kind of the same things.
Like imagine, if you will, you're seeing that you want
to buy something, and there's just no way to do that.
It's just like an alien, an alien that's never used anything. Yeah,
how do we do anything? Oh, I'll type in and
it's like, yeah, off the three minutes, it might do something. Right. Yeah,
It's like fucking yeah.
Speaker 9 (40:17):
That's the that's the frust That's the frustrating thing about
sitting in all of these panels in particular, is that
it's like entering this alternate world where all of the
technology works much better than it does, and people talk
about it as if like, yeah, we deployed fifteen hundred agents,
right that are you know, they're doing great work, And
it's like, okay, but I'm looking at, for example, the
(40:37):
statistics that Copilot released last year that about so or
that were released about Copile last year that about seventy
percent of work tasks that it's tasked with, it fails
at y right, because the agents are not functional yet.
Speaker 5 (40:50):
So I you know, I'm trying to tell Atlas chatgbt's
thing to buy me these fucking new balances, and it's
just like, oh, crap, there's a pop up. Let me
try and close this pop up. Oh that didn't close
the pop up. Let me click more precisely at the
coordinates and it's listing the coordinates on the site. Let
me try and close the pop It took.
Speaker 2 (41:11):
Just gas fumes.
Speaker 5 (41:12):
It took five I'm watching it. It took five attempts
for the AI to close the pop up a minute
and thirty seconds. I'm just watching it, and I'm just like,
Jesus fucking crazy.
Speaker 2 (41:23):
Close.
Speaker 3 (41:24):
It's narrating as it's doing.
Speaker 5 (41:26):
Yeah, yeah, yeah, it narrates. It goes just like, let
me think about the best way that I can close
this pop up. It didn't seem like I clicked hard enough.
Let me look at the coordinates and click more precisely. Well,
that didn't order.
Speaker 7 (41:38):
I know.
Speaker 2 (41:38):
The compute on that is also bonkers.
Speaker 5 (41:41):
You know that they boiled oceans to watch and AI
struggle to close pop ups minute and.
Speaker 2 (41:46):
Seconds giraffes from the back of a truck into a
furnace to make this like a guest turbine in a
poor community in Texas. Yeah, many people got cancer and
you didn't get the shoes you wanted.
Speaker 9 (41:58):
Yeah, And it's in between panels. A friend of mine
back and Iportant sent me a video that's some guy
I think in Tennessee recorded of like the sound of
the data center that went up near its house. That's
like this, I mean for months has just been this
like constant like droning beep noise thing, like like it
just sounds awful. And so I'm I'm going in between
(42:20):
these people talking about like you know and really focusing
on how like AI's going to augment human capability and
you know, these agents are are only getting more capable,
and like we're going to be using this to like
really set human creativity on fire. And then I'm like
watching a video where a guy's like, yeah, now it
screams every day, the internet screams every day outside my house. Yeah,
(42:44):
And I'm looking at the statistics that show that like
none of this stuff works as well as these people
are are saying it does, right, like not to the
extent that like I would want it doing a job
that really matters, and I'm just so like in between that.
The other thing that I read while I was going
in between sessions was an article that came out today
(43:04):
about a patent Sony filed last year to basically create
use an AI agent to play video games for you
when you're trying to do something hard if you can't
beat a boss.
Speaker 5 (43:16):
Did you see Razor stuff?
Speaker 9 (43:17):
Yeah, well, and that's the other crazy I haven't seen
the booth yet though, but you want to talk about that.
Speaker 1 (43:22):
I saw that.
Speaker 5 (43:23):
I was in that really sweaty Venetian suite where they
were giving the preview of the role a.
Speaker 2 (43:29):
You know, it's so crazy.
Speaker 9 (43:30):
Oh my god.
Speaker 5 (43:32):
So they had two crazy AI concepts. First, this Project EVA,
and the only way you can describe it is a
little anime wife, the wife Tube. It's the wife Tube.
It's a little desktop wife tube where the holographic anime
wife who named Kia, and like there's a camera in
the thing and so she's watching it and she's gone,
like and it's based on grox Annie if you oh god, yeah,
(43:56):
they did tell they did tell us like specifically that
it was based on And I'm.
Speaker 2 (44:00):
Like, oh, did you ask if it was based on
an adult.
Speaker 9 (44:03):
Well, I'll tell you what it's based on. Visually, if
you ever watched the show Archer, there's like, yeah, creepy
Sid has the Waifu hologround looks identical smaller.
Speaker 5 (44:15):
I was there with my colleague Antonio, who Tania, Oh god,
sorry Antonio, because he does listen to your podcast, Antonio
di Benedetto and he's our laptop reviewer. And so they're like,
you know, Kira can help you when you're playing your game,
and so he's playing like this a first person shooter
but it's just like a it's like not an actual level,
(44:36):
so you can't die in this section. And she's like, oh,
this is the gun you're holding is not the correct gun,
and it's like, oh, you died and he didn't die.
And then they like programmed, they programmed like, oh, so
did you see anything really cool at cees and like
this is how she talks because it's based on and
he's like no, and she's like, oh that's cool. And
(45:00):
but so you can have different like avatars doesn't have
to be an AI wife who you can also have
Zane the AI husbando and he has like nake tattoos
on there and it goes like, hang.
Speaker 2 (45:15):
On, I'm stuck. I need to ask the wife tube again.
Speaker 5 (45:18):
But like that's the whole point of games is that
you're strugged, is that you're playing, that you're using the
problem solving and all that, and it's just like, no,
tell me how to beat this level. And all the
time I talk to these companies and they're like, yeah.
Speaker 10 (45:31):
You're going to be able to yeah, because they're training
it off of like YouTube videos of people playing the
game too, which I also I just don't know if
I believe it's going to work well.
Speaker 9 (45:41):
It is a pattern, so maybe Sony doesn't either, Like they.
Speaker 6 (45:45):
Are they showing the AI just like the video frames,
like the literal.
Speaker 9 (45:49):
The YouTube video from the patent from what I haven't
read the patent. I read an article about the patent application.
That's what it sounds like.
Speaker 6 (45:56):
You could you could program an old fashioned AI, what
we used to call AI in video games, just like
you know and n PC.
Speaker 3 (46:03):
You could just when you're making a video game, you
have the world state, you know, how the game works.
Speaker 6 (46:08):
You could just like have something in the game. I think, yeah, exactly,
that like is part of the game. It's an artificial world.
Speaker 3 (46:15):
Why do you need to insult a game?
Speaker 9 (46:17):
I think this is like Aana.
Speaker 5 (46:20):
It didn't ask if we should actually fucking make Krtana.
Losers just played Halo way too when they were in
their formative years.
Speaker 6 (46:28):
Yes, I think a big mistake that they're making, or
just which one. Well, look, there's such a big difference
between a tool that extends or enhances human capability makes
you more powerful, and you not doing it yourself and
having someone else do it for you. I literally, just
(46:49):
for the first time, hired an assistant because I the
perfect person who I knew would do is such a
great job for me, is available for a couple of months.
So it's great you correspond with she's amazing, right, But
like it's because I have worked with her before in
a different capacity. I know she's amazing. I like trust
her to do all these various things that I don't
have to do myself anymore. But the moment that, like
(47:12):
I work with someone who fucks something up, I'm like, well,
I can never work with the contract. I'm not going
to assign things to them. And like versus, when I
fuck something up with a tool that I'm using myself,
I'm like, Okay, I need to do a little bit
better or whatever. Like the experience of those things is
so fundamentally different. The use cases for them is different,
the sort of person who wants to do them is different.
(47:33):
And so like, I don't know, like for into it
to be a company that goes from you know, here's
tax software that helps you do your own taxes if
you would really like to do that, and yes, but
like to go from that to now we will automatic
will be an automatic accountant. Is like they don't understand
(47:55):
the gulf that they're trying to jump.
Speaker 2 (47:56):
Now, well, it's because they misunderstand. It's kind of the
ongoing thing of See, it's like people that don't experience
the problems they're trying to fix. Because the best assistant,
to your point is someone that you kind of hand
stuff up, like my written edits to Matt Hughes. The
whole reason Matt works well with me is he can anticipate,
like he kind of gets my thing and he knows yeah,
and I can trust him to get it without me
(48:17):
having to I don't know, give a three hundred word
prompt to explain every nuance because so much.
Speaker 5 (48:23):
Handholding and I don't want to hand hold an assistant.
That's the whole fucking the whole point.
Speaker 9 (48:27):
Yeah, it's handholding. And it's also like picking having an
assistant to do the things you wouldn't have an assistant do,
like I don't play a video game for you. Yeah,
but there's so much of it is focused on, like
the the tasks that are deeply human. Like whenever I
(48:48):
hear like this AI can help you brainstorm, it's like, well, no,
it can't. That's not how brainstorming works. Someone else can't
brainstorm for you.
Speaker 1 (48:55):
That's not You're.
Speaker 5 (48:56):
Just talking to yourself in the mirror. But the mirror
is a chapla.
Speaker 9 (48:59):
Yeah, and you were hoping to plagiarize a good idea
from the chatbot that itself is built from plagiarism, which
I also feel like is a mistake.
Speaker 2 (49:08):
But regurgitating thought into your own brain and the hopes
and new idea comes out.
Speaker 6 (49:13):
Yeah, I mean, like I've talked about this before, my
most positive experience with chat GPT was I was trying
to decide whether to get myself a little apartment in
New York and I'll talk to you about this, yeah, right,
whether it's a good financial decision, and I plug into
chat GPT. Hey, here's all the numbers, and like, is
this a good thing?
Speaker 3 (49:31):
You know?
Speaker 6 (49:32):
Would this be a big mistake? And it sort of goes, no,
it wouldn't be a big mistake. You could do that.
If you want it, you can do it. And then
I'm like, okay, but what about the fact that I
can't sublet it? Oh, that means it's bad.
Speaker 3 (49:41):
Now you shouldn't do it.
Speaker 9 (49:42):
Now you're fine.
Speaker 6 (49:42):
Oh, but what about the fact that it's kind of
important to me. Well, if it's really important to you,
you should do it. And I did actually find that clarifying,
like it did help me think through it all.
Speaker 2 (49:51):
But just agreed with your lost statement.
Speaker 3 (49:53):
I was talking to myself in the mirror.
Speaker 5 (49:55):
The only that's the most useful it is it's like
you need to talk out loud to yourself, to myself,
you can, But sometimes I can do that to you people.
Speaker 6 (50:04):
If you answer all my texts for some reason, because
my friend.
Speaker 2 (50:08):
And I love you, like that's why I do it,
because I care about you and want you to be happy.
And the chat ball.
Speaker 5 (50:14):
Doesn't can't, No, the chatbot doesn't care.
Speaker 2 (50:16):
I actually wonder if my latent like actual autism is
mounting myself on the show. Like, that's why it doesn't work,
because when every time I've tried anything like this and
just like just talk it to my fucking self, Why
are you agreeing with me on everything?
Speaker 5 (50:28):
But I mean like that's because that's you, like to
be real, that's I know. I'm talking to myself on
that too. It's just like it just doesn't work.
Speaker 2 (50:35):
Like I literally have tried to do the things. I'm
just like, no, just like there's a blog. I'm just
like it's just repeating more just faking said.
Speaker 5 (50:43):
Yeah. No, you're like you have to know the artifice
in order to get to a point where you're like,
if you understand that you're talking to yourself, you're just
thinking aloud in a different format, it can be helpful
if you understand that that's what you're doing. People don't
understand that's what they're doing. That's why they're getting the
AI psychosis. Yeah, they're mistaking this for a friend. It's
(51:03):
not a friend, it's literally a person.
Speaker 9 (51:06):
Yeah. I will say one thing that was interesting to me.
I did a panel that was Intelligence through Motion AI
takes physical form, and it was about like robots, robots
with AI in them. It was actually the better one
of the day. There was not much that was silly,
and it was I think it was a really good
example of like how you ought to be responsibly talking
(51:27):
about and thinking about products like this. So one of
the people who was there was Andrea Thomas, who's the
co founder and CEO of Diligent Robotics, and they've built
a robot that is basically an assistant for doctors and
nurses at the hospital, not doing medical stuff. But like,
the thing that she pointed out is when we because
we started and this was the first thing they liked.
We started not with a product idea, but by talking
(51:49):
to a bunch of doctors and nurses and figuring out, like,
what are the different jobs you have. And one of
the things they figured out is that nurses in particular,
spend a shitload of time every day running stuff back
and forth through the hospital, and so like, well, if
we have a robot that can move things and can
potentially move refrigerated things or move things that you have
to be very careful when carrying, then a skilled, trained
(52:12):
human being isn't doing that bullshit work and they can
do more medical work. And there's not enough medical profession
that's effectively a convey about right right, which that's more
maneuverable than a conveyor because it can go.
Speaker 2 (52:22):
But the point is like it's useful because it's moving
stuff between places, right right.
Speaker 9 (52:27):
And I believe there was some talk about other like
they're looking at adding other capabilities, but the way that
they were thinking about it were, well, first, we're going
to talk to the people who we want to we
want to be making products that will be used in hospitals.
Let's talk to people who work in hospitals to see
where it's useful. And then the other thing that they
said during that this was something a couple other people
(52:47):
pointed out, is that you have to when you're talking
about building products that human beings will be working alongside
in an industrial capacity, you have to assume that there
will be failures and build that in because otherwise you're
taking risks with people's lives. Right, So you have to
be figuring out, we want this thing to fail when
it does, because it will in a way that doesn't
(53:08):
endanger human beings. How do we build that in from
the ground up. This is like a fundamental assumption it
will fail. We need to make sure it fails in
a way that does not lead to it crushing or
killing people because it's going to be around them, And
so I really appreciate it. And it was such a
that was such a gap to me from how everyone else.
I'm used to other people talking about this a lot more.
Speaker 5 (53:30):
In the health techomy, right, yeah, because the stakes are
so high. Yeah, you know, like all the tech people
are like break shit, disrupt, but like medicine is like, no, no,
don't break the people.
Speaker 9 (53:42):
Something's already broken.
Speaker 5 (53:43):
They're broken already.
Speaker 9 (53:45):
Yeah.
Speaker 2 (53:47):
Yeah, I think one of the barriers to that though,
And this is not even me being cynical. It is
just you ask people what the problems are and then
you go, oh, you're an LM. Can't do that sorry.
Like the answer is the problems that you have are
not fixable with the technology we have NAT like housing
and health like health stuff. It's like there are very
big limitations because of medical device clearance. But also just well.
Speaker 5 (54:08):
Technology is streamlining, right, like if you think about the
like I just remember this from a macro acinemic class
in college. But like all technology does is it moves
the curve up. It doesn't. It doesn't like change the
angle of the curve or whatnot. It just you know,
allows you to do more. So it's about streamlining things.
It's not about fixing problems. It's about finding things that
(54:31):
you could do more efficiently. Being more efficient doesn't fix
the problem, right, different, it's two different streams that you're approaching,
but they're conflating it by being like, if you're efficient,
you fix the problems. Like, no, the problem isn't always efficiency.
Speaker 6 (54:45):
Yeah, and you know, I mean the thing I think
about a lot is what I'm in any city, but
especially New York or LA where I spend a lot
of time. It's like the physical infrastructure is so bad.
But everybody has cell phones right, right, Like no one's
even like oh, homeless people have We know, everybody has
cell phones, like because the price of them is so
cheap and and you can do a lot with them.
(55:05):
But like there's potholes, the trains don't run right, and
like those are the problems in society, and the fact
that we have you know, high technology in these particular
areas but not in like the rest of our society
is like a weird distortion. We all know this. We
don't think about it often enough. But like what you're
talking about is is this AI will like push that
(55:28):
distortion even further. Yeah, in the best case scenario and
people like you know, Simon Wilson or whatever will have
like some crazy agentic shit happening. You know, like they'll
they'll be you know, in some fucking matrix and everybody
else will be like, how do I buy food?
Speaker 2 (55:45):
Someone Willisson be trying to draw a Pelican on a bicycle.
I think that's the one thing that's literally the test
thrones on every LM. But yeah, Willison is kind of
realistic about and yeah, yeah, to your point, Robert, with
that panel, it's like, yeah, we're talking about this fantasy land.
But even when we so during the Nvidia conference, it
was talking about yeah, and now agents are getting even
more powerful in doing this.
Speaker 3 (56:05):
They're not well yeah, no they're not.
Speaker 2 (56:07):
But also gun to your head, explain what that means, like,
because I regularly am arguing with people about this online
perpetual that.
Speaker 5 (56:14):
Was they're talking about the agents and the consumerism and
the commerce aspect of agents just because that's the only
way they have a chance of making them.
Speaker 9 (56:22):
Such a lack of specificity, like that is the thing
that is most like in like the first three panels
I did, all of which were one way or the
other talking about like agents and their ability to like
improve productivity and creativity. There was an exam exactly one
specific example of a thing that people used AI, like
that a marketing or company used AI to do for marketing, right,
(56:44):
because these were all fairly marketing focused, and it was
specifically it was the company that makes Allegra had like
a new non drowsy formula, and they engaged in what
the the I think it was a deloittte I forget,
I forget who who said this, it's in my notes,
but what the person who was talking about it described
as model hacking to basically put out a bunch of
(57:06):
content so that chetch ept would repeatedly scrape articles or
whatnot that talked about how Allegra was non drowsy, so
that if people asked questions about it, the LLLM would
return both always would say l Allegra is non drowsy,
would like put that before the word Allegra, and would
like subtly talk about other allergy medicines causing drowsiness. And
(57:29):
they were talking about like and that's like a really
great example like the this is like a strategy developed
using AI that like, uh is not what we couldn't
have done marketing like this before. It couldn't possibly have
come up with a way to reverse. Yeah, first off,
like you're you're the most specific example of the use
of this technology in advertising is disinformation. Yeah right, basically
(57:51):
you know, or at least that yeah, on that spectrum.
Speaker 4 (57:55):
And that probably costs thousands of dogs, Like probably so
thousands of articles or whatever just fed into the training
Jesus fucking Christ.
Speaker 9 (58:05):
Well, and at the same time, they're talking about how
important trust is and that like you have to be
you have to keep and there's yeah as they lie
and they're like, well, you know, people only trust you,
will only trust your AI as much as they trust
your company, So it's really about like having a good
bread so the other way round.
Speaker 2 (58:23):
Well, that's what kind.
Speaker 9 (58:24):
Of what I took it as them saying is that like, look,
people don't trust AI, but they still have some trust
for specific brands. So if you really want to, like
there's a limited period of time for.
Speaker 1 (58:34):
For that trust to stay alive.
Speaker 2 (58:36):
It feels like blow it up.
Speaker 5 (58:38):
Yeah, that's honest to god, makes the show more boring.
Speaker 9 (58:42):
Yeah, it really, it's all fucking AI ship.
Speaker 8 (58:46):
Yeah.
Speaker 6 (58:46):
Like the stuff I was popping for was there was
a robot lawnmower that could go up a hill.
Speaker 2 (58:51):
Yeah, see that.
Speaker 5 (58:52):
All the robots are getting GEN again. Look like Gen's coverage.
It's just about how last year the robots had arms,
This year the robots have legs.
Speaker 2 (59:03):
Next year, big old Willie.
Speaker 3 (59:05):
There was also a.
Speaker 6 (59:07):
There was we were I was talking about the jerk
off robot last time. But first I saw there was
a robot called beat Bot, and it said.
Speaker 9 (59:14):
That was like a masturbation.
Speaker 6 (59:16):
That wasn't It was that nothing beats beat bot, and
I was like, this is a masturbation.
Speaker 3 (59:20):
Great.
Speaker 6 (59:21):
It turns out it's a little robot that swims in
your pool and cleans your pool. And I think maybe
it's also the Bluetooth speaker, and I was like, it's
like a pool room.
Speaker 9 (59:28):
But also you didn't if you were going to if
there's any good call for a hybridized robot, a combination
pool robot or gas and robot, like make the sexy
pool boy a robot.
Speaker 2 (59:43):
And it's just doing that weird song from you remember
that one. Yeah, there was the offbeat song the guy
was claiming he would only have sex to No. No
one remembers that. No, just oh god.
Speaker 9 (59:57):
Yeah, you can market it to Richmond with like wives
in their tow twenties where you're like, look, your wife
is going to cheat on you with the cabana boy.
Just make sure it's a robot cabana book.
Speaker 1 (01:00:08):
What you look depressed?
Speaker 9 (01:00:10):
Honey? Do you need some time with the cabana bot?
Speaker 2 (01:00:13):
On new music?
Speaker 6 (01:00:14):
Genuinely the best, one of the best use cases for
generative AI has sex stuff like it.
Speaker 9 (01:00:19):
It's certainly one of the most.
Speaker 5 (01:00:21):
Use Sorry, as someone who was subjected to testing Annie
the Grock sexpot, Jesus that was damaging, you might have.
Speaker 9 (01:00:30):
A workman's comp Yeah.
Speaker 5 (01:00:33):
Listen, I keep asking for hazard pay at the Verge
dot com and for my title to change to Senior
Curse tech Reviewer. It's not happened yet. It's not happened yet.
But I did have to test Annie, and boy did she.
Speaker 6 (01:00:45):
Yeah noticed, Like the only people who are really making
good use of like Sora style videos are niche fetish
communities on the Internet who have been able to move
from writing fan fiction for each other because that's a
cheap thing where they can, you know, have make media
for each other. Now they can make little videos they're
(01:01:06):
making and they're so happy there.
Speaker 5 (01:01:09):
Okay, but AI doesn't know how to animate tongues. And
I know this because again my job is cursed, and
I had to I had to test these AI video
generating more men kissing, making the AI kiss each other. No,
they don't, they don't. They don't understand how tongues work.
Speaker 2 (01:01:30):
It's just a little Jensen Wong said that the new
generation Vera Ruben is four times more powerful for training.
So maybe that's the Jensen Huang's bring the tongues in.
But sadly we are go, we're gonna have to end
this thirty second. The next thing coming up is an
ad for a tongue training models. And if you hear
(01:01:50):
something else that's an error. Whatever it is, just ignore
it because we're meant to. It's the tongue training app.
But before we get to the tongue training app, Robert
has a very important man.
Speaker 9 (01:02:00):
So I came across the booth during the brief time
I spent on the floor today of a robotics company
called Zeroith, which I think is a reference to the
laws of robotics, and you know, you had the normal
laws and there was the zero with law, YadA, YadA, YadA. Anyway,
we don't need to get into Asimov right now, but
it's they had a whole booklet. I actually thought these
were like notebooks at first, but every page of this
is printed, explaining the vision for their company and the
(01:02:23):
robots they make. The origin story. I'm just going to
read a little bit from here, Okay. The prevailing narrative
in robotics has long been one of replacement and utility,
Robots as instruments of precision, built for efficiency and stripped
of warmth. Yet from across the long memory of civilization,
the impulse to create was never merely mechanical. From yan
Shies graceful wooden automaton and ancient China to Tallos, the
(01:02:45):
bronze guardian of crete, our earliest machines were not tools.
There were mirrors, reflections of an ancient longing too, with
the reflections of an ancient longing to understand ourselves. Zero
withth emerges from that unbroken thread of imagination and hearing that,
and there's so much more flowery pros. I need you
all to see. The robot that they wrote that.
Speaker 2 (01:03:05):
Okay, let's let's take it.
Speaker 6 (01:03:12):
Why is he wearing a little Zo Polo shirt?
Speaker 1 (01:03:15):
I love that that's your first question.
Speaker 9 (01:03:17):
So listeners at home, what they've got is a robot
that looks kind of like a cross between Elon Musk's
shitty robot and a guy in a fencing helmet. It
can't stand on its own power. They have like a
sex harness built that's like standing it up. That's like
so it doesn't fall on its ass, and they put
an awkward T shirt on it, I think, to hide
(01:03:39):
the straps.
Speaker 1 (01:03:40):
From the thing that keeps it standing.
Speaker 2 (01:03:42):
Jesus fucking this is the consumer electronic show everyone. This
is yeah, they did. They are lying to you. Any
motherfucker that tells you that robotics ry is coming, they're
fucking lying anyway. If this is an AI add off
to this, that's the fucking problem. Diane. I'm once again
(01:04:10):
in scenic Las Vegas, and the locals are staring at
me with daggers in their eyes as I insist on
podcasting further. After rotating cubes in my head all day,
I'm now rotating. My guests joining me once again is
victorious song of the Vergine, not remotely. You're wonderful and
your very popular listeners love you as well. It's important
for you to hear this.
Speaker 5 (01:04:28):
Not from my ego.
Speaker 2 (01:04:30):
You need the ego. Ed On Guaiso, Junior of the
Tech Bubble newsletter and the incredible, wonderful stand up comedian.
The start of the movie is this thing on. Chloe
Radcliffe is joining us?
Speaker 1 (01:04:41):
Yeah, what is hell?
Speaker 2 (01:04:44):
Your first ces?
Speaker 4 (01:04:46):
Yeah?
Speaker 2 (01:04:46):
What do you think?
Speaker 8 (01:04:47):
I'm a cees virgin? Wow, there's so many boys.
Speaker 5 (01:04:53):
It's crazy boy.
Speaker 8 (01:04:56):
No, no, no, no, there's so many hairlines. Is I
should stay a lot of balls? Yeah, a lot of combs, Yeah,
a lot of a lot of comb comb over ponytails.
Speaker 2 (01:05:08):
Offer any hair stuff?
Speaker 1 (01:05:10):
Yes they do.
Speaker 5 (01:05:12):
Did you go to go to Eureka Park? There's always
someone who has like some led hair bands situation and
it's yeah that's good.
Speaker 2 (01:05:21):
Yeah. So Chloe, you'll first ces. Have you seen anything
that's stuck in your brain?
Speaker 8 (01:05:27):
My biggest take has so far from being on the
floor for a pretty limited amount of time, is that
the theme is numbers that you don't need.
Speaker 2 (01:05:37):
Okay.
Speaker 8 (01:05:39):
I stopped at a booth that had a smart dog cage.
Speaker 5 (01:05:43):
Oh, the smart pet crate.
Speaker 8 (01:05:45):
Yeah, yeah, smart pet crate and it tells you your
dog's It tells you your dog's heart rate and breathing rate,
and then it also tells you the distance between the
top of the cage and the dog.
Speaker 5 (01:05:57):
Welcome to my useless numbers.
Speaker 8 (01:05:59):
Uselessnessnumbers, And I said to the guy I was, I like,
stopped and was Lena, And I was like, well, why
do you need the distance from the top of the
cage to the top of the dog? And he goes, well,
that's how you know whether the dog is moving or not.
And I was like, yeah, but you just look and
he goes, well, but you're like looking at your app
and so you want to know maybe if the number,
if the number has stayed static for a little while,
(01:06:20):
maybe the dog has died. I said yeah, but wouldn't
you know that from the other numbers like heart right.
Speaker 1 (01:06:36):
To take your dog dead.
Speaker 5 (01:06:39):
They're gonna ramp up to four hundred biometrics. That's something.
Speaker 2 (01:06:44):
Your dog is Lena Dunham Inde. I love that well
because it's like, if the dog's moving, couldn't emotion sense,
I mean, there's so.
Speaker 8 (01:06:58):
Many things, couldn't We already have pet camps whoever has
whoever is.
Speaker 1 (01:07:04):
Buying a smart pet crate also.
Speaker 8 (01:07:05):
Already has a video camera trained on that dark's crane.
That person is not doing smart crate over pet camp and.
Speaker 11 (01:07:12):
They're like, hey, can you look at this graft?
Speaker 5 (01:07:14):
Do you think I've got I've got two cats and
a lot of cat tech. I don't need a crate. No,
Like I got auto feeder. It has a little camera.
And we did that because when we went to Italy,
I was just like, I got to make sure these
dumb cats are eating in between pet sitter visits. And
(01:07:34):
literally day one I'm in Italy, I watched my cat
look at me through the camera. It's pissed, knock cat rage, and.
Speaker 8 (01:07:45):
I do need to ask if the cats were not
eating between feed feeders, between pet sitters.
Speaker 1 (01:07:51):
Yeah, what were you gonna do for me?
Speaker 5 (01:07:53):
Nothing?
Speaker 1 (01:07:54):
Nothing.
Speaker 5 (01:07:54):
I was gonna call the pet sitter and be crazy
lady and be like, my extremely ro ton nineteen point
five pound cat is not eating.
Speaker 8 (01:08:03):
It's now living off its own body fat and it
will be fun. Yeah, and it has enough calories to
support it enough. But yeah, So I mean that's the thing.
I think that like all of these numbers that my
theme of numbers that you don't need. What they wind
up filtering into is anxiety. Oh yeah, it's like too
much information leads to anxiety.
Speaker 5 (01:08:24):
Senior cursed tech reviewer. Because I I'd test wearable. I
have useless numbers for humans, yes all the time. Yeah,
and just I'm completely overwhelmed and over inundated, and yeah,
it's very accurate.
Speaker 8 (01:08:38):
It would drive me insane all of the sleep tech
I'm The thing is that I'm like sort of the
wrong person to be here because I'm like, stay away
from me with your with your numbers and your vibrations.
Speaker 2 (01:08:51):
Having you here is good. We need that kind of
energy because there's too many perverts here. Like, actually, we
need enough number so we know how the space between
your dog and the seeing the crate, but what if
about what about the sides?
Speaker 1 (01:09:03):
Yeah, how are they going to get that?
Speaker 2 (01:09:05):
And also just I love the idea of getting a
notification about what does notification say your dog is this tall?
We've measured your dog.
Speaker 5 (01:09:12):
Like the smart little robots they're now having like cameras
so you can watch your cat go into the little
robot to piss.
Speaker 8 (01:09:20):
Wow, they already, they say, already weigh the cats as well,
so you can see how much my like I can.
Speaker 5 (01:09:27):
I can literally look into an app and go like,
oh you lost zero point five pounds on that douchie,
good for you.
Speaker 1 (01:09:36):
What do you mean eating?
Speaker 6 (01:09:37):
Brother?
Speaker 1 (01:09:38):
Brother?
Speaker 2 (01:09:39):
The only exception is is if I have a cat
with a urinary thing, how my beautiful, big, beautiful and annoying,
just like daddy. But he is this beautiful cat who
he occasionally has pe problems. But you can mostly just
get that by looking at your cat like you're just
checking occasionally be like, oh, okay, he hasn't paid in
a while. I don't need a four hundred and fifty
dollars WiFi connected thing to be like, are you sure?
(01:10:03):
I think that's how much they know?
Speaker 5 (01:10:05):
They're like eight hundred dollars.
Speaker 8 (01:10:07):
Fuck off, Sorry not to you, but the cat thing, Yeah,
how much do you love your cat?
Speaker 2 (01:10:12):
I love my cat so much. But a beautiful ARMANI suit.
I assume god he'd look fetching. He's a big, big,
white and ginger cat. Oh my god. No, I did
get him a bow tie at one point.
Speaker 11 (01:10:27):
Off, Oh I hear it. Yeah, I used to get
my cats.
Speaker 12 (01:10:31):
I had the both ties with bells so I could
hear them because they would disappear into yeah, I didn't
know where in the house, and they always would rip
them off and leave them yeah outside.
Speaker 2 (01:10:40):
Like a real fuck you, like just like that's where
that ship goes, right, don't don't put bells on me. Yeah.
I put a little cats on, a little tie on
him as well. Once. There's so many good things you
can do with the cat, none of which involve an
eight hundred and fifty dollars piss measurer.
Speaker 1 (01:10:56):
One time I tied a balloon around a tortoise. Didn't
why because we kept losing it.
Speaker 8 (01:11:02):
I lived and it kept getting stuck under the radiator,
but then we wouldn't find it for a couple of days.
Speaker 1 (01:11:08):
She was fine, but kept was just smoking around the
apoblem with the ball.
Speaker 2 (01:11:15):
Told version of up.
Speaker 1 (01:11:16):
Yeah, she was trying to get taken away.
Speaker 5 (01:11:19):
That should be at the but at the consumer electronic
show would probably be like a wearable tortoise tracker with.
Speaker 8 (01:11:27):
No no, you can steal a balloon, you can ruin
a kid's day to get your tortoise found.
Speaker 5 (01:11:32):
They would have like an optical tracker to read the
tortoises temperature and it's art right, so you go to
her app and be like, is my reptile alive?
Speaker 2 (01:11:40):
Is what happened to you? It's just a tool slowly
walking ye seven hundred dollars.
Speaker 12 (01:11:47):
There's this video from an account I followed because of
this video where they have a tortoise and they have
a cat and the tortoise has a skateboard to follow
the cat around all day and harassium, that's the tech
I would get.
Speaker 11 (01:11:59):
That's, you know, a little skateboard for my little guys.
Speaker 1 (01:12:02):
You know, someday we're all gonna go.
Speaker 2 (01:12:03):
Back, so is cats and skateboards. That's the real technology.
Did you get any other numbers that didn't that don't
match or make you upset?
Speaker 1 (01:12:14):
There was a there was a sleep bot.
Speaker 2 (01:12:17):
Nice.
Speaker 1 (01:12:17):
I think I'm calling it the right thing.
Speaker 8 (01:12:18):
It's a it's a smart pillow, okay, and uh it
seems like the actually useful thing is that it's filled
with these little like pneumatic air sacks that inflate or
deflate based on your snoring to stop you from and.
Speaker 5 (01:12:35):
That that beds. Yeah, okay, that seems.
Speaker 1 (01:12:38):
Right, like I can buy that.
Speaker 8 (01:12:39):
It's like that that that passes the sniff test. But
this pillow also when you're When it senses your head
is on the pillow, it turns off your lights, It
turns off your uh speaker if you have a speaker plane,
it closes your curtains for you automatically, and then it
this is not a lie, texts your family that you
(01:12:59):
are asleep.
Speaker 1 (01:13:02):
And I took pictures.
Speaker 2 (01:13:04):
Of pillow for a prison you know, I'm.
Speaker 1 (01:13:08):
Going to close. I'm reading.
Speaker 8 (01:13:11):
I'm reading the example text that it has sending a
message to the family. Cassie Oscar has been detected lying down. Wow,
Jimmy Oscar has been detected, lying down.
Speaker 1 (01:13:28):
I am I am not making this up.
Speaker 8 (01:13:30):
Doctor Larry Calibracy, Oscar has been detected lying down.
Speaker 2 (01:13:38):
Text he has this is this is a prison pillow.
Speaker 1 (01:13:42):
Yeah, this is just for what you do.
Speaker 11 (01:13:43):
If you got that text, I.
Speaker 2 (01:13:46):
Would be horrified. So funny, I mean everyone, and of
course everyone who comes to see yes with me has
to use I love the odor as well. That you're
just going to bed, your your blind's rope and all
of the lights are on you just blaring core when
I assume and I have no way of doing any
of this myself, So just my pillow will psychopath I know.
Speaker 5 (01:14:08):
Sleep tech is all about like optimization. That's why my
newsletters called like an optomizer. This whole beat of health
tech and all that is just about having the most
optimal something right. So they're like, oh, science shows that
if you sleep in like X, y Z conditions, you'll
sleep so good, So have a pillow, close all of
(01:14:28):
these things so you're in the most optimal sleep environment whatever,
and like, yeah, it is good to have an optimal
sleep environment. It is good for you to do all
that stuff. But you don't need to fucking text your
doctor calib races about your sleeping.
Speaker 2 (01:14:41):
That would be such a strange text to receive, like
he is, and especially with the way it was written,
like Galactus.
Speaker 1 (01:14:49):
Hated.
Speaker 8 (01:14:51):
Also, the smart pillow doesn't like smooth out your bed
sheets for you, you know, like all of this stuff
that actually make it everything that actually makes sleeping good,
which is like wear the right clothes for you to
sleep in and.
Speaker 1 (01:15:10):
Have the right bed you know.
Speaker 5 (01:15:11):
It's like, I've done the digital sh I've done so
much sleep tech testing, and the only thing that's really
worked for me and my spouse is unfortunately, a five
thousand dollars smart mattress cover, which.
Speaker 2 (01:15:24):
Is yeah, I burn. I pay for one as well,
and like people made fun of it because of the
a WS thing when it went down, your bed wouldn't work. Yeah,
that happens to me.
Speaker 5 (01:15:37):
I think it happened. It definitely happened.
Speaker 2 (01:15:39):
Like I need to sleep in an ice coffin, like
I just my poor girlfriend, just like even for her,
I'm just like, yeah, this is I sleep at fifty
degrees like Dracula.
Speaker 5 (01:15:50):
It saves marriages and relationships because you could sleep at
different temperatures. I need to be a cozy little a
cozy little gal normal.
Speaker 1 (01:15:58):
Yeah, I need to be a little book bug in
a rug.
Speaker 5 (01:16:00):
I need to be like toasty. And my spouse also
needs to be a corpse so we can set different temperatures.
Speaker 2 (01:16:06):
I need to set to Dracula.
Speaker 5 (01:16:07):
Yeah. And you know, they snore, and it used to
be a thing where every every like day, I would
hear them storing and I'd be like, what fuck up?
So I can sleep? But now the bed actually just
lifts when it detects and snoring on his side, no
on our side. So sometimes I go like, oh it's moving,
and I will fall right back asleep. But you know,
ever since that happened, he doesn't wake me up snoring anymore.
(01:16:28):
That's good, Like it's saved our marriage.
Speaker 1 (01:16:30):
Wow, this is unfortunately you're selling me on this discover But.
Speaker 5 (01:16:35):
Unfortunately it also comes with useless numbers, Like how can.
Speaker 2 (01:16:38):
I useless number? Here's my psychopath thing? Though I have
my numbers for my eight sleep and I have my
numbers for my aura ring and they disagree all the time.
I love it.
Speaker 5 (01:16:46):
Yeah, same is like long term tusting that.
Speaker 2 (01:16:49):
My favorite thing was I posted because I also use
so many which is the thing that electrocutes for Actually
what I love it. It's this thing that electrocute you head.
It sounds insane. Just best not to think too hard.
But like I posted my numbers on Blue Sky once,
which was a mistake, and so it was immediate like
you're not getting enough deep sleep, you should be really concerned.
I'm like, I really had to stop myself being like
(01:17:09):
kill you.
Speaker 5 (01:17:10):
To be clear, the or ring is like they do
a lot of science, they do a lot of research.
They're only about like I want to say, seventy nine
percent correlation between the gold standard of polysomnography. That's what
are you guys dreaming a lot? No, I do have medications.
Speaker 1 (01:17:27):
Dream that'll do.
Speaker 5 (01:17:29):
My new medications are giving me night terrors really bad.
Speaker 2 (01:17:32):
I heard them going.
Speaker 12 (01:17:35):
I was like, you gott he's listening to no, no,
he was listening to this stuff.
Speaker 2 (01:17:40):
But he lied down on this pillow. You know he
has night terrors. Yeah. He strikes me to kind of
just wake up, like, yeah, I used.
Speaker 12 (01:17:50):
To think when he went to sleep he would just
be in his memory palace.
Speaker 2 (01:17:53):
Yeah, exactly, like regarding it. Yeah, that's the thing that's well,
it's like the basic things for sleep, like is it
light in there or dark in there? Are you the
right temperature? Are you comfortable? And it's like, yeah, but
what if the pillow did all this ship you already?
Speaker 1 (01:18:08):
I asked. I asked the people at the booth.
Speaker 8 (01:18:10):
I was like, but what if I want to lay
in bed and not have the lights out?
Speaker 3 (01:18:14):
Yeah?
Speaker 1 (01:18:15):
And he goes, well, you'd have to put another pillow
on top.
Speaker 2 (01:18:25):
What's great? What's so good about that? It is likely
no one has asked them that. You're the first person
to be like, what if I am in bed but
not sleeping yet? He's like, fuck, how much was this?
Did he give you a price?
Speaker 1 (01:18:40):
I didn't even ask I should have asked.
Speaker 2 (01:18:41):
It's probably, like.
Speaker 5 (01:18:42):
It's probably my estimate is six hundred yeah, five dollars.
It's always stupid.
Speaker 2 (01:18:48):
N you could go on like eBay and get some
like really nice sheets that will like some like Crisp Bokale.
Speaker 5 (01:18:54):
Perhaps I've tested smart pillows in the past, but like
it was a very different smart pillow was testing like
my breathing rate?
Speaker 11 (01:19:00):
Why not?
Speaker 5 (01:19:01):
But it was also like it had a speaker in
there so you could listen to your podcast to fall
asleep in your smart pillow. And I was like, this
is dumb. It's so dumb.
Speaker 12 (01:19:11):
I'm someone who you know, I'm not trying to optimize sleep,
but I do value the dream pertion of it. So
is there really is for someone like me who's just
like I, you know, I'm interested in deeper sleep. Does
it make sense to go on the journey of trying
out the smart attach it or should I just invest in.
Speaker 3 (01:19:30):
I can't.
Speaker 11 (01:19:30):
I don't dream on melatonin?
Speaker 2 (01:19:32):
What about trust?
Speaker 5 (01:19:33):
Usually melatonin will make your dream super intense?
Speaker 2 (01:19:37):
I wake up?
Speaker 1 (01:19:38):
What about magnesium, the one that doesn't make you poop?
Speaker 2 (01:19:41):
I haven't done what is it?
Speaker 3 (01:19:42):
I know?
Speaker 11 (01:19:42):
I haven't I haven't tried a.
Speaker 8 (01:19:43):
Magnesium I think it's magnesium glyconate makes you sleep, and
magnesium sit trate makes you.
Speaker 5 (01:19:48):
Ship relaxes you. Magnesium healthy aine. And then there's also
you could try a GLP because one of the side
effects of that is intensely vivid dream.
Speaker 12 (01:19:58):
Oh yeah, I mean you know, I like I lucid
dream regularly, which is why I'm like, okay, I'll do
anything as accept whatever goes and fringes on that. But
I've never thought too much about doing the tech beyond,
like I'll just invest in really nice sheet, you know,
and just to get myself really comfort.
Speaker 2 (01:20:17):
The sleep measurement stuff sucks, but I really love the keat.
I love the cold. I love the fact control over
the end. I burn half I sleep and I look
and I have like a full scale like blackout eye mask.
I have a giant sheet. I like curl myself up
in a ball. It's probably not great, Like it's probably
not a great sign. And I'm just like cold and
like wrapped in this thing like I've just blacked at,
(01:20:39):
like and I wake up when my alarm goes off
and not a moment earlier. It's fucking great. I mean,
I write constantly I think like, this is the only
reason I can just like endless sleep. But it's again,
the numbers do fucking nothing for me. I wake up
feeling like shit, It's like you didn't sleep enough. Thank you?
Speaker 8 (01:20:54):
Yeah. For me, the numbers make it so much worse
because mostly if I'm having trouble sleeping, then a huge
proportion of my brain is fixated on I am not
sleeping and that is bad. And so then to have
when they're like, this is going to help you sleep better,
I'm like, no, no, no, it's gonna remind me how badly
I'm sleeping, and that's going to make me sleep so
much worse because every night when I lay down, it's
(01:21:16):
gonna be like, well, tonight has to be the night
that it is good now.
Speaker 2 (01:21:20):
And get you loved ones get a text saying she's
not sleeping.
Speaker 8 (01:21:24):
Chloe has been detected lying down, but has not been
detected unconscious yet.
Speaker 2 (01:21:30):
Chloe has violated the rules of the Billows Sleep bought that.
Speaker 5 (01:21:34):
That matric is called sleep latency. It sucks.
Speaker 2 (01:21:38):
Yeah, what's the what how long it takes you to.
Speaker 5 (01:21:40):
Sleep, how long until you try to go to sleep,
and how when you actually fall asleep it's called sleep
latencyy and you can kind of measure certain things by that.
It's it's bullshit occasionally bullshit, but it's bullshit.
Speaker 2 (01:21:52):
I occasionally really like I'll get like a two minute
one of the fuck.
Speaker 5 (01:21:56):
Yeah, well, like all it does. All it does is
if you have a sleep and see under fifteen minutes,
it just means you're really fucking tired.
Speaker 1 (01:22:03):
And that's bad. Interesting. Yeah, wait, you mean like healthy
people have.
Speaker 5 (01:22:07):
About fifteen minutes to fall asleep.
Speaker 1 (01:22:09):
Interesting, that's like a normal Take that boyfriend.
Speaker 5 (01:22:13):
Yeah, like you're falling.
Speaker 1 (01:22:14):
He falls asleep in forty five seconds. It's like if
and then highlight there he helps angry.
Speaker 8 (01:22:20):
Listening to him, get this peaceful rest that I should
be enjoying.
Speaker 5 (01:22:24):
I'm like, you fall asleep too quickly.
Speaker 2 (01:22:28):
It could be one hour yesterday, Jesus Christ. Yeah, okay,
seventeen six minutes, two days ago, thirteen sixteen and seven,
seven sixteen, twenty eight, eight thirty three, one fucking fucking
lost Friday, one minute, just fucking I think I was
(01:22:49):
like in a moment with my girlfriend, I'm very happy.
So that's lovely. Under that layer, all you.
Speaker 5 (01:22:55):
Need is like a baseline and like the only point
of these numbers is to build a long term base,
and when you're off the baseline, you go like, oh, okay,
maybe happened to me.
Speaker 8 (01:23:05):
It's yet again to go back to sort of the
theme of earlier, don't you know when you're off the
baseline by I'm having trouble sleeping lately?
Speaker 2 (01:23:14):
Yeah, I'm having trouble to sleep. Like.
Speaker 5 (01:23:17):
The whole point of this is that they're trying to
do something like illness prediction, and there is science that
they can do that. Like during COVID they.
Speaker 2 (01:23:24):
Were yeah, or is not great of that?
Speaker 5 (01:23:27):
It actually is.
Speaker 2 (01:23:28):
It hasn't been great for me.
Speaker 5 (01:23:29):
I've gotten it where like the symptom radar is just
like you're getting sick and then I would get sick,
so like, but you know, it's not that helpful because
it's like a forty eight hour button.
Speaker 8 (01:23:37):
Though again I know I'm like, I know my throat
has a tiny tickle, I'm about to get sick.
Speaker 5 (01:23:43):
Like all of this stuff came from like when they
were doing COVID research because it's asymptomatics. Okay, get like
that sort of like, but.
Speaker 2 (01:23:50):
This is the thing with all this wellness shit. It's
like well, you can't be mad at us that it
isn't really useful, because once it.
Speaker 5 (01:23:56):
Was snake oil. That's my beat, snake.
Speaker 2 (01:23:59):
Oil, snake oil cussed feet.
Speaker 5 (01:24:02):
Yeah, I just I just sit here and go like, no,
that's not true, No that's not true. Here's the nuance.
And everyone yells at me and they're like you're a
big farmer shell and I'm like, no, I'm not.
Speaker 2 (01:24:11):
What the fuck does that? I mean, what do you
mean a big farma shoe?
Speaker 5 (01:24:14):
Oh. I wrote a story recently about the wellness wild West.
It's kind of like a storyline that I read about
a lot, and I was talking about how people shouldn't
be injecting unapproved drugs. Ideally, even if it.
Speaker 2 (01:24:25):
Peptides, yes, Chinese peptide.
Speaker 5 (01:24:28):
Don't don't do that. Don't take GOLP three. It's not
actually a GOLP three because the only people with access
to the proprietary components to that is Eli Lilly. So
whatever you're getting from China is almost one not a
gil and.
Speaker 2 (01:24:41):
You're not seeing GOP three is a problem. Just getting
it from an unlicensed getting it from.
Speaker 5 (01:24:45):
An unlicensed Chinese pharmacist is not a great idea. Bub
reconstituting it yourself. You are not a pharmacist. I don't
see any of you on TikTok using fucking sterile ingredients
or sterile instruments or whatnot. Like I said that and
they were like, fucking big sh pharma's shill. I can't
believe me saying don't use unapproved drugs. Is me being
(01:25:06):
a big farmer.
Speaker 2 (01:25:07):
Show I'm this week, I'm using the new viral GLP.
Speaker 5 (01:25:11):
From REDDA and Rada tu weey because there, you know
the the videos. It was like, oh my god, here's
week one on read. I'm like, first of all, you're
not on REDDA, You're on some other fucking co The
The thing that they're calling GLP three, which is also
like not the correct way to call it, is redd
a true tide.
Speaker 2 (01:25:29):
I feel like if you describe this like a hunt,
like I went to a mysterious man who handed me
some pills, you'd be like, yeah, that sounds bad, you
should go to the doctor. But this is like, well,
not because I got it from the internet.
Speaker 1 (01:25:39):
Yeah, I just was reading an article about peptides. Because
not because I was like would those make my life better?
Speaker 8 (01:25:47):
Not certainly, not that Not because I was thinking about
trying them, and I read an article and there was
somebody was quoted. Somebody was just saying, like, they're all
over Silicon Valley and uh, like this crew, you know,
Crypto loves them, AI loves them whatever whatever else, this
other thing loves them. And whoever was quoted said, the
(01:26:08):
only group that's a little more skeptical is uh. Anybody
who works in biotech is a little bit more skeptical.
But they're just so in the pocket of the FDA,
or they're so they're so uh they like bow down
to the FDA. And I was like, well, the other
interpretation is they this is the world that they work in,
and they're steeped in way, way, way more information than
(01:26:30):
somebody who works in crypto is about peptides specifically, So
maybe their skepticism is like healthy and ractional.
Speaker 5 (01:26:36):
It's partly like little like the FDA doesn't want you
to have that. And I get where that's coming from,
because you know, healthcare sucks in our country, it's expensive.
These things could be helpful, but you're getting blocked from it.
They are trying to, you know, make sure that they're
making the most amount of money of it. Those are
all true things, But the FDA exists. We have these
regulations because people fucking died. People died before, Like this
(01:27:01):
is how people die or get really adverse effects because
I don't know, but doctor matt On TikTok, who is
most certainly not an actual doctor, telling you how to
dose gray market Chinese peptides from like things you bought
from Amazon. I'm sorry when you're in the comments and
you're like, oh, yeah, I was vomiting uncontrollably.
Speaker 1 (01:27:23):
Oh I wonder why.
Speaker 2 (01:27:24):
I'm just doing weight loss math math.
Speaker 12 (01:27:28):
I had such a distant sense of the GLP once
because I've been learning about them from the way in
which they're displacing insulin production in a lot of places,
and I hadn't had a good sense of the extent
to which I was like, oh, why are they calling
them Chinese peptides so incessantly? And that piece was interesting,
(01:27:51):
and that time piece was interested in laying out one
Silicon Valley's obsession with them and also making click into
sense that I feel like we're seeing others subcultures be like,
well like, yeah, of course you got to do it yourself.
Of course you got a looks Max. Of course you
gotta yeah, you know, like uh optimization.
Speaker 11 (01:28:10):
Yes exactly.
Speaker 2 (01:28:10):
I just want to be abundantly clear about something. By
the way, we're not saying g o P one is
a bad thing, like it's totally fine. I just want
to be really abundantly to developed. I mean.
Speaker 12 (01:28:19):
And one of the use cases is for like, you know,
if you have, for example, diabetes. You know that, you
know that's that's one of the uh the proven use cases.
And yeah, gives it to you.
Speaker 11 (01:28:32):
Yeah, as long as as you're not doing it in
your bathtub.
Speaker 2 (01:28:37):
PhD in philosophy teaching you that.
Speaker 12 (01:28:40):
It's been so fascinating to see the extent to which
people are like throwing a lot of uh reason out
of the door just to be like, no, this is fine,
this is okay. Actually, and you are a big Pharmachelle,
if you are, you.
Speaker 5 (01:28:54):
Know, maybe maybe you know body builders who already are
a very lean taking GOLP so called GOLP three. Maybe
that's not a great idea because they've actually never studied
what this medication does in people with low body fact.
You don't know what.
Speaker 11 (01:29:11):
If you're going to do this, how many times are
we going to do a wave.
Speaker 12 (01:29:14):
Of a drug that does seem to have a really
miraculous impact on people. And then you know the crusading
group of power users maybe you know who are like, hey,
you know, well, if you're not supporting everyone using it
as widely as possible, then you're.
Speaker 1 (01:29:30):
In the pocket of the I mean, side effects are
a lot more. Yes, effects are real.
Speaker 5 (01:29:35):
We may not actually want to have those side effects.
And like, these drugs are so new, we don't know
what the long term effect is. And it's not bad
to say, like, you know, you're you're always doing a
weigh a pro and con like I'm willing to do
the side effect because the alternative is worse, and like,
those are conversations you need to have with a real
fucking doctor. Oh, not a comment section, not some billy
(01:29:58):
bob doctor boo on TikTok. It seems it's.
Speaker 1 (01:30:05):
Times.
Speaker 8 (01:30:05):
The word A thought that I was having on the
plane here is it feels like society as a whole,
and particularly like the this this sections of society that
are more online, that are more techified, that are more
you know whatever for sort of forward in that kind
of progression.
Speaker 1 (01:30:25):
Uh have plenty of room for.
Speaker 8 (01:30:27):
Skepticism when it comes to systems of like social support,
but do not have room for skepticism about tech or so.
Speaker 1 (01:30:38):
Yeah, yeah, just.
Speaker 5 (01:30:39):
Like medicine, Like, oh, medicine, vaccines, why would we do
that despite hundreds of years of evidence, Like they can't
possibly be right because coach Matt told me on TikTok
and it seems right.
Speaker 2 (01:30:52):
And I mean I say this from a perspective of
I've been very overweight in my life, like I used
to be like three hundred and fifty pounds a while back,
and it's like I get the desperation. Yeah, I get desperation,
especially when it's like GLP one is like very GP three.
I'm sorry, but it's like very expensive. You have to
go to a doctor and all that, and just being
like maybe I could do this cheaper, maybe my doctor
won't give it to me, maybe whatever, And I get
(01:31:14):
that fucking desperation. You'll try it, Like when I've had
weight problems, you're fucking scratching at the walls, trying to
find some cheat code around the actual hard thing of
changing lifestyle, which I should be clear, it's very fucking difficult.
I have sympathy with everyone. I've been through it multiple
cycles in my life I get it, but it's like
to your point, Victoria, it's like, yeah, you're gonna get
side effects. I'd rather roll the dice on the doctor
(01:31:36):
who's using the thing that they know the side effects,
they've done trials and such, versus the doctor.
Speaker 5 (01:31:41):
It's hard because you go to doctors and doctors don't
listen to you. I have bekase, I have metabolic conditions.
I'm on a GLP actually for it for like peacos
and fatty liver, And getting to that point was over
a ten year long journey.
Speaker 2 (01:31:55):
Yeah, but these.
Speaker 5 (01:31:56):
Are conversations I was having and I had to fire
like five doctors because one was like listen to the
Huberman lab no, or like maybe you should be maybe
you should be a vegan. Fuck off. Not not to
like say anything with vegans, but like that's not the
solution I was looking for, or just being like, well,
you know picos causes weight gain. Also the solution is
(01:32:17):
weight loss, but also your body will not let you
lose the weight, so just lose some weight. And like
I understand that desperation, like from a very personal level,
but at the same time, you have to believe in science.
You have to believe in like the process exists because
people have died because side effects are also debilitating to
quality of life. The worst part about being on a
(01:32:39):
GLP for me is that I like food. I don't
have a lot of food noise. I actually genuinely don't
have food noise. My liver just wants to fuck me over,
and so like I can't eat. It's so it takes
the joy out of eating for me. I don't enjoy
it at all. I'm sorry, yeah no, And like I'm tired.
(01:32:59):
I get fucking clown nightmares. I literally had a nightmare
that my spouse's X came running after me and I
was like, oh, this ho and then she turned into
Penny Wise, the fucking clown, the Bill Scar's Guard version.
I woke up screaming like.
Speaker 2 (01:33:14):
This is the side effect I have, clown.
Speaker 5 (01:33:16):
Yeah, no, that's a side effect I have. This is
a thing, you know, and you have to wait those
pros and cons for health. But you know, in the
pursuit of optimization, and that's rampant here at CEES, people
are like, yeah, numbers that don't matter, do this, do that,
and it's like, this is why I have mental health
problems doing my job.
Speaker 2 (01:33:35):
But now you have numbers to help.
Speaker 1 (01:33:37):
Yeah, yeah, now you have so many numbers.
Speaker 8 (01:33:38):
I want to I mean, I want to go back
to one thing that you said, which is you said
you have to believe in science.
Speaker 1 (01:33:44):
Yeah, because people have died, we've done this whatever.
Speaker 8 (01:33:47):
I think that there's just a huge swath of the population,
which is heavily represented here, of people who would just say, no,
you don't have to believe in science.
Speaker 2 (01:33:57):
It is lies.
Speaker 1 (01:33:59):
It is you know, it is bunk, it is biased,
it is whatever.
Speaker 8 (01:34:03):
And that fracture is where I'm like, oh no, we've
not just about science particularly, but sort of like the
like communal agreement on some like really foundational social pillars.
Speaker 2 (01:34:15):
Consensus reality.
Speaker 8 (01:34:16):
Yeah, consensus reality is is like seems to be crumbling.
Speaker 2 (01:34:19):
And it doesn't help that the leader of the FDA
sounds like a fucking cyberman. Oh you know yet. Yeah, Now,
we can't make fun of him. It's a physical condition.
We can't make fun of him for that. We can
make fun of that, for the fucking worm in his.
Speaker 5 (01:34:32):
Brain, the warm in his brain, and the.
Speaker 2 (01:34:34):
Whale corpse and the bare corpse.
Speaker 5 (01:34:35):
I he keeps me up at night because in June
he was like in four years. I want every American
wearing a wearable and as we quantified people on the
earth wearing three to four wearables continuously twenty four seven,
three sixty five.
Speaker 2 (01:34:49):
No, don't do that, so at that point we can rotate.
Oh guess one last time, Thank you so much for listening.
The upcoming AD I assume is not for weight loss drugs.
If it is, well, I guess.
Speaker 7 (01:35:04):
You know how we feel about that now, but that's
not really my goddamn problem.
Speaker 2 (01:35:20):
Welcome back to the better off line CS experience. We're
just talking about wanking. I guess.
Speaker 1 (01:35:27):
I was more specific.
Speaker 3 (01:35:28):
I was saying, the flashlights are They're just okay, and so.
Speaker 1 (01:35:31):
Why would be interesting take yes, oh.
Speaker 6 (01:35:35):
What because you're oh, because I admit to using a sexat.
Speaker 3 (01:35:42):
A man who.
Speaker 6 (01:35:43):
Uses a sex toy. Women are vibrator vibrator vibr because men.
Speaker 8 (01:35:47):
Are inadequate, because men, and if you say that women
are inadequate, I will take deeper.
Speaker 11 (01:35:55):
The next generation of sex toy.
Speaker 2 (01:35:57):
Welcome. Welcome to a offline. You complained last year we
didn't have enough woman, which was true. Now we've got
one that's just immediately created. The fracture got you for
two more days. As well as gonna be great hair
now this chaos and truthfully having like amazing teams the
core better offline team for the show as well. And
I'm actually like both you and Adam Chloe, you both
(01:36:20):
kind of were like it's like like Adam, you've done
a bit of tech cloth you've not done really and
you're like, oh is this Am I going to be
a right to do this? Because like I haven't done it.
This is the perfect view because I think the really
tech poisoned people are incapable of just being like that
sounds fucking useless. That sounds dumb as ship even the
jack off machine. It's like, you're going to attach this
(01:36:41):
heavy looking looks pretty heavy.
Speaker 3 (01:36:43):
It didn't. It didn't look useful or necessary.
Speaker 6 (01:36:47):
It's a we've already we've already talked about.
Speaker 8 (01:36:50):
It doesn't look useful or necessary.
Speaker 3 (01:37:00):
To describe.
Speaker 6 (01:37:01):
Because we talked about the handy and Handy too in
the previous segment, we didn't describe what it is. It's
basically the it's called the it's it's a joke product.
Speaker 3 (01:37:10):
People laugh.
Speaker 6 (01:37:11):
It's like the inner core of the flesh light, which
is like, okay, a flesh light if you're not familiar.
You know, the sticky hand from the vending machine of
the supermarkets.
Speaker 3 (01:37:19):
Trust that is that material.
Speaker 6 (01:37:22):
That's the material you put lubin in and it's squishy, okay,
and and again just okay. As an experience had I
got it's been in a drawer for years, right and
I got it, uh for free, by the way, because
I used to do a comedy show in a sex
shop in Los Angeles.
Speaker 3 (01:37:38):
Wonderful show at the pleasure chests. But uh, I didn't
pay for it, but so.
Speaker 1 (01:37:43):
Imagine I would have. It was on my list.
Speaker 11 (01:37:47):
I support the industry genuinely, sup.
Speaker 6 (01:37:51):
It's just it's a squishy core of a flesh light
and then it's attached to a velcrow harness which is
attached to like a big piece of metal machinery that
that moves it up and down.
Speaker 3 (01:38:01):
And I'm like, I have hands, Yeah, I just And.
Speaker 1 (01:38:06):
Here's my pitch.
Speaker 8 (01:38:07):
Here's my pitch for an analog version of that is
take one of the actual sticky hands that you actually
stick on the window, and put some lube on that
and put.
Speaker 1 (01:38:15):
Just hold that on the problem of your hand.
Speaker 8 (01:38:18):
And check that out that it's gonna feel like it's
somebody else's hands.
Speaker 3 (01:38:21):
Here's here's what I would like.
Speaker 6 (01:38:22):
I would like a woman to stand across the room.
Speaker 2 (01:38:31):
I must I must also be clear, by the ways.
So I looked up the handy too, and I just
want to be clear. There are two SKUs. And there
is the standard, which is one hour of battery life
save sixty dollars launch price, and it's regularly two hundred
and niney nine dollars. And then there's the five hours
of battery.
Speaker 11 (01:38:47):
Oh, the cooner package.
Speaker 2 (01:38:49):
Yeah, three hundred and seventy nine dollars down four hundred
and ninety nine. And you scroll out and it's like,
we recommend the FoST charger handy, And I just got
to ask how much you? How much you doing that? Yeah? Well,
he purchased the award winning VR compatible Automatic stroker. Think,
but isn't the isn't.
Speaker 3 (01:39:10):
The point of it V? It's VR.
Speaker 2 (01:39:11):
Come on, I mean, you're watching the poeos and all that,
But but isn't the.
Speaker 6 (01:39:16):
Point of the device that it's supposed to be very fast,
supposed to make you come fast?
Speaker 2 (01:39:20):
Really?
Speaker 3 (01:39:21):
I would think so.
Speaker 1 (01:39:22):
They should put it three four hours of battery life.
Speaker 12 (01:39:24):
Yeah, I know, I just they should use like most
of that battery life for a little screen where you
can watch some porn on it while you're gone.
Speaker 2 (01:39:31):
It's just really like people that don't have sex or
jack off. It sounds like it's just like the least
sexless people.
Speaker 8 (01:39:38):
I would love to know that the man who is
horny enough to want to use the jack off machine,
but so dedicated to tech that when he goes to
pick up his jack off machine and the battery is dead,
he goes, I have to put it on my fast
charger now and wait for seven minutes for to give
up battery life to make.
Speaker 2 (01:39:58):
Me come to thing his fo like sonic a hedgehog
sitting on.
Speaker 1 (01:40:02):
His hands, being like, there's no other way to do this?
Speaker 2 (01:40:05):
Does this ever happened to you? It's just I must
be This is a sex positive podcast. It's great, sex wonderful.
It's a great thing between consenting adults and wife fu tubes,
I guess. And it's just very strange to look at
another product on the show is just like an alien
made this, Like what what do you need? You can't
(01:40:28):
jack off? I guess, or you really want to. And
also I just feel like just there's a masturbata that's
a great thing to say. I just feel like picking
this thing up, I would just be consumed with shame immediately.
The only thing would make me more ashamed would be
if it was out of battery. Mm hmm, you're like, fuck,
I've used it. Yeah, just like I'm too into this.
Speaker 6 (01:40:51):
See now, this this is my pet peeve because I
think when you see be consumed with the shame, you're
ashamed of your own sexuality. Yet that's what's happening, is
you as you are a shamed the idea of masturbating
with a device.
Speaker 2 (01:41:01):
Why why Because my hands can do it perfectly fine,
that's I can have.
Speaker 3 (01:41:06):
Why shame?
Speaker 2 (01:41:07):
Why?
Speaker 1 (01:41:07):
Why shame?
Speaker 8 (01:41:08):
Why not?
Speaker 1 (01:41:08):
Why not amusement? If your hands can do it perfectly fine,
Like I just.
Speaker 2 (01:41:11):
Well already we're in this weird world where I wouldn't
buy it.
Speaker 1 (01:41:14):
Yeah, Like so it's just would you feel a shamed
not not a shame?
Speaker 8 (01:41:18):
Would you feel shame if you picked up a flash
light if you were going to use a flash light?
Speaker 1 (01:41:22):
Or is the machine worse? No, I just feel the
machine laughing at you.
Speaker 11 (01:41:26):
I just have never seen this.
Speaker 2 (01:41:28):
We're now going on the show, or just that one's like,
how does that jack off? It's just like the idea
of the unnecessary. To me, I don't like something else.
Speaker 6 (01:41:38):
That's because maybe the male sex toy is not a
jerk off device, because we can do that perfect. The
real male sex toy is a butt plug, and men
won't buy them. That's it's prostate stimulation.
Speaker 11 (01:41:54):
Yeah, that is you know if you like that, that's
fucking cool.
Speaker 2 (01:41:57):
I like, my problem isn't And you know what, maybe
as a guy who just like he's like a sex
guy and he likes all the different kinds, Fine, that
sounds fine. I just think like if I got a
machine that was out of batteries for jacket off and
it was unpowdered, be like, I've been doing this a lot,
haven't I do?
Speaker 12 (01:42:13):
The do the sex toys have big useless numbers too?
Oh god, I wonder if you haven't been coming lately.
Speaker 3 (01:42:22):
You haven't you come?
Speaker 1 (01:42:23):
Wait and see.
Speaker 8 (01:42:27):
The distance of time between when you begin to jack off.
Speaker 2 (01:42:30):
It's like a little prompter you okay, you haven't.
Speaker 1 (01:42:35):
And when you are off the baseline that.
Speaker 2 (01:42:37):
Is when you I'm actually looking out to see if
there's any metrics that come discrete delivery, I.
Speaker 11 (01:42:42):
Mean discreet delivery.
Speaker 10 (01:42:45):
One nice.
Speaker 6 (01:42:45):
What if what if withings made a sex toy and
then it tracked your you know what I mean, it
gave you some stats.
Speaker 3 (01:42:52):
The gooners might like that some jerk off so.
Speaker 2 (01:42:55):
What yeah, like, oh, how about like a special watch for.
Speaker 3 (01:42:59):
Jerking on off that like Matt, you know, they have
like a stat line. It lets you know how efficient
you're being and.
Speaker 1 (01:43:06):
How much time you spend, like you know, how on treadmills.
Speaker 8 (01:43:09):
You can follow the little workouts where it like shows
it it's like it's gonna get faster now, and then
the treadmill goes faster, you have to run faster, and
then it's like now it's slower. If you're if you
had a little like if you had a little gooning
workout line where it could be like go faster now,
and then it's like slow down, so down, sow down,
sow down, and you like follow your little workouts.
Speaker 4 (01:43:27):
I mean.
Speaker 3 (01:43:30):
You're describing joy videos.
Speaker 2 (01:43:35):
I just want to read something.
Speaker 3 (01:43:36):
It's a really good form of video porn.
Speaker 6 (01:43:39):
I think those are like it's very kind of wholesome
because it's sort of like I want someone.
Speaker 11 (01:43:43):
To instruct there's a relations there's.
Speaker 3 (01:43:46):
A relationship about the jerking off.
Speaker 2 (01:43:48):
I just want to be clear. There is a button
on this website to lead you to various videos. It
says explore the handy verse and handy verse. I don't
feel great having read that.
Speaker 8 (01:43:58):
The greatest sin that Mark Zuckerberg has committed on the
phase of this planet.
Speaker 1 (01:44:01):
He's giving us blank verse. We got that done. I'm close.
Speaker 11 (01:44:07):
We got the handy verse with sixty nine.
Speaker 2 (01:44:10):
What actually, oddly, like, what is handy feeling? There's like
a whole thing and I'm like coming around to this,
like this isn't for me, but I'm like, okay, if
someone's a real beat offerer, like this is quite nice.
Speaker 3 (01:44:21):
I don't know.
Speaker 6 (01:44:22):
The reason I feel shame is because I feel like
we're falling for it because this is a gag product.
Speaker 3 (01:44:26):
We're talking about it.
Speaker 2 (01:44:27):
No, this is real. This is just like a real
thing that charge in frint.
Speaker 3 (01:44:30):
It exists, but I don't think it serves it's to
laugh at it. Cees and devices know if I agree.
Speaker 2 (01:44:37):
Like, here's the thing. Woman sex toys very well documented
and used and talked about quite.
Speaker 1 (01:44:42):
You've seen pictures of them, yeah, very.
Speaker 2 (01:44:44):
Also like women discussed them. I feel like, Adam, You're
like one of the only men I've talked to about
sex who's just not immediately Yeah, I've had it like
one million times, but I can't discuss it at all. Yeah,
but it's like I believe that, like, there are men
who jack off in this man, I'm good on them
as long as it's like normal and consensual, and like.
Speaker 3 (01:45:02):
I could be wrong. I just don't believe in this product.
I believe in sex toys for men.
Speaker 6 (01:45:06):
I believe they're underrepresented a category, but I believe where
this product is situated in the hall and the way
that they have designed it.
Speaker 3 (01:45:14):
Yeah, yes, Okay.
Speaker 11 (01:45:22):
The only one I saw was the love Sense one,
which was the vibrators. I was like, okay, I saw
that last year also, so I was like, okay, that's
the one I know.
Speaker 6 (01:45:28):
It's in the Eureka Hall and it's sort of position
it's like meant to make you laugh. And I think
it's a five hundred dollars back big.
Speaker 2 (01:45:36):
But was it a big it was it a big booth?
Speaker 3 (01:45:39):
They had a big booth, then that's a real company.
Speaker 1 (01:45:41):
Because well, look, men called it big, women call it.
Speaker 2 (01:45:45):
Yeah, it's just a regular size nice. Yeah, but that's
the thing. If they have a regular sized booth.
Speaker 1 (01:45:54):
It's a perfectly normal sized booth.
Speaker 2 (01:45:57):
It's a good get biga during the show, if like
that's a real company, because otherwise it's just like, yeah,
we've one hundred thousand dollars in the jack Off booth.
I do, I do? Let the idea has spoken of,
I can have a goat.
Speaker 1 (01:46:12):
Hey, come on over to the jack Off booth.
Speaker 11 (01:46:14):
Come in, Uh, everyway, both anyway in the booth for you.
Speaker 2 (01:46:19):
We're going to segue away from this, right.
Speaker 8 (01:46:21):
Here's a question that is sort of off of this theme.
As a Cees virgin.
Speaker 1 (01:46:25):
Right not to really lean into this kind of language.
Speaker 8 (01:46:29):
Uh, how many of these products, Like, there are a
lot of products that seemed absolutely fucking useless. Yes, And
what I don't know is how many of the people
who are here earnestly believe that what they are hawking
is going to make the world better in some way,
and how many people like what what is the percentage
(01:46:51):
split between earnest believers?
Speaker 1 (01:46:53):
Uh, people who are here for.
Speaker 8 (01:46:54):
A paycheck because they need a job and this happens
to be the job they're doing, and people who are
totally disillusioned but are still doing it for the paycheck,
but like, are are totally jaded?
Speaker 1 (01:47:04):
I would be curious.
Speaker 2 (01:47:05):
I think what it is is like you get a job.
It's hard getting a job, and you get the product,
and it's like, well, I gotta fucking believe in something.
So you learn the talking points and you're like the
pillow thing you were talking about last piece, It's like
that person clearly had never had a conversation with someone
who's like, yeah, but what if I want to lie
in bed without falling asleep? Because they've memorize the talking
points and they've internalize them. I'm sure that there are
(01:47:26):
a percentage of cynical people, but I think a lot
of these people have just like not thought about it
too much. They're like, yeah, the air purifier with the
spike on it, the dog scratches itself on. That makes
perfect sense because I've been forced to believe it because
this is how I make money. And I think that
there's something kind of sad about it because you have
to believe. I mean like there's a certain degree at
(01:47:46):
times when you get a product. I don't do consumer electronics,
but like in the past where I've had to be
like yay, even go, it's like it's gonna pay mortgage. Like,
I think that that's a degree of that. And I
think some people just fucking stupid. I think some people
are just like, yeah, they said they isn't it cool
it can do this? And they focus on how cool
it is. And then when they think two sex and
(01:48:08):
they go, oh, fuck, this is one alien would use.
This is like it's just like stall Trek accept nothing
like it.
Speaker 3 (01:48:13):
I think.
Speaker 8 (01:48:14):
So.
Speaker 3 (01:48:15):
I also think a lot of them are just happy
to like do the job.
Speaker 2 (01:48:18):
Yeah.
Speaker 6 (01:48:18):
You know, like when I went to the ORB thing,
my ORB debacle, all the people I told, well, whatever,
people know about it or they don't. But when I
talk to people who work for this horrible product, like
some of them seemed kind of enthusiastics. Some of them
sort of had thousand yard stairs yeah, where where I
was talking and going like, ah, they're making like three
(01:48:38):
hundred grand a year probably yeah, working for like a
sam Altman company doing interesting work. Like they're like they
wanted to be an engineer. They're a software engineer. They're
working on something that's tech technically interesting. They would like
for it to work. They sort of have to drink
the kool at exactly in the same way that someone
who's like working on a shitty movie.
Speaker 3 (01:48:58):
Is like I'm living the drink. All the movie kind
of sucks, but I'm living the dream.
Speaker 2 (01:49:01):
But you're gonna be good.
Speaker 3 (01:49:02):
It's gonna be good. I'm making movies like it's this. Yeah,
that makes suspension of disbelief.
Speaker 2 (01:49:07):
But I respect people of movies way more. Like, just
do want to be clear, Like learning about how movies
are made. Recently, it's insane a single movie has ever
been made. Yeah, I do not understand how any movie
is made.
Speaker 3 (01:49:19):
Well they're not made anymore.
Speaker 2 (01:49:20):
Well, yeah, that's that's also the other thing. No, they're
either one hundred million dollars or one million dollars.
Speaker 6 (01:49:26):
The movies when you watch movies now, like I you know,
I was watching like Rob Ryner movies because he passed away,
of course, and watched like Harry Met Salary another movie
like that. It's like walking through like the ruins of
like a Roman Colisseum in the Dark Age is like
look at what humans.
Speaker 2 (01:49:41):
Used to be able to do, Like it's and then
you watch Stranger Things and there's a big crab and
win on a Ryder punches it to death. You just
I don't watch that show strange enough already and it's
just sad. But I think I think movies are gonna be. Okay,
I'm choosing to be optimistic for no reason.
Speaker 8 (01:49:59):
Yeah, yeah, I mean that's a sort of a whole
separate yet whole other conversation. But I do think that
there's a lot of people who would say, like the
same people who are who are who.
Speaker 1 (01:50:11):
See movies as sort of the dinosaurs, the things of
the past.
Speaker 8 (01:50:14):
Are the people who see a lot of shit at CST,
who sort of willingly drink the kool aid of a
lot of the shit at CEES and are like, this
is the stuff of the future without actually even being
paid three hundred.
Speaker 2 (01:50:25):
Thousand dollars by the company because they want to be first.
Speaker 8 (01:50:28):
They want to, but they sort of have gotten on
the roller coaster of they're addicted to the high of
u the possibility of success, the newness.
Speaker 2 (01:50:38):
Yeah.
Speaker 6 (01:50:38):
I mean, I remember when it was over ten years
ago now, but I was working at College Humor and
stuff like Google Glass would come out and.
Speaker 3 (01:50:45):
My coworkers would be like, it's it's gonna be.
Speaker 6 (01:50:47):
The future and just based on the premise alone, and
I have to be like I don't think so, yeah's right,
same thing happened with the VR.
Speaker 8 (01:50:54):
You know you wait, Adam, Google Glass is coming, ray
Ban Meta glasses.
Speaker 6 (01:51:00):
People were so they had the script in them for
I think, you know, and they want it to be true.
They just want something new that's gonna be fun.
Speaker 12 (01:51:07):
I also, I'd be curious what you think, because you
know you've been coming to see us for a while.
How much of it is Also people at a firm
that tried to sell something to another business and it failed,
and now they're doing consumer because I feel like I
would talk to some people, especially in Eureka Park, and
they wouldn't say it, but you could hear in the
conversation that there was a much more ambitious plan and
(01:51:29):
that fell through, and now they're winning to something else
that a consumer might be able to use.
Speaker 2 (01:51:35):
It's the raw economy, baby, It's growth of all costs.
Because if you solve a small, boring problem for a
million people, that fucking rocks, is that gonna be worth
ten billion dollars new you're gonna be able to take
that public? No, you can sell it to enough company. No,
But if you say I'm gonna solve fucking everything, forever
you get hundreds of millions of dollars because you've just
over promised and being part of this newness futuristic thing
(01:51:58):
and vaguely attaching yourself to it. If you're wrong, you
can say, oh, it is experimental. If you're right, you
can say I was fissed.
Speaker 8 (01:52:04):
Yeah. But and that's sort of to what Adam was
saying about, you know, wanting it's my language now, like
wanting the possibility of success addicted to the high of
that possibility. But everyone you were saying, I think you're
saying like the U when you're saying that they had
a more ambitious plan. Do you mean they had a
more ambitious plan for what the tech itself could be
capable of or do you mean like they've had They've
(01:52:26):
figured out a tech that is capable of whatever it is,
and the ambitious plan was to use it in a
much broader use case or a much higher scale use case.
And now they've sort of been like shunted down to
a bof a ces where they're trying to make somebody
pay six hundred dollars.
Speaker 12 (01:52:40):
It feels like they thought the idea was worth ten
x what it actually is. An example, I mean, I
saw it a lot more last year where it would
be people talking about tech that would be that's helpful
assistive tech, but their initial formulation of it seemed to
be we are going to apply this at scale and
workplaces to augment your.
Speaker 2 (01:53:01):
Productive for like standing up the thing for opening the
cash rep.
Speaker 12 (01:53:06):
Right, you know, something like that, where it's like not
even really clear that would be adopted at an individual scale.
Speaker 2 (01:53:13):
Well, so that one really pissed me off because it's like,
you don't give a fuck about anyone, you don't love
them in the eye, No, you were excited about touching
a screen towards it treats you little.
Speaker 5 (01:53:22):
You know.
Speaker 12 (01:53:22):
It feels like it's like we wanted to sell this
ad scale to like McDonald's at a bunch of other
fast food restaurants, and that didn't work. So now we
are cobbling together some of the IP to figure out
another approach that might be usable in smaller UH settings.
And I feel like, you know, when I talk with people,
especially in some of the assistive tech, you know, where
(01:53:44):
they're not really talking about it in the way of
we're helping people who are disabled or helping people who
need mobility it's it's it's it's almost adjacent to or
sounding just like how we talk about convenience.
Speaker 2 (01:53:56):
Yeah, you know, did you see anything today that was
of note?
Speaker 12 (01:53:59):
Like I mean, most of the stuff that I saw
today in Eureka Park wash more misuse of agentic language
and saying oh no, let's you know what if you
had a chat bought that you know, managed your home
Google calendar and told you when to pick up your
kids because you forgot, or you know, when we got
(01:54:22):
because you don't give us ship. You know what, if
you have something, I would look at your ship and
tell you when you're sick or not sick.
Speaker 1 (01:54:30):
You mean literally your poop?
Speaker 11 (01:54:31):
Yeah, yeah, yeah, yeah no literally, yeah, you poop.
Speaker 2 (01:54:34):
That was one thing that Victoria. We didn't get to
the piss poop. We go to the com sadly, but
it was like there's a many different toilet measures. Now.
Speaker 3 (01:54:42):
Yeah, oh I saw a good product, yes, and I
would describe it. I walked I saw this product, I
looked at I immediately recognized what it was. I walked
up to them.
Speaker 6 (01:54:50):
I said, fit Bit for your cat, and they said,
that's what you know, fit Bit for your cat. You're
going to sell that, and it's like a little food dish,
food dish and a bubbler for your and then it like,
I don't know, has some stats about your cat, and
it said like here's how much food is left, here's
the temperature of your cat, and then it just said healthy.
Speaker 3 (01:55:08):
Yes, and that's it's like, really, all the cat owners what.
Speaker 6 (01:55:12):
It's like, Well, could it could the box say my
cat's healthy and it'll relieve my anxiety and.
Speaker 2 (01:55:17):
Your cat owner, yes, I have, I don't know, like
and it's the only measurement is like how beautiful they
are like that, which I remind them of constantly. I
just how holt my cat is is very funny.
Speaker 11 (01:55:28):
Yeah, I got two cats.
Speaker 12 (01:55:29):
And it's like when I learned I got a I
got one of the letter boxes, those automatic ones. When
I learned there was an app, I asked them, can
I do.
Speaker 9 (01:55:36):
I need app? Can I delete that?
Speaker 11 (01:55:37):
They're like yes, And I was like, okay, I will
never look at this ship again. Yeah, And it works out.
Speaker 12 (01:55:40):
You know that anything that even wears other or has
a little motor other than.
Speaker 9 (01:55:45):
The litter box, they destroy. They swat it, they throw.
Speaker 12 (01:55:49):
It down the stairs, they push it, they literally catch it.
To the stairs and throw it down, so I can't
get anything smart for them.
Speaker 2 (01:55:55):
I get a fit bit for a dog, though, because
you like woke them. I would love to, actually, just
for my sick pleasure, know how fast Barbo runs because
him and his sister Pokey too bad. But no, they
just they will be sleeping all day and then just
run into bam, just fucking like across the entire house,
gone back.
Speaker 1 (01:56:14):
They're trying to get their steps in exactly like just.
Speaker 2 (01:56:17):
Cats like don't experience the same things that human being, Like,
they're not seeing that being, Like do I run around
a lot? They get the instinct like I need to
fucking run so fast in a straight line, fast than
I've ever But it's not because they're like I need
to be healthy. They just fucking cat Stop trying to
optimize my cat. Bro, My cats. Cats are not optimal
other than how beautiful and fancy they are and how
(01:56:39):
beautiful I miss them, anything else, like it's just this.
You're gonna be here for another two days. You're gonna
see more stuff like this.
Speaker 12 (01:56:49):
But I'm general I saw something we all need and
present ads inside of AI. How to put ads inside
of chatbots, how to put ads inside of any sort
of interface or product I might use the genitor of AI.
Speaker 2 (01:57:04):
It's so cool they haven't worked that out, I know.
Speaker 3 (01:57:06):
Yeah.
Speaker 2 (01:57:06):
That's also like the most obvious thing that they need
it the way that they would monetize this fucking thing. Yeah,
the one way to make the one way that every
per I get three emails a week from someone being like,
have you thought that they could put ads in chat jept? Yes, motherfucker,
I've fucking thought. Every single time you've emailed me about this.
(01:57:27):
I've thought about it.
Speaker 1 (01:57:28):
Do we know why they haven't yet?
Speaker 2 (01:57:30):
Because it's very hard to guarantee. The whole thing with
ads is you need to have applicability and reliability and accountability.
You cannot have your you can't have like an ad
for a children's thing by someone looking up sex and
vice versa. And the best example we have that this
doesn't really work is Perplexity. Their ads chief left a
few months ago and they made twenty thousand dollars selling
(01:57:51):
ads in twenty twenty four.
Speaker 6 (01:57:53):
Plex They Yeah, I mean the needs of advertisers like
determine everything.
Speaker 3 (01:57:57):
Yeah, Like you know, why does YouTube wor the way
it work? Ninety percent of.
Speaker 6 (01:58:01):
YouTube is to please advertisers, so to make sure that
you know, age verification, like content warnings, like all of
it is just for them and like they'll sort of
run the show. But also isn't part of the problem
that like advertising is not enough money for you know,
the massive amount of investments these companies are taking on, like.
Speaker 3 (01:58:21):
You put ads and chatgy, but you're saying enough, yeah, yeah,
it won't sustain trillions.
Speaker 2 (01:58:27):
Well, the other problem they have is the information and
a good story about this where it's like nine hundred
million weekly active US is ish even though Opening Eye
double counts people all the time. They admit this and
they're on academic papers, but nevertheless only one hundred million
if those are in America, which is a large number
whatever whatever, But that means like they're mostly overseas who
are lower value advertising plans, because there's a very big
(01:58:47):
difference between to advertising someone in America versus in India
or Indonesia where they release chat gpt go, which is
the cheaper one, which will inevitably be advertising support it.
And it's just like I do love that there are
like different boots of people being like what if we
could solve the problem the open Ai, Google, Perplexity, and
multiple other companies just haven't been able to touch. It
(01:59:09):
feels like watching the end of the world in a
very boring way at times.
Speaker 8 (01:59:13):
Well, so that's what brings me back to my that
brings me back to my question about how earnestly do
these people believe in what they're pushing. Like are there
people who are like, yeah, yeah, yeah, we are the
we have the secret key, we are the geniuses. And
sure would I be hired at Google if I was
really a genius. Maybe, But that's not what we're talking about, right.
Speaker 6 (01:59:35):
I think that they at the very least considered themselves
to have be the holder of a lottery ticket. Yeah,
maybe it'll work. And if it works, I'm a billionaire.
And if I can convince someone it kind of works,
maybe I get Aqua hired by Google, and then I
make half a million dollars yeah, you know, and like,
and I only have to get to like, you know,
next month to make that happen.
Speaker 3 (01:59:57):
I just got to convince some dopeoo's here right now.
Speaker 2 (02:00:00):
I believe all the TV people believe it fully, and
I get it. And I'm a big dumb ape. I
see a giant scream like.
Speaker 6 (02:00:05):
Yeah, baby, oh yeah, I want to see the TVs
at this so goo and I got mad because I
saw I think I was looking at the Vergis coverage
or something, and there's a new TV technology that's better
than I MINI bought the best. I bought the I
bought the you know the one, the one really good
Sony one. I got the really good Sony. I'm like,
I don't want a fucking better TV. This is as
(02:00:27):
good as TV's need to be. Seventy seven inches, it's
true black.
Speaker 2 (02:00:30):
Stop.
Speaker 1 (02:00:31):
I don't know.
Speaker 2 (02:00:31):
Just to be clear, you'll you'll find they haven't really
made better than that.
Speaker 3 (02:00:35):
Like I know it'll be in ten years. I'm still
just like, stop it.
Speaker 1 (02:00:40):
I don't know if you're gonna have kids.
Speaker 8 (02:00:42):
But I've heard a couple of dad sentences, both in
this TV discussion and earlier when you were like, now
I saw a good product.
Speaker 6 (02:00:50):
I'm oh yeah, but I was able to buy this
TV because I'm not as price sensitive as a real dad,
because I don't have the kids.
Speaker 3 (02:00:56):
That's why I got to get the get the get
the good one.
Speaker 2 (02:00:59):
And it's like, as a child have a that's the
normal way to say it, Like big screen, big Screen's fine.
It's like, you got a big screen, that's the coolest
thing to shoot, big screen, giant screen, to play Minecraft
on your fucking best father in the world.
Speaker 12 (02:01:11):
Two so you can both play it in separate rooms,
so you don't have to talk to me exactly what
they want.
Speaker 2 (02:01:18):
One of the most cherish things is watching my son
play Minecraft explaining shiit to me. But we're not talking
about a normal thing. We're talking about what the consumer
wants separate themselves.
Speaker 8 (02:01:28):
We're talking about what that guy who did the tweet
about how he gets boiling mad if he spends more
than ten minutes with children.
Speaker 1 (02:01:36):
Did you see this?
Speaker 8 (02:01:36):
There's some dad was like, am I normal? If I
spend more than ten minutes? Was it day with my children?
Speaker 2 (02:01:43):
Day?
Speaker 1 (02:01:44):
My blood boils? That's normal?
Speaker 2 (02:01:46):
Right?
Speaker 8 (02:01:46):
But everybody was like, hey, buddy, that's not totally normal.
But like, we totally get it. Being a dad is hard.
Speaker 2 (02:01:52):
You're a balance, you know, for that is, but not
like ten minutes should not because you can you have
a lot more of those in the day with the
child than that.
Speaker 1 (02:02:02):
Yeah, I mean, it seems like this guy doesn't have to.
Speaker 6 (02:02:04):
But it reminds me of when when you know, the
COVID lockdown first happened, and all the parents and the
ones on the internet were suddenly like.
Speaker 3 (02:02:13):
Someone needs to help me from being around my kid.
I thought you liked having the kids.
Speaker 6 (02:02:20):
Now you're mad that you got to be around the kids.
Why do you Why do you have the kid if
you don't want to be around them. And it also
makes you realize that like, oh yeah, the main purpose
of school was not education. It was just childcare so
people could go to work. That's why they're complaining. They're like,
I need to go to work.
Speaker 3 (02:02:36):
Well, that's You've just revealed a fucked up thing about
school and parenting all at once. I actually think that's
the lens. I'm going to go through everything tomorrow.
Speaker 2 (02:02:43):
I'm just going to look around the exposent to being like, okay,
what's the shoot where it's just people like I need
to get away from my obligations, like I made these
fucking decisions throughout my life. I got a fucking fast,
unhealthy bitch of a cat I got.
Speaker 1 (02:02:57):
We were all waiting on what that last word was
gonna be.
Speaker 3 (02:03:03):
There some options.
Speaker 1 (02:03:07):
It's like cat okay. He's saying cat okay.
Speaker 2 (02:03:10):
So I wasn't referring to any of my perfect fucking angels,
but it's like these people were just like, Okay, I
need to find it if my cat's healthy, because I
don't want to look at it or use it. Can
I keep my child in the separate room? Can I
give it an action figure that can go a.
Speaker 11 (02:03:25):
Robot and an action figure that will talk to each other.
Speaker 2 (02:03:28):
And and also if you want any of these things,
don't worry. They don't fucking work. They don't even do
the thing that we're promising you. But it's funny. Just
imagine this guy is walking and being like, please get
me away from all this. Can this is the robot
gonna be Oh, the robot's gonna fall over. It's falling over.
It's gonna kick me in the nuts. Do you see
that video the guy just getting fucking hoofed in the
(02:03:49):
nuts by That was great. If you've got a bootho
guy's being kicked in the balls, I don't care what
you're selling me, fucking Lockheed mouth.
Speaker 6 (02:03:59):
I mean, so that made me kind of sad. Like
there was one that was I don't want to say
the name of it because it did make me sad,
but it was a little It was like a little toy,
like a little Hello Kitty type toy, and.
Speaker 3 (02:04:10):
It was, you know, you could talk to it.
Speaker 1 (02:04:12):
Was it the one that was like the future of
robot Friend.
Speaker 3 (02:04:15):
Something like that. It was a funny.
Speaker 6 (02:04:21):
It was also like it'll watch everything that you do
and remember it. And I was like all right, and
I and I walked up and I was like, oh right,
what was it doing here?
Speaker 2 (02:04:29):
It is?
Speaker 3 (02:04:29):
I was like, oh, can I talk to it? And
she was like no, you know, the Wi Fi is
in here.
Speaker 6 (02:04:34):
In here is bad and it uses chat GPT, so
it can't connect to chat GPT right now. And I'm like,
this is just some somebody put like a couple hundred
grand of their own money into like a little doll
that connects to the Chat GPT API and is here
thinking that's anything, like like, why would we want such
a thing? Why would anybody here be excited by just
a front end to chat GPT a product that already exists.
Speaker 2 (02:04:57):
But that's the thing I think. I feel like, as
we're rapids a good way to put it. It's like
there are two there are two real cons with AI.
There's the con of the consumer being told it will
do this for you, that for you'll be a personal assistant.
And then there's for companies that are like, yeah, if
we put chat GPT in this, it'll be smart. No, motherfucker,
he's not gonna do that at all. And it's like
it's you don't want to name the company, which is fine,
(02:05:18):
but it's like there are so many that were there
last year as well. It's like, oh, it's an agent.
It's like it's just a fucking chat bot and you've
poorly trained it and it's like gonna tell you that
Taiwan is part of China or like BDSM shit like,
and it's just quite sad. It's like everyone is being
conned by these companies. Yeah, both the consumer and the
companies themselves. I can't wait to catalog them. I'm looking
(02:05:40):
forward to watching the sadness. But it's just it's it
is quite sad. I'm so glad I'm here a week.
This is great, my dark my joker, Like it's a shrug.
Speaker 3 (02:05:50):
It's a shrug economy. Like everyone just sort of hoping.
Speaker 6 (02:05:53):
Like I was talking to people about the Disney open
a ideal because that's like a bunch of people in Hollywood. Yeah,
like freaked out about it, especially writers who are worried
about IP and stuff like that.
Speaker 3 (02:06:04):
But like my take on if you look at the
deal is nobody even knows why they're doing it.
Speaker 6 (02:06:08):
Yeah, Like open Ai has the idea that, like, we
need licensed characters for Sora, and Disney just doesn't want
to be left out. Yeah, but like they think, they
appear to think maybe it's marketing for Disney. Plus open
Ai has some idea maybe people will want to create
scenes of Deadpool hanging out with Elsa or whatever.
Speaker 3 (02:06:28):
How does that result in money? Question mark? Nobody knows.
Speaker 6 (02:06:32):
Yeah, it loses money, But like Disney is just like
not getting left out of a thing. That's maybe the future.
Open Ai is making a deal that makes their business
look better to whoever the fuck is gonna give them
money next, and they didn't even pay Disney for it.
Speaker 2 (02:06:47):
Like they're doing a licensing deal where they pay Disney
in open Ai stock, which Disney already bought more of,
and in fact, Disney has the option to buy more stock.
It's just you gotta wonder what happens when open Ai collapses,
because he's gonna fucking help. And it's just like these
companies will change to the Chinese models, I guess, or
what remains of anthwer It's.
Speaker 11 (02:07:08):
Just everyone can use quin.
Speaker 2 (02:07:10):
Yeah, the quick Kimmy took too deep sea.
Speaker 3 (02:07:14):
Gotta be fine.
Speaker 2 (02:07:15):
Well, all right, I'm gonna wrap it there so I
don't just launch into a rant about AI as per
fucking usual. Thank you so much. We've got the core
three here. It's gonna be joining us the next day
or two. It's gonna be wonderful. Thank you so much
for traveling in with us. We're gonna end the episode
as we have, dedicated to Sean Paul. He's a wonderful fella.
Came in twenty twenty. Donate to the Pediatric Epilepsy Research Consortium.
(02:07:36):
Link will be in there, dedicated to his son. Thank
you so much everyone for listening. We'll be back tomorrow.
Thank you so much. Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song
is Matasowski. You can check out more of his music
(02:07:58):
and audio projects and MATTASAUSK M A T T O
S O W s KI dot com. You can email
me at easy at better offline dot com or visit
better Offline dot com to find more podcast links, and
of course my newsletter. I also really recommend you go
to chat dot where's youreaed dot at to visit the discord,
and go to our slash Better Offline to check out
(02:08:19):
our reddit. Thank you so much for listening.
Speaker 5 (02:08:23):
Better Offline is a production of cool Zone Media.
Speaker 2 (02:08:26):
For more from cool Zone Media, visit our website cool
Zonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.