Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Cool Zone Media.
Speaker 2 (00:04):
On April twenty fifth, nineteen eighty six, a disaster happened.
Speaker 1 (00:07):
I was born.
Speaker 2 (00:17):
What? Welcome to Better Offline. I'm your host, Ed Zitron.
We have an incredible block to begin this with. I'm
joined by Devindra Hardawar of Engadget.
Speaker 3 (00:27):
Hello.
Speaker 2 (00:27):
Hello. I'm going fantastically. Now, Victoria Song from The Verge.
Speaker 1 (00:32):
How you doing? I'm so tired. We all are.
Speaker 2 (00:35):
Now I'm managing, I'm just... the power of Christ, not really.
Ed Ongweso Junior. Hey, Ed, doing pretty good?
Speaker 3 (00:43):
Pretty good.
Speaker 2 (00:44):
So you're fresh back from the convention floor, right, Yeah.
Speaker 4 (00:47):
After thinking I lost my passport there, which is so amazing.
Speaker 3 (00:50):
Yeah, I found it. Well, then you'd just be trapped
here forever. Everyone's screaming.
Speaker 2 (00:54):
Yeah. I did not want to say this to you,
but that is actually a CES rite of passage. Of
the people that I know, at least, I think, like,
five people I know have lost their entire wallet in the LVCC.
Speaker 5 (01:06):
They didn't lose it in the LVCC, but one of
the first CESes I ever went to, I had a
connecting flight back through Minnesota. There was a snowstorm, I
was stuck there for an extra day while begging every
single airline counter to, like, put me on a flight out.
I lost my wallet there. So, like, my company
put me up with an extra hotel. I get to
(01:26):
the hotel and they're like, you kind of need ID
to check in. I was like, I don't know how
to tell you. I just lost my wallet at the airport.
And so they're very nice in Minnesota, so they just
let me stay. The next morning, I get up early,
and you can fly without ID. Domestically, you just have
to like.
Speaker 1 (01:42):
Oh really? They just do, like, a credit check?
Speaker 5 (01:44):
You have to go really early though, because they
do check everything you own with great care.
Speaker 2 (01:51):
Oh no, I know, because the last time I did it,
there was like what car do you have?
Speaker 5 (01:55):
Like, the lady was going through my dirty underwear, and
that was my biggest nightmare. But you know, on the
way to the airport, I had left a review unit
laptop in the hotel room. I had to go back,
and then on the way back from going back, the
Uber blew a tire because it was so cold. Oh, oh
my god, it was so stressful.
Speaker 1 (02:13):
And I was like, oh my god, this is the
worst flight of my life.
Speaker 5 (02:16):
I got home, it was like forty eight hours after
I was supposed to. And then two weeks later,
the very nice people in Minnesota, the Lost and Found
at the airport, sent me back my wallet, after I'd
canceled every single card I owned.
Speaker 1 (02:28):
So that's such a lovely story.
Speaker 2 (02:30):
So, talking of finding things you don't want, what is
your experience been as the preeminent
Speaker 1 (02:36):
Wearables reporter? Oh, thank you.
Speaker 2 (02:38):
What have you seen that's crap or interesting or funny
or good?
Speaker 1 (02:45):
Oh?
Speaker 5 (02:45):
You know, I wish I remembered all the crap that
I saw, but I feel like I just jettisoned
it from my head unless it's, like, super egregious. Oh wait, no,
I remember now. I'll call it... I wouldn't call it crap.
I would just call it ridiculous, if that makes sense.
Speaker 2 (02:58):
Sick.
Speaker 5 (02:59):
It's the Ultrahuman Rare. Ultrahuman, the name says it, the
Ultrahuman Ring Air is actually a smart ring
that I enjoy quite a bit. I did this thing
over the summer where I wore six smart rings at
once to determine which was going to be the one
ring to rule them all, right? That one came in second,
so I found it quite good. And at CES, they're like,
(03:21):
but what if late stage capitalism? Okay, okay. So they
made the Rare, which is a desert-themed
collection of luxury smart rings, and there's Desert Rose, cute,
the Sting song, Dune, and
Speaker 1 (03:36):
Then Desert Snow.
Speaker 5 (03:39):
Rose gold, gold, and silver, basically. The rose gold and
gold are made out of eighteen karat gold. The silver,
quote unquote, is PT950 platinum. Guess how
much they cost?
Speaker 1 (03:48):
How much?
Speaker 5 (03:49):
So the gold are nineteen hundred each, like converted from
British pounds, the platinum is twenty two hundred.
Speaker 2 (03:57):
Great.
Speaker 5 (03:57):
I saw them on the floor and I was like,
I have six thousand dollars worth of smart rings on
my hand. I took a photo, posted it online, and
everyone was like, no. And everyone was also like, you
know what though, your nails look better. And I was like,
these are ten-dollar press-ons, so nice try. You know,
it just goes to show you money isn't necessarily what
(04:17):
catches the eye.
Speaker 2 (04:18):
So they were expensive because of the metals. They weren't
expensive because of three hundred dollars worth of tech, at
Speaker 5 (04:23):
Most. So, like, the actual ring tech specs are all the same.
The actual ring is three hundred and fifty dollars,
so we're talking five to six times the price. And
I'm like, you know, I was talking to the guy,
and I was saying, you know, tech becomes
obsolete so quickly. The battery is gonna die in, like,
three years.
Speaker 2 (04:42):
My ring's already dying.
Speaker 5 (04:43):
Yeah. And, you know, rings that are actual gold, like,
people don't blink spending that much money on real gold,
high quality jewelry. But you get to keep those forever.
Those are heirlooms. You can pass them down. So, like,
how do you reconcile that? And he was like, well,
there will be an upgrade path that we're working on
to guarantee the value of the metals.
Speaker 3 (05:02):
Like a skill tree.
Speaker 5 (05:04):
And then also, just like, maybe we'll... I think
they're working on a way to try and swap out
the tech portion so that you can keep the metal.
Speaker 2 (05:10):
And I was like, you gotta work that one out first.
Speaker 1 (05:12):
Big if true. Big if true.
Speaker 3 (05:14):
The only way to get people into smart
rings is to just construct artificial scarcity. That's the only thing.
And I know, Victoria, you've covered this stuff a lot.
I firmly believe that that whole category is bullshit. Like...
I say that wearing one, yeah. I mean, I...
Speaker 5 (05:31):
Respectfully, it is very popular among women. Oura's, I'm
sure, demographic is now majority women instead of men.
And it's because they don't want to wear the smartwatches.
Speaker 2 (05:41):
This is the thing, that story is not well told
in the media. I'm not saying it's anyone's failure,
but this is, and this is actually a failing of
Better Offline, as you literally just heard, a preconception
that we had as guys. And even in
my experience of using Oura, the question is, what is
it that attracts them? Is it that the data
(06:01):
is more useful, or women are more connected with those problems?
Speaker 5 (06:05):
In some respects, because they do have pregnancy features. They have
been putting so much effort into pregnancy research over the
last few years. They found that they could actually... they
did a clinical study. So it's clinical, it's real,
that's what I mean. It's a real
clinical one. It's not a white paper, which is
what I see a lot in my field.
Speaker 2 (06:24):
And what is the difference, just for us? A white
Speaker 5 (06:26):
Paper is like a book report by a company saying,
we did our own internal research. A clinical study or
a clinical paper is something that they've done, that they've
published, that they've paid the money to get published in
an academic journal or peer-reviewed journal, and those can
cost... that cost can sometimes be up to twenty five
thousand dollars for that review. I mean, there's so many
(06:49):
of them. But yes, actually, it was. And they're working
with actual researchers from universities, that sort
of stuff, and they continuously do it. They have a
long dedication to it. So, like, they found that they
can use temperature readings for pregnancy detection, and they haven't
necessarily done anything with that, but they're saying it's possible.
And they've also partnered with Natural Cycles. So, for what
(07:10):
Natural Cycles is, you're gonna enjoy this. It's an FDA
cleared birth control app. Okay. Yeah. So I could go
off about that forever. There have been
controversies with that.
Speaker 3 (07:25):
So there's a.
Speaker 5 (07:25):
Long story, but the point being is that a lot
of people or a lot of women have fertility issues. Right,
this is something that is discreet that they can wear
that integrates with other stuff that has like clinical backing.
Because a lot of people use basal body temperatures to
try and track their fertility. But that's so difficult to
(07:45):
do accurately because you have to do it first thing
in the morning, when you wake up, you cannot drink
and cannot get out of wat you can sleep with these,
it's much more so for a lot of women. There
is a benefit to that. It is, you know, a
lot of women I know are just like, Okay, I
have a bunch of rings on my hand. There are
some that are thin and stylish, which I quite like,
and then there's these honking smart rings. So you know,
(08:07):
that's not for everyone necessarily, but it is discrete compared
to I don't know, an Apple watch. A lot of
women find Apple watches. Sleeping with those is difficult, and
sleeping with them is difficult.
Speaker 3 (08:15):
Totally, totally. I understand the, like, functional and useful
aspect of a smart ring for women, that
totally makes sense. I think overall that market, though, if
you're not looking for the temperature tracking, it's
so niche.
Speaker 5 (08:26):
It's not like... it's not like your first device. I always
say that a smart ring is better with a smartwatch,
so it's kind
Speaker 2 (08:33):
Of a problem as well. And
also, they're very annoying. I wanted the next
Oura because I really like my sleep tracking, other than
the fact that the charging cycle lasts, like, a day
and a half at best, and then it screams at
me like an angry cat.
Speaker 1 (08:52):
But it's three hundred dollars.
Speaker 2 (08:53):
And the next one, I was like, oh, I might
get it like a Christmas present for myself, and it's
like, no, no, no, man, you can't just try and
fucking buy this, man. You've got to remeasure your fingers.
Speaker 5 (09:02):
Wait, I can actually tell you why. Oh, thank you.
They changed the... they changed the sensor, right? So I
have the Four on me right now. If you look
at yours and you take yours off, you have the
bumps, right?
Speaker 2 (09:11):
Yeah, this says no bumps, bumpless.
Speaker 5 (09:13):
It's bumpless. They've changed the sensors and the arrays on that,
so that has affected the sizing quite a bit.
Speaker 2 (09:18):
Still annoying.
Speaker 5 (09:18):
So I went from an eight to a nine, that
is the case, but the battery lasts a lot longer.
So with the SpO2 tracking, those last... like, tracking
blood oxygen, thank you. So, with that tracking on, it
lasts maybe two, three days. I found it really annoying.
Speaker 2 (09:33):
This one does seven, that's not bad. Yeah. This is still niche, though.
Speaker 5 (09:37):
It's still very niche. It's still very niche. But I
will say, at the live Vergecast recording that we
did last night, when we were meeting readers, I had
so many come up to me and they're like, look
at my Oura Ring. I bought it because of you.
And I was like, I never tell people what to buy, yeah,
but they're.
Speaker 1 (09:52):
Like, look, look look I bought it.
Speaker 5 (09:53):
And I was like, oh, yeah, congratulations.
Speaker 4 (09:56):
So for someone, you know, like me, I've never... I've never
used these wearables. You know, what is the value proposition
for someone who is not already in the market or ecosystem,
using it for one of the use cases you've kind
of laid out, you know?
Speaker 5 (10:11):
Usually... so, the killer, quote unquote, the killer app for
wearables is health. There are so many different kinds
of wearables, but the one that people most resonate with
now is health. And usually, like, they get into it
because they want to change something, or they've had, like,
a health scare. So, oh, like, I don't sleep well,
and my doctor says I have sleep apnea, oh,
(10:33):
I should track this sort of thing. Or, oh, my
parents just had, like, a cardiorespiratory issue, I'm going
to buy them an Apple Watch. Actually, the scenario I
get asked about most often is, like, my parent had a health scare,
what Apple Watch do I buy? So it has a
lot of different purposes, like.
Speaker 2 (10:50):
What health scares is actually help? Is it just folds?
Do they do more falls?
Speaker 5 (10:54):
If you have an abnormal heart rate, like, if you're at
rest and your heart rate dips too low or goes
too high, it'll alert you and be like, this is
kind of messed up. And there are many stories of
people who have gone to the hospital and had their lives
saved because of it.
Speaker 3 (11:08):
Yeah, and the idea of having an Apple Watch with built-in
cellular too, and then you're all on the Apple
family plan, and, like, your parent could call you and
say, hey, I've fallen or something, or they get immediate
emergency help, whereas their phone could be in another room, right?
All those uses seem pretty useful.
Speaker 2 (11:24):
Yeah, and what Ed is asking here is a
question I think many people ask about wearables. And I know,
basically, we spent most of the show just, like,
crapping on everything. But I like my Oura Ring
when it fucking works, and, you know, what is that
if not the tech industry right now. But what Ed's asking
is, like, something they've kind of failed at. Victoria, I
know you might not like the idea that you've effectively
(11:46):
sold an Oura Ring by proxy, but someone probably read
the thing and went, oh, this actually explains what it
does in a way that makes sense to me.
Speaker 3 (11:53):
And do I want to use it for that? Yes.
Speaker 2 (11:55):
Yeah, this has a use case.
Speaker 3 (11:56):
A very specific use case. This may be an unpopular thing.
Does anybody remember the Jawbone Up? I was carrying
that back then. That form factor was just... You were, you
were definitely... you were in PR at that point.
Speaker 5 (12:08):
But that was an OG screenless fitness tracker, and I hear
that, yeah, yeah, I hear that quite a lot
from different people who are just like,
Speaker 1 (12:17):
I don't like the smart watch.
Speaker 3 (12:18):
It was thin, it was really light, but they couldn't
get the, uh, the engineering right.
Speaker 2 (12:23):
Because they were kind of crap at building stuff sometimes.
Speaker 3 (12:26):
Well, their speakers were great, yeah, but the Up destroyed
them, because it was a flexible bracelet.
Speaker 2 (12:31):
It feels like a more challenging product.
Speaker 3 (12:33):
It was really hard to build them. They kind of
bet the whole company on it, and it all fell apart. So,
do you have any wearables, Devindra? I have an
Apple Watch, and I always forget to wear it, so
that's the thing. But I love the idea of an
Apple Watch. And certainly I'm thinking about my parents. I'm
thinking about my daughter too, who's six years old now,
and parents have to think about devices for their kids.
I don't... I don't want to put a tracker on her.
Speaker 5 (12:55):
But you don't want to give her a phone she's
going to lose.
Speaker 3 (12:58):
And also, like, if she's off somewhere doing an activity and
she misses the bus or something, I would love an emergency
way for her, in a couple of years,
not at six, but at eight to ten, yeah, to
be able to call me, be able to call for help,
something, even just hit a button, or to see where
they are on GPS if they're on, like, a school
trip or something, because there are trips where kids get
left behind. So, you know, surely not, like, the
(13:22):
most important thing in the world, but it is something
I think about, and certainly a parent does too. Yeah, a
connected device that can instantly help them or get them
to emergency help, like, that seems super valuable.
Speaker 2 (13:32):
I kind of want to put one on my mom.
There you go. No, she's totally fine. She's doing well.
She just forever does not pick up her phone, so
it'd just be nice to know that too. My mom
will probably not hear this, but I'm sorry, Mom, in
advance if you do.
Speaker 1 (13:46):
Devindra, changing gears slightly?
Speaker 3 (13:48):
Sure?
Speaker 2 (13:48):
Have you seen anything exciting on the show floor? Anything
really been revving your engine?
Speaker 3 (13:52):
Revving my engine. I spend a lot of time with
Nvidia, and I know you guys have been talking
about Nvidia, by all means, yeah. I do
think it is interesting watching Nvidia's transformation into this
sort of, like, AI superpower company. But I also feel
like, you know, I've been covering Nvidia for a while.
I've been covering Jensen Huang for a while. Did you guys
see that CES keynote?
Speaker 2 (14:12):
I didn't see the keynote, and it was just... it was
not good for him. When you say you've been covering
them for a while, yeah, how long?
Speaker 3 (14:19):
I don't know, twenty ten, like pretty much since I
started doing tech stuff.
Speaker 2 (14:22):
Thank you. I knew that was the case. But for
listeners, this is important. A lot of the people, not
Max Cherney, he's a dawg, but a lot of people
talking about Nvidia now have recently started, yeah, writing, and they
Speaker 3 (14:33):
Say the video And when you say that, I know
you're opposed.
Speaker 1 (14:36):
People say "Nvidia."
Speaker 3 (14:37):
People say... Gary Shapiro said "Nvidia" on stage, to Jensen Huang, so.
Speaker 2 (14:42):
Didn't he yell as well?
Speaker 3 (14:44):
I was there for that too. By the way, I think...
I think Jensen's going through some stuff.
Speaker 2 (14:48):
So why was it bad for Jensen?
Speaker 3 (14:50):
Bad for Jensen because that was, like, a two-hour-long
keynote where... the thing is, everybody was there filling that auditorium,
they just want to hear about the video cards, give
me the new RTX fifty ninety or some shit, and
he spent maybe five or ten minutes talking about those things.
The rest of it was on their robotics, AI, virtualized
training operations, and nobody gave a shit. It
was like watching a comedian, every single joke fall flat,
(15:13):
and you could tell on stage that it was kind
of affecting him.
Speaker 1 (15:17):
He's trying.
Speaker 4 (15:18):
Also, what they're... their real, or industrial, metaverse
that they're trying to construct, this world in which everything...
there's the Omniverse plus Nvidia Cosmos, it's a whole... Cosmos.
So Omniverse is this sort of, like, simplistic view of
a three D environment that you could use to train
a robot or something. Cosmos almost looks
(15:41):
like a real world version of that environment, so like
a photorealistic environment. And the idea, these are interesting, is how
would you train a robot with a decent amount of
artificial intelligence to perform certain tasks, in a Black Mirror-esque way?
Speaker 1 (15:55):
This is actually disconcerting for many reasons.
Speaker 3 (15:58):
So it's Black Mirror. You run them through that simulation
over and over, they get better at it, and that's the
whole thing. Really interesting pitch, maybe not for the CES audience,
and that's kind of... that kind of affected him. And
then, yeah, the next morning there was a press Q
and A where people were asking questions, but he also
seemed really uncomfortable, and also, like, yeah, he was really
picking on the poor sound guy at that Q and A,
because he was like, the speakers are pointing at me
(16:20):
and making it hard for me to hear. We're all like, no, Jensen,
you're hearing reflections off the wall from the speakers that
are pointed at the audience. He was like, no, I know.
It was a weird
Speaker 2 (16:28):
Thing. Man, he is such a penis.
Speaker 3 (16:29):
But I think he's just like he admitted in that
Q and A that he did a bad job at
the keynote.
Speaker 1 (16:35):
He was like, "I failed." Love to hear it.
Speaker 3 (16:37):
Yeah, that's what you want to hear.
Speaker 2 (16:39):
That's what you wake up thinking about.
Speaker 3 (16:41):
I don't feel... I mean, dude, the dude's a multi-billionaire.
Speaker 2 (16:43):
Now, yeah. Oh, I feel fine laughing at his pain.
Speaker 3 (16:45):
He's dealing with some stuff. But yeah, the Nvidia stuff
is interesting. The video cards are interesting because they're leaning
more into AI.
Speaker 2 (16:52):
So tell me about the video cards. So the
lower-end one is meant to be the equivalent of
the forty ninety.
Speaker 1 (16:57):
I want to say.
Speaker 2 (16:58):
This is where it gets confusing, because they're using DLSS.
They're using DLSS, so can you explain that for us? AI upscaling.
So these new cards have DLSS 4, which has AI
upscaling and frame generation to basically smooth out frame rates. For
every single real frame that the video card renders,
it can generate three artificial frames, which kind of smooths
(17:18):
out gameplay. So if you have a really fast monitor,
it'll look a little better. They can say the FPS
is higher because this DLSS will automatically generate three more.
Speaker 3 (17:28):
Yeah, and then you can go back to the last card,
the forty ninety, and be like, well, this is not
generating as many frames; that one could generate one frame
per every real frame. So the number they say... they
can say the number is right. But if you turned
off DLSS and you just looked at pure gaming
performance without any of the AI stuff, certainly,
the fifty seventy would not be as fast as the
forty ninety. So they're picking and choosing, more marketing for
(17:50):
the gains. They've always done this, but yeah, they're choosing
which benchmarks they're talking about.
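The frame-generation math being described can be sketched as a quick back-of-the-envelope calculation. This is an illustrative helper of ours, not anything from Nvidia: it treats reported FPS as a naive ceiling, ignoring the latency and overhead a real pipeline adds.

```python
def effective_fps(rendered_fps: float, frames_generated_per_real: int) -> float:
    """Naive upper bound on displayed FPS when every rendered frame is
    followed by N AI-generated frames (ignores latency and overhead)."""
    return rendered_fps * (1 + frames_generated_per_real)

# DLSS 4-style multi frame generation: three generated frames per real one
print(effective_fps(60.0, 3))  # 240.0
# Previous-generation 1:1 frame generation, as described for the forty ninety
print(effective_fps(60.0, 1))  # 120.0
```

This is why, as discussed above, the marketed number can exceed the older flagship's even when raw rendering performance does not.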
Speaker 2 (17:54):
So, and just before we get off the subject, so
Nvidia, you've been covering them since twenty ten, they've been
pulling these games for a while, they always have dumb, puffed-
Speaker 3 (18:02):
Up marketing, they have. But also the thing is, like,
what's changed? They do succeed, like, some things do
work. The DLSS, when they first talked about it, even
Jensen admitted, he said nobody believed him, that this idea
of using AI to upscale a lower resolution to a
higher resolution on a monitor takes, like, less processing
power on the cards, so it gets you higher frame rates.
Nobody believed that would work, but it turned out it
(18:23):
worked pretty well. There were some issues, and they got
better and better and better at it. And he said, basically,
for the past six years, there's, like, a supercomputer at
Nvidia's servers that's just been crunching the DLSS algorithms
to, like, sort of make that thing better. So that
is a real thing. But I think he tends to
overstate how good it is at times. It's a weird
thing, because I think he's full of a lot of bluster,
(18:45):
you know. But the actual... the results, the products are good,
certainly compared to AMD's video cards and all that stuff.
Speaker 2 (18:52):
Yeah. So, Victoria, returning to you, anything good? I know,
good stuff, like, okay, what's actually been exciting? I ask this
question with some alarm, because most people have had nothing.
Speaker 5 (19:03):
So I actually... so my favorite thing at the show,
it's kind of girly, but it's the L'Oréal Cell BioPrint.
Speaker 2 (19:10):
Okay.
Speaker 5 (19:11):
It is a... it's meant for somewhere like a dermatologist
or an esthetician's office, and it is a machine. Basically,
you take these skin strips and you take a sample
from your skin, you put it in a chemical buffer solution,
you stick that on a cartridge, and then that cartridge
is stuck into this machine. It goes beep, boop, uh,
(19:31):
and then it tells you if your skin's chronological age
and biological age match, or if, you know, your skin's
not doing so good. It can analyze different
criteria, like wrinkles, skin barrier function, pore size, uh, you know,
even skin tone, like, a bunch of different things.
(19:53):
And then, based on the proteins that it detected
from your skin and the solution, it
can tell you whether you're prone to have those problems
in the future if you don't take care
Speaker 2 (20:02):
Of things. And so, what are you meant
to do with this information?
Speaker 5 (20:06):
It is meant to help you wade through the cesspool
of skinfluencers hawking different products at you. So, like,
if you are... you know, if you're a woman, probably
you're on TikTok and they're like, oh my god, you
need to buy a retinol. You need to use retinol
every day, like, as soon as you turn thirty,
or you will shrivel up like an old crone. And
(20:26):
retinol is vitamin A, and it is the most
clinically studied skincare ingredient. It is, like, basically what dermatologists
will give you if nothing's freaking working. But it is
a tough ingredient, because a lot of people are, like,
sensitive to it. So there's something called purging. If
you use it, you could end up, instead of having
clear skin, with a bunch of breakouts. And it's because
(20:49):
it's increasing cell turnover, and it's just getting all the
impurities out of your face, and, you know, it'll be
fine after that. The problem is, it looks... it
looks the same as an intolerance or sensitivity to retinol,
meaning it doesn't work for everyone, and those people are
just wrecking their skin. So it can tell you if you're
responsive to it and whether it's worth it. That's really cool.
Speaker 1 (21:11):
Yeah, that's really helpful.
Speaker 5 (21:13):
Very cool. And so, like, there are different things that you notice.
So, like, for me, it said, you have some issues
with skin tone evenness, and, yeah, I know that. I
can see it, and it pisses me off, and I'm
trying all these different things.
Speaker 1 (21:24):
How do you deal with something like that?
Speaker 5 (21:26):
So it's called hyperpigmentation, and there are different ingredients that
are said to fight that. So vitamin C is one.
Another one is niacinamide. There's also, like, tranexamic acid,
and all of those. So, you know, like, the girlies,
we're on TikTok, and we are learning about all these
ingredients that we are supposed to use.
Speaker 3 (21:43):
In different serums.
Speaker 2 (21:45):
It will give you more direction.
Speaker 5 (21:47):
It'll give you more direction. And, like, when I was
talking to them, they're like, it's going to help you
know what not to buy. Because, like, when you
have all these marketers and these influencers, they're just telling
you all these things. Like, I was on there,
I was like, do you know what the Koreans use?
The Koreans use bifida. And I'm like, what the fuck
is bifida?
Speaker 3 (22:04):
On social media. So, like, the other thing is, you
go into Sephora or something and be like, help me out.
And I have bad skin, so I have to look
at this stuff too. And they will either... you will
get a good salesperson who's like, I got you, I
will take care of you, or you get somebody who wants to
make a lot of money, and they will lead you to,
oh, you need niacinamide, you need
hyaluronic acid, all the fun stuff, and plus retinol, and
(22:26):
then your face is a mess.
Speaker 5 (22:27):
And then you end up with a ten-step skincare routine,
which was popular, like, ten years ago in Korea, which is,
like, super nuts. Like, my cousin's kid is, like, eleven
years old, and she came to me, she's like, let me
show you my skincare routine. I'm like, you're eleven. The
only thing you should be using is moisturizer and sunscreen, like.
Speaker 2 (22:42):
A Dexter-style setup.
Speaker 5 (22:44):
Yeah. No. And she was breaking out, and I was like,
you know why you're breaking out? You're overloading your skin,
and you're too young. You don't need any of this.
It is a moisturizer and a sunscreen.
Speaker 2 (22:52):
So does this thing give direction, or is it just,
like...
Speaker 5 (22:56):
It does give you direction. Because, for me... so, for me,
it was telling me, like, oh, you have hyperpigmentation,
but technically, per the proteins, you're not prone to it,
so there's something.
Speaker 2 (23:04):
Well the consequence is there?
Speaker 5 (23:06):
So that means... that means, like, I should probably be
using a vitamin C. I don't currently, so I'm like, oh,
I was trying a different ingredient that's obviously not working, so, like,
I can look for a vitamin C. It told me
I was responsive, like, highly responsive, to retinol, which means
it will likely work better for me than someone who's
intolerant of it, and so that means I should probably
(23:26):
use a retinol.
Speaker 2 (23:27):
So this... and the actual consequence here is saving
hundreds of dollars.
Speaker 5 (23:30):
Hundreds of dollars, because these things are not cheap to buy. Like, yeah,
if you know what Drunk Elephant is... I do not.
It's a very premium brand. In Sephora, you walk in,
one bottle, like, seventy milliliters, is like one hundred
and fifty dollars. Like, these things can
Speaker 1 (23:44):
Cost a lot of money.
Speaker 2 (23:45):
All this pressure on social media to buy, buy, buy,
to fix every problem, every part.
Speaker 5 (23:49):
Like, during the pandemic, skincare just blew up, and then,
like, Korean skincare is, like, super popular now, because you
look at... you look at the stars of Squid Game.
How old do you think they are? They're, like, fifty-five,
and their skin looks incredible. They do not look it whatsoever.
And it's because they take skincare so much more seriously.
The tech that L'Oréal did, it was actually paired
(24:10):
with a Korean startup, and the microfluidics, the protein stuff...
this product took them ten years
to build, because of the science and all.
Speaker 3 (24:18):
Right, there was a booth I stopped by.
Speaker 4 (24:21):
It was a Korean skincare company with some tech, and
they had, like, four pamphlets that had nothing to do
with the tech and just talked about why skincare was
so big as an industry in Korea. Huge. Which was, which
I thought was really interesting, because I feel like almost
every single pamphlet I've read, besides those ones, has been
(24:41):
trying to... trying to pitch me or sell me on
the tech, which I suppose this probably was, in a way.
Speaker 1 (24:45):
I mean, it's.
Speaker 3 (24:46):
Interesting because there's there's an obsession with the lighter skin too,
Like whenever I go to computechs, the stuff you see
on billboards and everything is like, oh yeah, I want
to light complexion, you want a fair complexion, and it
all feels like this is all kind of I was like,
I see what you're up to, eyes, Like there is
very clear colorism going on there. So you're like literally
just you know, scrubbing your face off, scrubbing skin off
(25:07):
your face just to get a shade lighter, which is
part of a marketing too, which is a shame.
Speaker 2 (25:11):
I mean, I'm really glad you brought that up, though,
because the thing is with CES, it is very easy,
as I well know, to be a bit cynical, definitely pessimistic,
when you look around and everything just has AI on it.
Speaker 1 (25:21):
It is nice to.
Speaker 2 (25:22):
Hear, especially for the tech industry, something for a woman
that is like very like this seems extremely helpful.
Speaker 3 (25:29):
It seems like it comes from outside the tech industry too, right?
It's more of a...
Speaker 2 (25:32):
It's like using technology to solve a problem, which is
usually not the tech industry's idea.
Speaker 5 (25:35):
And the funny thing is that people hear L'Oréal and
they don't think they do tech. They have been a
huge presence at this show for six, seven, eight years
now, and they are doing interesting things with tech. And,
you know, I hate the term femtech, because they
usually talk about it in a certain way. It usually has
to do with, like, reproductive health tracking, yada, yada, yada. But, like,
they have a really high tech hair dryer that uses
(25:58):
infrared light to help dry your hair.
Speaker 2 (26:00):
Why is that good?
Speaker 5 (26:01):
Because, so most hair dryers, and hair
dryers are gadgets, okay, they are the second
most power-intensive thing in your house. Like,
I think they use more power than a microwave. Really? Yeah.
And you use them for a long period of time.
They use heating coils, and with that heating coil, you
want your hair to dry fast, so you put it really
close to your head.
Speaker 2 (26:21):
It damages your hair, of course, because there's a giant heat.
Speaker 5 (26:24):
Right, so with this you can hold it from further away,
use less heat, do less damage, and save a lot
of energy. So it's good, and it's not
the sexiest thing in the world per se. It's not,
like, oh my god, we're going to revolutionize the world.
Speaker 3 (26:38):
That is different than the Dyson, because they did try
to revolutionize.
Speaker 5 (26:41):
So the Dyson is still using airflow, right. It is
airflow versus... they did a lot of cool things
with airflow and engineering with the motor.
Speaker 1 (26:50):
This is cool.
Speaker 2 (26:52):
This is not actually applying heat to it.
Speaker 5 (26:54):
No, it is applying heat, but it's applying heat from
the light.
Speaker 1 (26:57):
It's light.
Speaker 5 (26:58):
So the way to describe it is like,
when you have rain, right, and there's no sun
the next day, the water will dry, but not quite
as well, because it's just the wind and whatever that will
evaporate the water. But if you have rain and
the next day there is sun and wind,
Speaker 1 (27:13):
It dries much faster.
Speaker 5 (27:14):
It's the same concept but applied to your hair.
Speaker 2 (27:16):
And I will respond to one thing you said. No,
I actually think things for women's skin and hair dryers
that are less damaging to hair are actually the real revolutions.
If CES was all shit like this, it would actually
be a really cool show. If it was like, ways
to use less water in a tap that still get
you as wet as you need to be.
Speaker 1 (27:36):
Like it's people.
Speaker 5 (27:37):
I think it's because, like, the industry is full,
the press industry for tech is full of men.
Speaker 2 (27:42):
I mean, we kind of proved that at the beginning, like
I'm actively.
Speaker 5 (27:44):
Apologies, I'm actually the only woman in this room. But
it's like, no. But it's like, they have a lipstick printer.
That's pretty cool. You can take, in an app,
you look at yourself, you can try out different colors,
and then custom print the actual thing. If you
have a celebrity whose makeup look you like, you
can take a photo, you can color-pick their exact
shade of lipstick and print it for yourself, and then
(28:06):
take it on the go.
Speaker 1 (28:07):
It's very thoughtful.
Speaker 5 (28:09):
They have like a hair dye wand that precisely
applies dye so you don't destroy your bathroom. Hair
dyeing is like a ritual where your bathroom
will never be the same, it takes forever to clean
up. With this, you can just brush your hair and you've
dyed it. That's it. Super cool.
Speaker 2 (28:25):
The real future I wish CES was. Sadly,
we're wrapping this block. Devindra Hardawar, where can people
find you?
Speaker 3 (28:32):
Find me at Engadget. I do the Engadget podcast there
and a podcast about movies and TV at the Filmcast
at the filmcast dot com.
Speaker 2 (28:38):
Lovely. Victoria, where can people find you?
Speaker 5 (28:40):
I am at vicmsong on all social handles, but god,
we've got to find an alternative to Twitter. But then
I'm also at The Verge, so you can find
Speaker 6 (28:48):
Me there. Wonderful. And mister Newsletters? The Tech Bubble dot Substack
dot com, the podcast This Machine Kills, and Big Black Jacobin
on Twitter and Bluesky.
Speaker 2 (28:59):
You can find me hanging from various banisters as I
crawl into your room to make you listen to the
podcast a second time. I need these downloads, everyone. We'll
be right back after these insane advertisements that will blow
your mind. And we're back. Buy the thing, download the
(29:24):
thing, or else. We have now been joined by mister
David Roth of Defector.
Speaker 1 (29:29):
Hello.
Speaker 2 (29:30):
He is fresh from the Hyperloop, which he described
as thrilling.
Speaker 1 (29:33):
Yeah, it was amazing. I had no idea that you
could take a car through a tunnel like that was
something that.
Speaker 3 (29:39):
Yeah, how was the car ride?
Speaker 1 (29:41):
I did it too? Did you actually?
Speaker 3 (29:43):
Oh my god?
Speaker 4 (29:44):
You know, there's all this talk in Dune about ancestral memory,
but nobody ever remembers what it's like to be that
sperm swimming through the primordial ooze. And you know what,
if you want to know what that is, get in
one of these, uh, Teslas and travel through these patchy
white ooze-looking walls and the fake rocks and the.
Speaker 3 (30:08):
Very short ride. It's like a very useless distance.
Speaker 1 (30:10):
Yes, and I cannot emphasize enough how walkable some of
this is, like, it's basically a football field. You're in the thing
for like less than two minutes, but.
Speaker 3 (30:24):
They spent years digging into the ground to do this.
I spend minutes thinking about why this is amazing.
Speaker 2 (30:30):
The first time I went in it, and
the only time I went in it, we
had a traffic jam and I just started to panic.
I'm going to be honest, yeah, because I'm
very claustrophobic. I was like, uh, I was
doing the Tina from Bob's Burgers.
Speaker 3 (30:46):
Whenever you're going through an underground tunnel, like into Manhattan
or something, you're like, this could be bad.
Speaker 1 (30:52):
Nowhere else to go. You're just in the tube. And
so yeah, that's the thing I never loved. I'm not
like a big heights guy. I'm not a big confined-spaces guy.
I'm just a real cowardly guy. I'm always afraid every
moment of my life. But in those tunnels, it's like,
it's big, there's other cars in there. It feels
like we should be in it together, right, where there's a whole.
Speaker 3 (31:14):
Daylight.
Speaker 2 (31:14):
Or if you get out of the car in
a tunnel, you can theoretically walk around the car. In this one,
I don't know where you'd get out to.
Speaker 1 (31:22):
The tunnel is the size of the car, which I
didn't care for. Also, very steep getting in and out,
even by the sort of degraded standards of what we're
talking about here, which is basically getting in a
car with one or two other people you don't know
and then traveling the distance of two football fields. Like,
it still shouldn't feel as unsafe as it does.
Speaker 3 (31:44):
I feel like we gotta clown the Hyperloop more. Yeah,
just how dumb and useless it is, it existed to stop it all.
Speaker 1 (31:53):
There's some journalism, I will say, Defector has, we have
taken that torch and run with it. But there's also
only so many things you can say.
Speaker 2 (32:00):
It only goes so far, right.
Speaker 3 (32:02):
Yeah, I gotta remind people, this was the
thing Elon Musk used to kill the high-speed rail.
Speaker 1 (32:06):
Yes, correct, you got car in tunnel. That's all
it took.
Speaker 4 (32:11):
It also cost fifty-three million dollars. Yep. Oh wow. Yeah,
fifty-three million to build, and
then forty-seven million for the two tunnels and three stations.
Speaker 1 (32:22):
That's awesome. Great. Worth every fucking penny. Yeah, especially as
somebody who frequently rides the subway, where a train can
hold like twelve hundred people, right?
In this case, it's like a car where there's
a guy that helps you get into it, and then
there's a guy that drives it. So the ratio of
people helping to people riding is either one to one
(32:44):
or very nearly one to one. And also it doesn't
work. Like, every time people get out, they're just like,
please exit ahead of the car. And I would
say that in the times that I was watching, that
worked like maybe thirty
Speaker 3 (32:55):
Percent of the time. Amazing.
Speaker 1 (32:56):
Yeah, the super stuff. Great. He is the modern day Tony Stark. Yeah,
he's amazing.
Speaker 2 (33:02):
He's so good. So what did you see on the
floor exactly?
Speaker 1 (33:05):
So I tried to see as much as I could,
in terms of, like, I wanted to see the sort
of high and low of it, because I know
that when we were talking about it yesterday, I had
only been to the, uh, sort of the space here
in the Venetian, and a lot of that was, like,
you know, sort of their products, like they're identifiably
(33:25):
things that, you know, in many cases they're like, you
know you're making it, so you buy it. And
in some cases it's a vibrator or it's
like a dog door or whatever, but it has
a practical purpose, right, which is incredible. Yes, there
was one, that we talked about this with the air purifier,
which we should probably go back to to clear up some confusion.
But in this case, it's like, so the convention center,
(33:45):
and I know you all have been over this a
million times on the podcast, but I'm just.
Speaker 2 (33:49):
You can say it again. All right, cool.
Speaker 1 (33:51):
It is like, the biggest of the big brands have
their sort of LG Experience stuff, and this is
huge, like a thousand square feet, yeah, big, awesome,
deep-pile carpet. It's like a pleasure to experience.
But then if you go upstairs, it's all the
white-label Chinese electronics brands that sell on Amazon. And
(34:14):
so I got to see, I wanted to see both
of those. And there's also this sort of
area where the first is bleeding into the second,
where there's brands that are doing ambitious stuff, you know,
where they're like, they make a robot or whatever that the kid
is supposed to play with or something, but it
also kind of sucks. Like, it's not, like, the LG
(34:34):
shit was interesting to me because, as much as,
as we discussed yesterday, some
of the AI elements are so misbegotten as to
almost be poignant, right. But then, to give an
example of one, just a stupid one, the smart
home stuff really kind of was weird to me,
because, so I ran into a coworker of yours,
(34:56):
an Engadget guy, up on the Chinese electronics floor,
Dan Cooper, great dude, very happy to talk to him,
and he, uh, was sort of telling me about an
ad. So, I was struck by the smart home stuff
in the LG space. Like, all the products themselves,
as products, are cool. The
(35:18):
OLED screens blew my mind. They were incredible. There
was stuff where you could basically have a sort
of more or less dirt-free home garden thing, and
it looks sick, dude. Like, it looked like something
from the Jetsons. I was legitimately thinking that it
would be a nice thing to have at home. Yeah,
it was, and it looked, it was like
a boombox basically, like there were these
little slots with plants growing up through it, and they
(35:40):
said you could grow vegetables in it. I don't think
I would do that, but.
Speaker 3 (35:43):
It's like those self-contained ones, but.
Speaker 2 (35:47):
One can eat vegetables.
Speaker 1 (35:49):
Certainly, that's practical, that's a thing that something does. Yes,
and so all of those items were neat to me.
But then the smart home thing was basically the
most degrading pandering imaginable, where they're like, is this what
you hogs want? You want a robot to pick your
clothes for you and then map your commute? I bet
you want that, right?
Speaker 2 (36:10):
Do you think that?
Speaker 1 (36:11):
What if we told you that the robot cared for you?
Speaker 3 (36:13):
There was a lot of best friend language.
Speaker 1 (36:15):
Yeah, there's so much of that language. Especially LG had it.
Speaker 2 (36:18):
What was it, affectionate intelligence? Jesus Christ. Empathetic AI?
Speaker 3 (36:23):
Two years later it dies, because the cloud service shuts down, right?
Speaker 4 (36:28):
And it's going to make you pay for the premium plan.
Speaker 1 (36:33):
You killed him. You killed the guy that picks your
pants every day, but.
Speaker 3 (36:36):
You can bring them back to life for nine extra
dollars a month. Did you guys hear about the robot
for kids, Moxie? Yeah, the one that... tell us the story.
I mean, the Moxie was just a very expensive
robot for kids developed by the iRobot CTO, so
somebody who really wanted to build a companion, especially
for kids who are maybe special needs or autistic kids,
(36:56):
just something for them to interact with and learn language from.
A year later, that thing dies. Now you have to tell
your child that their robot best friend is dead. Yeah,
because this company couldn't get funding in time, and the
cloud service is dead. So now you just have a
really expensive... I think it was like fifteen hundred dollars
in the beginning, with like a fifty-dollar monthly price.
Speaker 4 (37:16):
I have been realizing I'm the guy in these
sci-fi movies who'd be like, I better not fucking
see you with a cyborg.
Speaker 1 (37:22):
Yeah, they had, there's like a new version of what
you're describing, Amy Ai, and it was this similar thing,
it looked like a Furby. It blinked and it talked in
a childlike voice, and it was a companion. So this
is again one of those things where you kind
of do got to hand it to them, in the
(37:43):
sense that it's cute. People were lining up to get
their picture taken with it. Like, there is a lot
of just, I watched like three different guys just walk
up, get handed the thing, like whatever, the Stanley Cup
or whatever, just hit the soy face while someone took
a picture of them, and then passed it to the
next guy, who did the same thing.
Speaker 3 (38:00):
Cute robots, yeah, there were a lot of those categories. Like,
there was one that just stares at you. It just
looks up at you, and it's like, hi. That.
Speaker 2 (38:09):
Was, I don't need like an aggroed NPC from Morrowind
just walking up.
Speaker 4 (38:17):
On some level, it is technically impressive to
be able to figure out a way to hack into
what we view as cute, figure out a way to
get something that you're gonna let into your home like that.
But then it immediately brings you to the next question: do you need
Speaker 2 (38:30):
It, right? Yeah.
Speaker 1 (38:31):
And that's exactly where I was going with it too,
that as a design thing, it's a cute idea. Like, I thought it was, I wanted
to pet it, you know, holding that thing. And yet
the idea of it talking to my child,
like, no, fucking, that's just not a thing. Where,
I mean, again, this is the other aspect of
what you're saying, like there are other needs and applications
(38:51):
for this stuff. And so when I look at something
and I'm like, well, I wouldn't use that, like, I
wouldn't want a robot that exercises
for me, like, well, I don't have, like, unilateral, I
don't know your issues.
Speaker 2 (39:00):
I have a child and a lot of experience with
special needs kids as well. And guess what, this stuff's
fucking repugnant to me. It is disgusting to me, the
idea that, and I understand, it's like, yeah, companionship, what
have you. But it's like, any time, and I don't mean
you, Devindra, any of these companies that try and suggest, oh,
this is the, we're going to use the computer to
fix your son, because that's what the actual suggestion is.
(39:23):
It's not that they're like, this is
really going to help you. And there are examples of
things that can. They're even for, like, older people with dementia.
For example, they had these kinds of companions that were
literally just, effectively, fluffy toys that kind of
purred like a cat, like a robotic headless cat, and it was just
very... but that had a very specific purpose. But these
(39:45):
robots are this kind of catch-all if something's wrong
with your child, because that really is what they're saying.
And I'm also not biting your head off,
because, theoretically, the idea of one of these
things is sensible, like, I'm a loser and I'm lonely
and I could use some, or, like, a
child would like a teddy bear that could talk to
it, theoretically. But it feels like none of these people
(40:06):
have gotten beyond that stage ever, right? They're not like,
and how would this operate with a person?
Speaker 1 (40:12):
That was the bit that, so the thing that
kind of left me feeling sad-ish leaving the convention
center was that aspect of it, right. That all of
it is sort of, not just, so the smart home
was bizarre to me because it was infantilizing. But there's
also, and this was the bit that Dan
told me about. He was like, this is
(40:33):
also great, to talk to more people with that, like,
thousand-yard stare, they've been to thirteen CESes.
Speaker 2 (40:38):
Like the dad nothing God.
Speaker 1 (40:40):
Yeah, so he was like, oh, this was probably six
CESes ago.
Speaker 2 (40:44):
Dragging on a cigarette, pre-pandemic times.
Speaker 1 (40:47):
Yes, and you have to imagine this is actually just
a perfectly cheerful man with a British accent.
Speaker 2 (40:51):
Yeah British, right, he does.
Speaker 1 (40:53):
But he was talking about an LG ad that they
had, that clearly had stayed with him. Do
you know the video that I'm talking about? So
this was their, like, sort of smart home thing that they were
working with then. And it seems like an important distinction
that this is a pre-pandemic thing, because so
much of this stuff feels like it's designed to be
(41:13):
like, are you so lonely, and you never see anybody? Right,
which I think is an experience that's now more
current, or at least more, you can kind
of put your finger on it, yes,
whereas I feel like six years ago the argument
is like, no, man, I just go to the store.
Like, I don't have to, like, if I want to
see another person, I can just do it. But in
this case, so the video he was describing was like,
(41:33):
it's the life of a guy who's got like an
LG AI assistant that is sort of taking care of everything.
Speaker 2 (41:40):
Is this the way, kid?
Speaker 1 (41:42):
No. This is, so, here he wakes up, and the thing
is like, hey, good morning, you know, it's seventy-one
degrees outside, would you like me to pick your
outfit out for you? And you just say thank you,
and then it does. And then it comes out of,
like, a steamer that's built into your home,
you know. So everything's fresh and fly. You
leave the home, you get in a car, the same
robot drives you to work. Then it turns out that
it's arranged a blind date for you based on, right,
(42:05):
this is where it starts getting darker, based on, like,
your preferences, presumably based on the same sort of algorithms
that tell you, you know, that today is not a
day to wear linen pants, today's a day to wear
cotton pants. So then the video follows the guy,
he's on this blind date with this woman. It's working.
And then the AI, because it can control the
projection on the lights of the windows of the building across
(42:29):
the street, changes it to show that it's a
heart, and so that shows.
Speaker 3 (42:34):
That robot is just an Indian mother. Yeah,
it is. Just, wake up, my son, here's your outfit,
here's your clothes, I found a date for you. That's
all it is.
Speaker 1 (42:45):
But this is the bit that was striking
me, like there's no agency to any of this. Those were
Dan's words on it. He was like, this
guy just gets bullied all day by AI, he doesn't make
a single choice all day long.
Speaker 2 (42:59):
Actually sounds great, though, Yeah, I'd love to wake up
with that.
Speaker 3 (43:03):
I think a lot of people would want that. They would
find some solace in reducing choices.
Speaker 2 (43:07):
I am mostly kidding, by the way. The idea of
waking up and, well, yes, yes, my life is nothing
like this. I just wake up and read three different
apps that tell me how to fail every day. Like,
I just go and read pages of stuff that twangs
my emotions. And then I go and, like, use the
(43:28):
text app to text my therapist. I read a post again. Yeah,
it's totally different.
Speaker 1 (43:33):
That's the bit that's weird about this too, though, is that, like,
there's something, that's what made me sad about it,
was the idea of just being like a foie gras
goose, and your whole life is just piped down
your throat like food, and you're just like, oh, thank you.
I feel this really is for the hogs. It's the whole,
the whole, this is what you guys like, just
eat out of this trough, we've taken care of it
(43:54):
all, you need.
Speaker 3 (43:55):
We're heading towards what Wall-E predicted, just different.
Speaker 2 (43:59):
That's the shit of it, it doesn't even work this
way, but I do feel like it. So it doesn't.
Speaker 1 (44:04):
The fact that it doesn't work is funny to me.
But it is also, the fact that they're promising
this, that was the part that was uncanny
about it. Like, the idea of, you want to take
all of this mastery that you have, that's made your
products so unfathomably cool, and project it across every
spectrum of somebody's life. To a certain extent,
(44:26):
I would want to be asked for my consent. Yeah,
but also, you know, you see how good those OLED
screens are, and there's a part of me that's like,
I don't know, man, that's because, like, you guys.
Speaker 3 (44:35):
Do cars, you know. Like, we're moving beyond OLED, we're
in micro-LED now, and that stuff is wild.
Speaker 2 (44:43):
From are going.
Speaker 1 (44:44):
Yeah. But that's the idea, though, is the sort
of, like, I don't want to be protected from every
aspect of, like, being alive.
Speaker 2 (44:56):
There's another thing that pisses me off about this. First
of all, they're lying, right, like the first.
Speaker 3 (45:01):
Of all, the lying.
Speaker 2 (45:02):
But two, if you look at how the tech industry
actually treats customers, do you think that any of this would,
even if it did work, be good? It's like, I'm
going to meet with a woman who is actually a
pig-butchering scam waiting to happen, and my car ends
up pulling over to the side of the road because
there was a traffic cone in the road,
and I'm late for the date, and the other woman's
(45:23):
threatening to kill me. But don't worry, the trousers I
have on have a giant stain on them from the
LG cabinet. Which, and I also will admit, I have
the LG steam cabinet, exactly. I actually use it, like,
genuinely. I do meetings, I use suits, a
lot of them like this.
Speaker 1 (45:39):
This is an audio medium. But it looks amazing. Thank you,
thank you.
Speaker 2 (45:43):
David's steaming right now. I am just steaming here. But
even then, that thing mostly works, but it just doesn't
get all the wrinkles.
Speaker 1 (45:53):
It's like they can't even get the fucking steamer right.
Speaker 2 (45:55):
Yeah.
Speaker 3 (45:55):
The problem is these companies think all we want is
no friction. They think we want to glide through life
like we're fully lubricated, you know. It's just like, oh,
no problems, I don't want to think, I don't want to
think about anything, so remove the friction.
Speaker 4 (46:10):
This is my one regret for CES: I didn't get
to visit the fintech section, and that's
where they believe in removing friction, and they do. Whether
or not doing so is good or bad, you know,
it's, not almost always, it is always bad,
(46:32):
you know, when one of these fintech firms removes friction
in making a transaction or a trade. But that's like
one place I think of when
we're talking about removing friction, making your life more of
some coherent engine towards some end, and that's where they've
done it, and it's been disastrous.
Speaker 1 (46:49):
But also these companies don't remove the friction I would love removed.
Speaker 2 (46:52):
Look, perhaps I don't need to be fully lubricated or
partially lubricated, like, I would like less friction. Instead,
as you heard in one of the previous episodes,
there's more friction than ever. And even with
these companies, they sell this dream, and LG, I think,
is one of the more guilty ones. They have all
the apps in the world. I
forget which washer I have, but I think I have
an LG washer, and it's like there's an.
Speaker 3 (47:12):
App. LG, the best appliances. Yeah, their appliances are good.
Speaker 2 (47:15):
But notice that those things,
the ways they change are just, like, clothes
are more reliably clean.
Speaker 3 (47:22):
Yeap.
Speaker 2 (47:23):
The extensions from there have never worked, because I feel
like I've been to multiple CESes where there's been
some form of demo where someone goes, LG Intelligence,
please, please warm up my bagel, I've already put it
in the toaster, Michael. Yeah, it's already in there,
and don't worry, I already emailed your wife. And it's
like, these things are meant, like, they've been promising this,
but they don't fucking work.
Speaker 3 (47:44):
It's just make-believe. It's pure fantasy. But
the thing about the friction thing, which I've noticed since,
I've been coming to CES since twenty ten,
and I just had a four-year gap because of pandemic stuff,
and I had kids. I didn't want to leave my
wife alone with young kids.
Speaker 1 (47:56):
You might have had to explain why you didn't come.
Speaker 3 (47:59):
I got it. This is my duty.
Speaker 4 (48:01):
I should have to explain. What, you're twenty-eight years? Yeah,
but the whole, the rise of startups,
Speaker 3 (48:08):
A lot of what consumer electronics has been trying to do for
the past fifteen years is just to remove that friction.
And I think we've only really started to realize, you
remove friction, you remove humanity. Yeah. And to a certain extent,
we need a little friction in
our lives. With challenges, we grow. But maybe it's
better for you to not have an automatically prepared
cup of coffee when you walk into the kitchen. Maybe
(48:29):
you should just go outside and, like, go to the neighborhood
cafe, and there are other people you can meet.
Speaker 4 (48:34):
A consumer is not a person, right? And this is
a key part of, I think, the quest
to remove friction. A consumer is someone who is
anxious or eager to move from one transaction to the next,
and any time in between those transactions, any time between
transforming labor into some sort of productive end, is wasted.
Speaker 1 (48:57):
And I think what it is.
Speaker 2 (48:58):
Is, they're removing friction, just not for us. It's
just between us and the purchase.
Speaker 1 (49:04):
Yeah. And that's, again, sort of, this brings us
back to the uncanny aspect of
it, in terms of being like, it's not necessarily making
my life any easier. It is simply making it, it's
more transactional, right? Yeah, to
be productive and to consume more effectively or whatever.
But it's not adding value in any way.
Speaker 4 (49:24):
Like, I think about these chatbots that are coming
in for therapy. It's like, is the therapy that they're
offering going to actually help?
Speaker 2 (49:30):
No?
Speaker 3 (49:31):
But now it's.
Speaker 4 (49:31):
Actually, it's a more transparent and quantifiable market,
so you can search out whatever product you think
is going to be a fit for your specific issue.
Whether or not it's going to help you is another question,
but you can find it and identify it.
Speaker 3 (49:45):
It's one less human you have to deal with, yes,
which is also, the fewer humans, like, it's part of
using apps instead of calling a restaurant or something.
Speaker 2 (49:53):
Yeah, I will fully admit, like, a mental health thing
with me, I loved that for years. And I found
it because I've got my various issues with anxiety, and
with not wanting to go outside and being scared of
talking to people. Long story short, I got over that.
But also there was a certain level of, like, yeah,
the Internet is really, it's just really good at selling
you those abstractions between people so you can live a
(50:15):
weird hermit life. And I absolutely didn't.
Speaker 3 (50:17):
And then you think society is against you, and these technologies.
Speaker 2 (50:19):
Exactly become more exactly less experiences you don't grow like.
Speaker 3 (50:23):
And this is why I hate the stupid
Google thing where it'll call a restaurant and make
a reservation.
Speaker 1 (50:28):
For I do like that for my accent.
Speaker 3 (50:29):
I hate it so much because it's like, what burden
are you putting on this restaurant? It's like, oh, I'm busy,
I'm dealing with real customers, real reservations, and now this fucking robot.
Speaker 2 (50:40):
I will argue, my accent, when I call and
I have to spell Zitron, or even just, like, my numbers,
so many people don't understand it. I've had to, like.
Speaker 1 (50:51):
Because you're like, Z I T I sound exactly.
Speaker 2 (50:58):
I have to Americanize my accent to a certain extent.
Speaker 3 (51:02):
That, yeah. But also, in that case, an app is a way to,
like, a thing that actually makes life better for a restaurant.
Hey, just log this reservation, please. Yeah, rather than, that's
a case where you don't need to call the manager
of the restaurant to do all this stuff, and
that's a case where an app-based thing could be better.
But now we're abstracting to the point where a robot
is calling a number and talking to this human. The
(51:25):
human does not know it's a robot talking to them,
then they're expecting conversation, and it's confusing for them. I hate that
whole thing because, as a.
Speaker 1 (51:32):
User. And this is something that's been like a recurring
theme in the conversations that I've been a part
of for this, where it's like they invented a solution
to a pretty minor problem and now are sort
of, like, I mean, not to.
Speaker 3 (51:44):
No, no, no, lived experience, whatever.
Speaker 2 (51:46):
Also mine is like minor in comparison to like.
Speaker 1 (51:49):
But yes. And so in this case, it's like, there's,
you know, something like that can't be easy, right?
I mean, there's a lot of moving parts. I mean,
I don't understand any of them, but it does feel
like the sort of thing that is, in
its way, you know, an impressive achievement. And yet
at the same time, and this is the real recurring part,
(52:10):
it's a very labor-intensive, presumably very expensive solution
to a problem that probably is more urgent for,
like, somebody who doesn't have anything else to
worry about but making restaurant reservations, yeah, you know. And
ditto for the idea of a robot that makes
your coffee for you, a robot that picks your outfit or whatever,
(52:31):
like some sort of ambient computer intelligence that picks your clothes,
makes your food.
Speaker 2 (52:34):
Because Cory Doctorow made this point last year on
the podcast. He said that algorithms are inherently conservative. They
move people towards the norm. That is actually culturally dangerous.
I think it will eradicate pockets of culture, eradicate
the culture that made America and made many countries the way
they are. And it's something that you're seeing on a,
well, not necessarily a small scale, with social media, with
(52:57):
what's popular, what's popular on TikTok, the flattening of
that when you add in the algorithmic side. Not that
I think that any of this bullshit works. I don't
think they'll ever have a thing that picks your outfit
for you. But if they ever do, there
is something kind of darker about it.
Speaker 1 (53:12):
Right, because, I mean, everybody's gonna wind up
dressing the same.
Speaker 2 (53:16):
Yeah, that's exactly it. And that's presumably based, I guess, on
the training data that it's given, which will probably lean
more white. Right.
Speaker 1 (53:23):
But this is also one of those things where
this is, again, a very labor-intensive solution. Yeah, some problem
they're really breaking the back of. Like, if you're
somebody that just sometimes has stripes and
plaids on, you can fix that. Someone could
just tell you that that doesn't work.
Speaker 3 (53:36):
You can learn that lesson once in your life and
apply it moving forward. Yes, yes. I'm still waiting for
them to, like... hey, there are homes. I think most
home tech is, like, kind of garbage, even... No, yes,
I also, there's this thing with the LG fridges with
the little windows, and I got an LG fridge with
the window because it looks cool. Yeah, you know, you
knock on it, you have, like, double doors, whatever, great
for kids' snacks. But I spend an hour every night
(53:59):
cleaning my kitchen and I hate it, like, every single night.
It's like, okay, give me, yes, Rosie the Robot.
Forget these companion robots. Yes, give me a robot to
do the things that really suck in our lives. And
I think that opens up, gives you more time to be with your
Speaker 1 (54:13):
Family and a few other things.
Speaker 3 (54:15):
That possibility is still there, but nobody, nobody's doing that,
because LG is just like, hey, I wanna, yeah, I
want to control your life, rather than do the simple
task that nobody likes.
Speaker 2 (54:23):
Also, there's something quite joyful about it. Like, it
took me until, like, the end of last year
to really enjoy dressing myself, and there's something fun about it.
To quote Derek Guy, clothing is like social language.
Forgive me if I misquote you, Derek. It's like
there's something about it, and it takes a while before
you work it out. You feel very self conscious. I
don't think an AI telling you what to wear fixes
(54:45):
that problem. It's just, hey, do this, we have
decided what looks good.
Speaker 3 (54:50):
There's always an in-between, right? You don't have to be
like that. Maybe we don't need a dictator, but hey,
you have these clothes, just try this, try this.
And actually, I think you'd look at it like having
a friend who's supportive, to be like, hey, try this.
I know you don't know how to wear it, but
maybe it would look good.
Speaker 1 (55:05):
That's that's an important distinction, I think because.
Speaker 2 (55:08):
To Mac Levine, by the way, he does that for
me on text. Very nice. Yeah.
Speaker 1 (55:12):
I was talking to Philip before, when you guys were
recording, about his experience with Pandora, which I
didn't use, but was always the one that, like, my friends used.
And he was sort of explaining what it was like
as it got worse. I'm talking about you, dude,
Philip, walking over there. Basically, of all the sort
of algorithmic musical recommendation
(55:35):
applications that have existed, that's the one that I think
anybody ever had good feelings about. Like, Spotify is easy
to use, but I don't think anybody... but is it good? No, right.
And I don't think anybody's like, I love Spotify.
Speaker 3 (55:45):
It's like, I think of it as, early on,
when only the Europeans had Spotify, they lorded it over us
that Americans couldn't stream music.
Speaker 1 (55:57):
And it was just, yeah, all right, so that makes sense,
like the convenience aspect of it. So what Pandora did
was, like, basically, like, you would... and maybe you can
even just grab the mic and talk about it, because he
actually understands how this stuff works. But I thought this
was interesting in terms of how it got worse, is
what I was saying.
Speaker 3 (56:12):
Us, all right.
Speaker 7 (56:14):
So the exciting thing about Pandora for me was the
seeding, where it was actually looking for a flavor of
what you were interested in, looking for the sound you're after,
not, oh, you like rockabilly, let me give you an
entire catalog of rockabilly. It would ask you, what
is your inspiration point? And the best inspiration point I
(56:37):
gave it was Tom Waits, Soundgarden's Black Hole Sun, and Pink Floyd's Echoes. Okay,
so, to deeply confuse the algorithm forever.
Speaker 2 (56:48):
For go ahead, doctor Jones, try to.
Speaker 3 (56:50):
Figure out what to give me for that. And it did.
Speaker 7 (56:54):
But my goal behind that was, and what Pandora did
a great job doing was, find things I'd never heard
of that I might like, that fit in that gap. Yeah,
I found so many bands I never knew.
Speaker 2 (57:06):
I have found, and something that will both age me
and piss some people off. My favorite moment: I remember
my Pandora experience. I was out at Penn State, and I
put in, like, a bunch of bands, and I
got a band called Cave In. They did a song called
Anchor, I think it was off of an album called Antenna.
Just fucking up every word in that sentence. And I
remember being like, this is so good.
Speaker 1 (57:27):
Now.
Speaker 2 (57:27):
The funny thing is that this is Cave In's one alt
rock album, surrounded by hardcore music that was just impossible
for me to listen to. And so I stopped using
Pandora, because every other Cave In album I listened
to was completely insane to me, and I assumed that
there were two Cave Ins for years. Anyway, the point is,
(57:47):
after this point, I have never had a recommended band
on any system ever, not Spotify, Apple Music, anything that
has ever made me feel anything.
Speaker 1 (57:55):
It's always been shit.
Speaker 2 (57:57):
Apple Music wants me to listen to Burden in My Hand
by Soundgarden, whatever song I'm listening to. It's like,
you remember Pretty Noose by Soundgarden? Here, have
Black Hole Sun by Soundgarden.
Speaker 1 (58:07):
This shit, it's just so frustrating that you should say
that was your experience, Like how it broke down.
Speaker 7 (58:15):
So where it broke down, and Pandora itself fell apart,
is the seeds started bleeding into each other, and it
basically learned me better and better and stopped recommending anything new.
I mean, after I had found the Dreadnoughts, which
was a really great band, which then led me to
(58:35):
a handful of other bands I'd never heard before, that
was pretty much the last thing Pandora found for me that
was new and exciting. Since then, the closest I've come
is SoundCloud. But SoundCloud does not work well for discovery,
for finding things for you.
Speaker 2 (58:54):
So as we wrap up the episode, Devendra, I know
you have to go in a minute. I would like
to know, is there anything that really, like, made you smile
at the show? Anything?
Speaker 3 (59:01):
Well, yeah, I'll shout out that I like a computer,
a simple, a simple computer, the ASUS Zenbook A14.
This thing, it weighs under two point two pounds,
and I can hold this thing in one hand.
Speaker 1 (59:12):
That it's a real computer.
Speaker 3 (59:13):
That's a real computer.
Speaker 1 (59:14):
It's less than a hardback book. There we go.
Speaker 3 (59:17):
Half a pound less than a MacBook Air. And I'm like, ASUS,
how did you do this? Because ASUS is normally the
company that's out there copying Apple, basically, and I think
they've gotten to the point where they've innovated. They do
stuff like dual screen computers; I don't think those are
as useful. This is just a simple, really, really nice
computer. It has an OLED screen; the MacBook Air doesn't have
OLED. It has ports, it has all the goddamn ports.
You want USB-A, USB-C, HDMI, in a thing
(59:40):
smaller than the MacBook Air. It's like Apple, what is
your excuse?
Speaker 2 (59:42):
What was the keyboard on it?
Speaker 3 (59:43):
It's not bad, it's not bad. ASUS makes okay keyboards.
The trackpad is pretty good. Subscription model? Uh, yeah, I
think it's gonna be, like, twelve hundred. It's ASUS,
they don't go too hard. But this is
a Snapdragon CPU, so it has that thing where...
Speaker 2 (59:55):
It's what's the limitation.
Speaker 3 (59:57):
It's going to be emulating some older Windows apps, but
last year, from what we saw on the Surfaces,
it's actually gotten better than it has been in years. I
think for most people it'd be fine. But man, a
two point two pound computer, just, like, a little laptop
that basically feels like a tablet and can
Speaker 2 (01:00:12):
do everything you would want. Did you write this up? I
wrote it up. It's up there right now. Actually, it's one
of our Best of CES Awards. Oh yeah, okay. Well, where can
people find you, Devendra?
Speaker 3 (01:00:21):
Yeah, I'm at Devendra on, you know, Blue Sky and
all the fun places, at the Filmcast at the filmcast
dot com, our podcast about movies and TV, and
Engadget and the Engadget Podcast.
Speaker 2 (01:00:30):
Check me out there, lovely. David.
Speaker 1 (01:00:32):
Defector dot Com is the website. The Distraction is the
Defector podcast. There's a Hallmark movie podcast. I'm mentioning it
every other time. So this is called It's Christmas Town.
Thank you. And uh yeah, I'm David j Roth on.
Speaker 3 (01:00:46):
This Machine Kills is my podcast.
Speaker 4 (01:00:49):
The Tech Bubble dot Substack dot com is my newsletter,
Big Black Jacobin on X, the Everything Site,
and Blue Sky is where I live online.
Speaker 2 (01:00:58):
You can find me on Google: type in "what happened
to Google Search" or "who is Prabhakar Raghavan" and I
should pop up. Now, after I stop speaking, you must
start purchasing. You are a consumer. As Ed said, follow
this by consuming what's next. Don't think, especially if it's
one of the ones that embarrasses me. And we're back,
(01:01:36):
and now we are joined by mister Phillip Broughton, the
health physicist that we know and love, who has been
giving us drinks all this time.
Speaker 1 (01:01:41):
Hey there, Phil.
Speaker 2 (01:01:43):
I think it's funny, though, that story about Pandora, and
this recommendation system that used to work but doesn't, yet
the company seems to still exist. It feels symbolic of everything.
It feels like everything's just kind of slowed down, and
even CES doesn't seem that willing to convince us anymore.
Based on everything you guys and David have seen,
(01:02:03):
it feels like there's a lot of just, this is
what you want, rather than, hey, we're fucking selling you something, right?
Speaker 1 (01:02:09):
Yeah, it's interesting, right? Does that sound about right to you?
Speaker 3 (01:02:12):
Yeah?
Speaker 2 (01:02:12):
You know I and if I'm wrong, please correct me.
Speaker 4 (01:02:14):
And I also think, you know, today, I tried
to shift gears and instead be like, okay, let's pretend
the things I'm coming into
Speaker 3 (01:02:24):
Are like for a real person.
Speaker 4 (01:02:26):
Right? And I kind of fell into, like, a drain
around the smart home stuff, being like, I'm not,
and most human beings are not, the target audience for
this, in a way that feels like a, like a
snake eating its own tail. I mean, like, you know,
I went to a lot of the, uh, there's a
section that's pretty much just, like, how to build your
own power grid, you know, for your own home, or
(01:02:47):
if you're going camping and you can't be offline.
Speaker 1 (01:02:49):
Oh yeah, the Anker stuff, yes. Anker, slow to me,
but yeah, go on.
Speaker 4 (01:02:56):
No, I mean, it's like, it is fascinating
that you're able to literally power the equivalent of, like,
a home out in the woods. But then it raises
the question, especially with their marketing, where they're like, this
is the sustainable way to be technologically progressive. And it's
like, what about the sustainable way of, like, maybe
(01:03:17):
we don't consume even more in some spaces? And so
stuff like that is really interesting to me, paying attention
to advertising, being like, oh, okay, like, they're for,
like, homes that are built like crypts and mausoleums, like,
just massive empty spaces filled with nothing other than consumer electronics.
And you know, yeah, I mean, you know, for a
(01:03:38):
lot of people at this conference, it is. And I
think that's been, like, important in constructing a better sense
of, like, who a lot of these things are for
beyond investors. Seeing, like, okay, there are things that consumers
who might not normally have something like this would be interested in.
But they're also, I think the other day we were talking
about status symbols, and it's like, what other way
to kind of also signal that flag than to be
(01:04:00):
able to go out in an RV in the middle
of the woods and still have the equivalent.
Speaker 3 (01:04:05):
Of a house.
Speaker 2 (01:04:06):
Yeah, and there's some of these people, just to
be clear, there are plenty of people who go camping
and do that, and it's like, I want to power
a grill, I want to, I want to power this.
But it's not like you need six thousand watts.
Speaker 7 (01:04:17):
Yeah, a reactor. That's what I'm surprised no one's trying to sell.
Speaker 1 (01:04:22):
That was the thought that I had looking into that
stuff too, because there's a part of me that's like, oh,
that's cool, you couldn't do that, you know, like, whatever,
five, ten years ago. And then to see, like, I
guess this is the sort of thing where it's like,
you find this solution to, I mean, it's not a
super pressing problem, but it's a, it's a solution. Like,
you went from a thing that didn't work to a
thing that does work. But then all you can do
is, like, make it bigger. Yeah, like, that's the only
(01:04:43):
solution is just sort of like yeah, now it's like
six thousand watts. Now you could like have like basically
like a Red Rocks performance.
Speaker 2 (01:04:51):
I have an Anker charger on me. Sure, because, like,
all of the dense chargers are really cool, because of the
gallium nitride stuff, which, Phil, I will have you explain
in a moment.
Speaker 7 (01:05:00):
I know, you want to know what that means. It's batteries;
it has a battery in it.
Speaker 2 (01:05:05):
You... I will explain. Gallium nitride is a thing where,
I don't know the science stuff, and I assume the
flipping science guy might.
Speaker 1 (01:05:11):
Now, a riff? It's really more of a Mortimer, if
you think about it.
Speaker 3 (01:05:14):
You should. You should riff tell us what you think.
Speaker 2 (01:05:17):
I can actually tell you what it does more than
what I'm doing with the subject of this podcast. No,
it basically allows them to make more powerful chargers
in a much smaller form, which has led to battery
chargers that can charge more powerfully and are that much smaller.
And it's really cool. But then you get to this
stuff where it's like, what if you had six hundred
thousand mAh and you could power a house,
(01:05:39):
and you could power your friend's house, you could live...
Who is the, what is the customer base there? Because
it feels like at a
Speaker 1 (01:05:46):
Certain point you don't you just don't need more. Yeah,
that's the base thing. To Edward's point, I thought was
really this was the thing that was very much on
my mind looking at a lot of the smart home
stuff was like, I think a lot of people. Let
me not to say that this is like, you know,
it's obviously it's elite stuff. This is like concept car shit.
I don't think that. I don't know how many LG
(01:06:06):
smart homes exist in the wild, but is it, like,
is it a thousand? Is it a hundred? You know,
like, but whatever. I think most people, and maybe I can't,
I don't actually know what most people's experience is, don't have
especially reliable wireless internet service. It's expensive, it's not very good.
You know, the amount of like connectivity that would be
(01:06:28):
required to make this, like, sort of all-seeing robot
that, you know, helps you with every aspect of your life,
like, that would have to be reliable. Electricity isn't reliable.
I mean, it's like the thing where, you know, obviously
to be, like, a bummer about this stuff: like, if southern
California is fucking on fire, like, that's where I see
a lot of these mansions in my mind, you know,
like, these sort of, like, technologically, like you were describing,
like, just basically, like, a vast, stylish, spare space with,
like, perfect connectivity and, yeah, technological things. Yeah, basically, right.
Speaker 3 (01:07:00):
But oh no, I was just gonna say, we live
in New York.
Speaker 1 (01:07:03):
You know.
Speaker 4 (01:07:03):
It's like, in New York, you know, we've had all
these public-private partnerships to expand connectivity, and have they
done jack shit? I mean, to the extent that, like, you
know, not until something else happens do you
realize how bad you have it, and how much at
the mercy of the firms you are to get any
sort of fix going on.
Speaker 3 (01:07:24):
Right.
Speaker 7 (01:07:25):
So, the thing that comes to mind immediately as you're
telling me about these ridiculous generators, battery packs, these generator
substitutes, is the experience of the most recent fires that
happened in Santa Cruz, now that Los Angeles is on the mind,
which left PG&E service in the Santa Cruz
Mountains deeply fractured and fragile, right? So they offered to
(01:07:50):
do Powerwalls for a whole bunch of people, because
they said, we can't promise we're going to give you power,
or that the power will be reliable, and we'll
be shutting you off regularly anytime the breeze
so much as blows. So what these battery packs
tell me is an admission of fragility. Yeah, we're, I mean,
(01:08:12):
that's kind of where we're approaching. Yeah, but they're not
selling it for that, though. It seems like the marketing
is still recreational.
Speaker 1 (01:08:19):
Yeah, I understand. I wonder, though, because it's like
a bummer to think about that.
Speaker 4 (01:08:24):
Yeah, this is CES, but they're also selling it to
an audience that is
Speaker 3 (01:08:27):
Thinking about it.
Speaker 2 (01:08:28):
Sure, but I'm saying, though we're one year away.
Speaker 7 (01:08:30):
From... prepper has also come to mind, yes, for off-grid living.
Speaker 2 (01:08:33):
But I think there's gonna... this is my one CES
prediction for twenty twenty six: preppers. I think there's going
to be more prepper sales. The idea of being able
to live off the grid, the idea of being able
to not rely on the power grid, which...
Speaker 4 (01:08:46):
Some of the marketing felt like a probe there.
It was, like, explicitly on-grid but also off-grid. Yeah,
and not just, like, RV living, but just if you
for some reason happened to have a home that was
off the grid and needed to be powered, like it was
a four bedroom apartment in the city, you know, then...
Speaker 1 (01:09:06):
But that's definitely... that is interesting too, because I could
definitely see that sort of, because I got that sense
of it, that there was this sort of, like, it's
a non-political version of the idea of being, like,
fully self reliant, you know? Like, you could do whatever
you want to do, you'll never be inconvenienced because of
this technology that you have. And yet, like, the next
step from that, like, the political version of that, is
(01:09:28):
like, you will never be inconvenienced by whatever agents of
the state, by the Feds, by the... if the
ATF is laying siege to your homestead or whatever, you
know. Like, yeah, we'll have...
Speaker 2 (01:09:39):
Like classic in voter activity.
Speaker 3 (01:09:41):
Yeah, loudspeakers will continue to warn them.
Speaker 1 (01:09:43):
Yes, get off your property, you are trespassing. I am
a sovereign citizen. My name is LG. Sovereign citizen.
Speaker 2 (01:09:51):
But also it's gonna branch off from that, I think,
in a year, and be like, disaster relief. It's going
to be like, hey, when shit starts breaking. And this
is both the political and, um, like, this is something:
the American power grid... the AI thing has been pushing,
generative AI and the data centers associated with it have
been pushing the grid to the brink, right? It was
already old as shit, and so we're in this weird
(01:10:13):
thing where, I'm just predicting this for next year, I
bet they start marketing on that, and it's so dark.
Like, that will definitely be the insane people sales
pitch as well.
Speaker 7 (01:10:21):
Like, it appears that every single one of the
modular nuclear power startups that have popped up
in the last decade... do you just fucking follow modular nuclear startups,
or care for that at all?
Speaker 3 (01:10:37):
Pocket sized reactors in.
Speaker 2 (01:10:39):
A seriousness, please spell out what those mean though.
Speaker 7 (01:10:42):
Okay. People are looking for sub-gigawatt reactors that they
can treat effectively as nuclear batteries. This is for... to
what end? So that you can go ahead and power
just a neighborhood with your private nuclear reactor, oh good,
one that belongs to you and the co-op, and you
(01:11:02):
have nuclear power as a service, so, NPaaS, where they
will regularly come in and swap fuel out for you
to keep your enclave perfectly powered. Not the Enclave from Fallout,
that's a totally different thing.
Speaker 2 (01:11:15):
And now I'm going to talk about something more evil: HOAs.
Speaker 7 (01:11:19):
That actually is the overriding thing: the people who would be paying
for it and administering it, somehow with a license
from the Nuclear Regulatory Commission. Unless, and this hasn't
happened yet, the Nuclear Regulatory Commission wanted to generally license
one of these pocket nuclear reactors. Except all of them
have pivoted off this idea to, we're going to make
(01:11:41):
the power that supplies crypto, or makes
your AI happen, because the grid is not stable enough.
Speaker 2 (01:11:48):
Are all these reactors real? Are there any of these
actually out in the world, or is this just marketing?
Speaker 3 (01:11:54):
They do exist.
Speaker 7 (01:11:56):
We have built them before at experimental levels at the
national labs and have never licensed them to be real.
Speaker 1 (01:12:04):
Can you imagine these dipshits?
Speaker 2 (01:12:05):
Would you actually buy one for an enclave? Actually
having a nuclear reactor, that's actually really destructive.
Speaker 1 (01:12:10):
Microsoft Nuclear is sending me. It's just an unbelievably
perverse concept.
Speaker 2 (01:12:19):
I'm not talking about that. I'm talking about... no, sorry,
it is a perverse concept, yes, because I'm talking about,
like, an HOA having one of these. And, like, judging
by every HOA I've ever seen, they would blow this
thing up immediately.
Speaker 7 (01:12:30):
This is one of the hopes of getting them
generally licensed, so you can't touch it. It goes in
the ground. Your nice service people from Microsoft Nuclear will
show up, and that, that is a subscription, I told
you. That's how you get them. Fine, they will
go ahead and do the swap for it. All
you can do as part of your neighborhood is just
(01:12:52):
hook into your local grid and that's.
Speaker 3 (01:12:54):
All you should do as a consumer. That is correct.
Speaker 1 (01:12:58):
Honestly, that's still, like... that, again: hate it, want to
be clear, getting it on the record, don't approve of that
as a business. That said, it's still better than giving
it to, like, the single most disagreeable and ambitious person
in your neighborhood. Like, just the idea of, like, whoever
your HOA president is. Like, I trust Microsoft
Nuclear over, like, Stacey.
Speaker 7 (01:13:18):
Oh, you just made my asshole clench. But I don't
like that at all, the idea of being a health physicist beholden
to the head of the HOA.
Speaker 1 (01:13:28):
Yeah, yeah, just getting some super officious email and that's why.
Speaker 3 (01:13:33):
It sounds like Fallout DLC.
Speaker 2 (01:13:35):
Yeah, this sounds like, yeah, this could be a really
banging movie.
Speaker 1 (01:13:40):
But those are, like... isn't that like Mission: Impossible III?
Like a suitcase nuke, like that?
Speaker 2 (01:13:45):
Thinking just like a like a country wide blackout or
something just to you like do a day.
Speaker 4 (01:13:54):
Uh uh, pitch the family that runs the Bond thing
because they don't want to touch it with Amazon.
Speaker 2 (01:14:00):
These James Bond idiots. You ever had a James Bond
who occasionally smokes weed? Yeah, I'm kind
Speaker 1 (01:14:08):
Of lazy because Bond. What if?
Speaker 2 (01:14:12):
"What if James Bond smoked weed?" is probably the first
real podcast idea on this show.
Speaker 1 (01:14:18):
Just like I need.
Speaker 3 (01:14:20):
Another good Indiegogo project.
Speaker 4 (01:14:22):
In act two of the movie he candy flips, and it's
to his detriment.
Speaker 3 (01:14:26):
They use it to kidnap his love, Moneypenny.
Speaker 1 (01:14:28):
This is a sativa.
Speaker 3 (01:14:30):
Anyway.
Speaker 2 (01:14:31):
Please let's talk about tech again.
Speaker 3 (01:14:33):
That is weed is.
Speaker 2 (01:14:35):
Not tech, but.
Speaker 1 (01:14:37):
Together that's innovative.
Speaker 2 (01:14:41):
Yeah, there is the weed innovation here. If you hear
about this, email me at ez at better offline dot com.
I want to hear about that experience today.
Speaker 1 (01:14:48):
I know you all talked about it yesterday, but when
I was getting off the bus from the convention center,
I discovered a whole other floor under the floor at
the Venetian, the Eureka Park zone or whatever, yesterday,
which is basically just, you know...
Speaker 7 (01:15:03):
Where you want tech tech.
Speaker 1 (01:15:05):
Yeah, but it's also apparently the one where
it's just like, if you can fit in the room,
they'll let you hang out. Yeah, I guess it's, like, whatever.
So I'm gonna go down there and check that shit
out tomorrow.
Speaker 7 (01:15:15):
I expect there will be a laser bong for you
to see.
Speaker 1 (01:15:17):
Yes, that's where the like the weed tech is being
you know.
Speaker 2 (01:15:21):
The funny thing is, with weed tech, not saying
I smoke it or not, but, like, they have not
really fixed grinders yet. Like, that is just a weird
industry where there's, like, one that only kind of works. It's
very strange. But I guess, and, like, some of them,
not saying from personal experience, when you put the cone on,
it's actually quite fiddly. And when you're using, not quoting
my specific experience or anything, because I'm being very clear
(01:15:44):
about who uses this: how the fuck do the stoners
use these, like, very, like, fiddly little tools? Is it
just, like, stoners enter, like, Hitman-level focus? Do
they go into bullet time when making one? Because I wouldn't,
if I used one, which I had.
Speaker 3 (01:15:59):
If you, if you did smoke weed, would you be
a roller? Would you pearl your joints?
Speaker 2 (01:16:05):
In this hypothetical scenario, I found one where you can
just put the cone in, and then the grinder is on
top of it. It's, like, fifty bucks. It changed my
theoretical life.
Speaker 3 (01:16:14):
Yeah, it would have.
Speaker 1 (01:16:15):
It would have in this fan fiction.
Speaker 2 (01:16:17):
In this fan fiction, we're talking in this simulation. I
think I've even said it in another episode.
Speaker 1 (01:16:21):
I don't know what I'm doing.
Speaker 3 (01:16:23):
Two.
Speaker 1 (01:16:24):
Yeah, this is Earth to Ed, But is it?
Speaker 2 (01:16:30):
You would think that there would be more of that, though.
There would be more... like, maybe you'll see it when
you go and look. Maybe. I would love to hear
the weed stuff from someone else. But also, that's also
one that feels solved as well, like, at some point.
Speaker 1 (01:16:44):
To a certain extent, though. Again, it's like, that's the type
of shit that I liked, that I saw here, especially
in the Venetian yesterday: the attempts where it's just,
like, it's inventor stuff. It's, like, taking a practical problem
and then, like, working out some way to, you know,
fix it or make it... You know, it's not always
affordable or whatever. But if the practical problem in question
(01:17:04):
is, like, no soft serve in my house, yeah, like,
a machine that fixed it. You know, like, it's twenty five
hundred dollars after the discount here, but it's still, like,
I had the soft serve. It was pretty good soft serve.
Like, these things can be fixed.
Speaker 4 (01:17:18):
I went to a coffee station and they were like, oh,
it's AI powered.
Speaker 3 (01:17:22):
And so I went.
Speaker 4 (01:17:23):
So I got the coffee and I'm like, how's it
AI powered? They're like, oh, well, this is a generator.
I was like what, Yeah, what do you mean this
is a generator? And so then I actually gave the
coffee station a real look, and I was like, oh, okay,
you have the coffee thing here that's very small, and
then right next to it is a massive generator you've
stacked on here that kind of looks like it's the
coffee machine, but it isn't.
Speaker 3 (01:17:44):
What is it? It's called EcoFlow.
Speaker 2 (01:17:47):
You know.
Speaker 4 (01:17:48):
It's one of the ones I was talking to you about,
which is, like, for your home: thousands and thousands and
thousands of watts. But they were, they were just using
it as an example, to be like, oh, you like
the coffee? Well, the coffee is plugged into a system
that would be able to figure out whether it should
draw from the grid.
Speaker 1 (01:18:03):
Hey, or the solar collector?
Speaker 7 (01:18:05):
Is it waste heat from AI computation to make coffee?
Speaker 4 (01:18:10):
No, the AI has nothing to do with the coffee, but
they say it has to do with the coffee. All
it has to do is, it's supposed to plug into
your home to figure out energy efficiency, and if you
got one of their coffee things, you'd happen to be
able to take advantage of that, so that you wouldn't
be using
Speaker 3 (01:18:25):
more expensive power, on another subscription service.
Speaker 1 (01:18:28):
Yeah, good. Yeah, once again, one of those things that,
like, everything that you describe there sounds okay enough to me.
I just don't know why you have to say AI.
Speaker 3 (01:18:35):
Yeah, it doesn't make sense to me either.
Speaker 1 (01:18:37):
Yeah, it's just confusing.
Speaker 4 (01:18:38):
It's actually funny, because... he had, I asked
him a bunch about the AI, I couldn't really find it.
Then we spent, like, ten, fifteen minutes talking about my
locks, because he was like, oh my god, what
are these, these dreadlocks? How do you grow them? I
heard that they're so... The person just had so
many questions.
Speaker 3 (01:18:59):
It was fun.
Speaker 2 (01:18:59):
Was it a white guy?
Speaker 3 (01:19:00):
No, he's from China. Yeah, okay, that's so good.
Speaker 1 (01:19:04):
It was not a great question, but, you know, I was...
Speaker 4 (01:19:07):
I was being empathetic because I'm like, these are probably
great first ones.
Speaker 3 (01:19:11):
I've seen six other black people on the floor over
the four days. You know, have you learned each other's names already?
We nod to each other every time we see each other.
Speaker 1 (01:19:20):
How was the coffee?
Speaker 2 (01:19:23):
It was?
Speaker 3 (01:19:24):
It was regular, regular black coffee.
Speaker 2 (01:19:26):
Okay, so it's so funny.
Speaker 3 (01:19:28):
We've had better. Yeah, yeah, coffee, I've made better. And
I don't drink coffee like that. By the way, I'm
still mad just sitting here.
Speaker 7 (01:19:35):
I thought of a better use case for AI, to
actually heat water to make coffee, than their actual problems.
Speaker 2 (01:19:40):
We aren't in the business of fixing problems.
Speaker 3 (01:19:43):
Shit, that's what I did.
Speaker 2 (01:19:44):
Yeah, that's not why we come here. Be a consumer.
Speaker 7 (01:19:47):
You must consume. Now, if I can help, may I
offer you something you can go troll people with, if
you happen to find the laser bong?
Speaker 1 (01:19:52):
I don't like to do that kind of thing.
Speaker 7 (01:19:56):
This family... so you can go ahead and ask them:
so when you're using the laser on the weed,
how's it burning?
Speaker 3 (01:20:03):
Does it burn it enough? How? Because uh, that's a problem.
Speaker 1 (01:20:08):
I feel like there would be people. That's one of
those spaces where sometimes you'll go someplace and the person
that's selling it is just like they're doing their best.
Speaker 2 (01:20:15):
Yeah, not that they're not like in love with it.
Speaker 1 (01:20:18):
I'm just fucking with them.
Speaker 3 (01:20:20):
It is kind of amazing.
Speaker 4 (01:20:21):
Though I've seen less weed tech than crypto, Like I've
seen crypto three or four times now, but we zero.
Speaker 1 (01:20:27):
Yeah, because I feel like if you had to pick
one of those things to still exist in five years, yeah,
I mean they probably both going.
Speaker 7 (01:20:34):
To exist. From previous ones, yes, there was weed
Speaker 8 (01:20:36):
Tech. Yeah, maybe it's dying, or maybe they forgot. Crypto, so one was this healthcare management device on the blockchain, and the other was an AI trader that, it was not financial advice, but it would make decisions
Speaker 4 (01:20:51):
For you. Every time there's any financial thing, it's doing "not financial advice."
Speaker 2 (01:20:56):
But I love the AI trader one because, back to a conversation with my mate Casey, it's like, if the AI trader was so good, why would you fucking sell it?
Speaker 1 (01:21:06):
Yeah, I wouldn't, it'd be so good.
Speaker 2 (01:21:07):
You just turn this shit on and off, you're on your yacht. Yeah, got me that. Fuck.
Speaker 1 (01:21:12):
It also feels like again one of those things where
it's like, for all the promise of AI ostensibly,
you know, all the promise of it, like everybody that
uses it knows it doesn't work very well. And so
the idea of being like, well, I'm not gonna like
as part of my participation in the work of building
an AI that works, I'm going to let it control
(01:21:32):
my money while I do this. It's like all the others.
This goes back to like some of the stuff about
the smart house. It's like, all of these things that give you extra time to do what? Like, none of these... more trading. Yeah, and it basically eliminates all the time in your house that you'd be spending like preparing food or like showering, which is
like basically the part of my life that I like
because the part like when I'm not working, you know,
(01:21:55):
like I like.
Speaker 3 (01:21:55):
When you bathe too.
Speaker 4 (01:21:57):
Yeah, I appreciate that, but that is a good point, right,
Like all of these a lot of AI is structured
to be anti bureaucratic, cutting through the supposed layers of
filth that prevent you.
Speaker 3 (01:22:09):
From living your life.
Speaker 4 (01:22:10):
Yep, right. But then when you get down to it, what's left is more space for them to commodify.
Speaker 2 (01:22:15):
Well, now you don't have to write or create anything.
Speaker 7 (01:22:18):
Yeah, it's all part of that filth.
Speaker 1 (01:22:22):
But that's like, there's this observation, I don't remember who said it, that was basically like the idea that, maybe because of the people that invented it, or just maybe because this is what it is most practically applied to, it's like it's doing all of the stuff that is your life to give you more time to work.
Speaker 2 (01:22:39):
When I think.
Speaker 1 (01:22:39):
Realistically, most people, if you asked them, they'd be like, well, can it do my work for me? Like, I want to learn how to paint or whatever, you know, and now you don't have to paint.
Speaker 2 (01:22:48):
Yeah, you can just ask it to do it.
Speaker 1 (01:22:51):
It'll draw you whatever. I want to do that, you can't.
Speaker 3 (01:22:54):
You can't imitate the cause, so why would you want to?
Like you, but it can make it the best stick butts.
Speaker 2 (01:22:58):
Yeah, I like when people tell me it will write my blogs for me. I don't know what I'm writing when I sit down to; that's the muse. I don't know. I let the power of Christ consume me, and I write five and a half thousand words in two hours, and I research as I go. I send it to my editor, all caps:
Speaker 1 (01:23:15):
Is this good?
Speaker 2 (01:23:16):
And Matt Hughes then doesn't respond immediately because it's two
in the morning there, and then I send him three
more messages saying I changed a bit, I changed a bit,
it's not I don't love it. And he has to
explain to me why it's good.
Speaker 1 (01:23:26):
That is the creative process. What if an algorithm did that for you and you sat perfectly still for those three hours having no thoughts of any kind? Would
Speaker 2 (01:23:33):
be nice, having ADHD and tinnitus? That sounds great. Yeah. And that's the thing. None of these things seem to take away suffering.
Speaker 1 (01:23:43):
Well no, like, it doesn't seem like they're reducing... like they
Speaker 4 (01:23:47):
Get a little bit of what it means to be
a person, but not the whole.
Speaker 1 (01:23:50):
Yeah, so I've been saying this again: it only suggests that to the people involved, these things that are like minor inconveniences for most people, like, you know, the idea of having to do the dishes, like that's the worst thing in the life of the person trying to sell you this.
Speaker 2 (01:24:04):
No, no, no, those people don't do the dishes.
Speaker 1 (01:24:06):
What they've done is they've gone, shit, what do people do? Yeah?
Speaker 7 (01:24:09):
Fuck?
Speaker 3 (01:24:09):
What what are regular people?
Speaker 2 (01:24:11):
Food? They eat?
Speaker 3 (01:24:13):
They love that stuff.
Speaker 1 (01:24:14):
They can't they can't dress themselves.
Speaker 2 (01:24:17):
They're idiots. They can't date. And what do they do? Dishes? My cleaner does those.
Speaker 3 (01:24:23):
They charge their devices? Yeah, they need to.
Speaker 1 (01:24:25):
Love their device most definitely.
Speaker 4 (01:24:27):
They don't have time to spend with their stupid little kids,
their wife.
Speaker 2 (01:24:31):
And their stupid child, the moron child. But oh they
haven't even got one of those because they're inferior.
Speaker 3 (01:24:37):
Don't worry.
Speaker 2 (01:24:37):
LG will get you there and you will now meet a woman thanks to the Lucky Goldstar Corporation. That's what it stands for. And I keep going back to this point, it's like, what are the problems you solve? LG stands for Lucky Goldstar? Yeah, yes it does.
Speaker 1 (01:24:50):
Wow, cool, I learned something. Yeah, they have it on the side, "Life's Good." And I was like, again, that's one of those things where, like, I don't think that's the real name of that. Lucky Goldstar makes more sense because that has a certain ring to it, where it just feels like they used to be like a ready-mixed concrete company or
(01:25:10):
like an international shipping line, you know, and then at some point they've like discovered... I actually think they might have made concrete. You know, Color started, the LG Concrete Inc. Super good. Well, that's where it begins. You start with that and then.
You start with that and then.
Speaker 2 (01:25:26):
And steal where we learned. We learned a lot today. No,
it's it's just frustrating because I'm not asking. I just
want to be clear for the listeners. I'm not asking,
did you see anything good? To be facetious, I genuinely like,
I was so happy that Theveingra had founded laptop.
Speaker 1 (01:25:43):
He liked it. It sounded cool.
Speaker 3 (01:25:44):
I'm so excited about that.
Speaker 2 (01:25:45):
And the skin products that Victoria is talking about, the fact that you can actually make these. And that's what tech should do. It should be like, hey, here's an actual friction point: you spend hundreds of dollars on skincare. Now you don't have to do that, because you can spend, theoretically, hundreds on the stuff that actually works, so you don't have to spend more in the future. And then it's like, okay, that's one company who knows one thing,
(01:26:06):
but the overarching thing is the CEOs saying, fuck, what
Speaker 1 (01:26:10):
Do people do jobs?
Speaker 2 (01:26:11):
Yeah? What do people do during the job that they do? They wash dishes and they send emails and they read them.
Speaker 1 (01:26:18):
Yeah, that little bit of disconnection, to me, seems like it's at the root of so much of what is not working about a lot of this. Because the ones... it seems to me, and maybe this is just my own bias for like small over large or whatever, you know, I'm gonna own my preferences here.
(01:26:40):
The stuff that is clearly designed to fix a specific thing or improve a specific thing is better to me, and much easier for me to sort of understand, as, you know, just an idiot standing in front of a demo or whatever.
Speaker 2 (01:26:57):
I need you just to stop saying that.
Speaker 1 (01:26:58):
Well, it's a regular term that I like to use to describe myself, but, like, so, but anyway,
but I get it in that way, whereas the idea
of like a thing that fixes or improves everything or
that is like that like sort of global idea of it.
I understand why these companies which are you know, their
job is to grow into like sort.
Speaker 2 (01:27:20):
Of a bigger thing that will then grow right.
Speaker 1 (01:27:22):
And yet like not only does it become like sort
of hard to see like what the actual vision is,
it gets not just like more abstracted, but it like
it gets weirder. This is the thing that we keep
sort of coming back to that like you're not fixing
the problem like by like devising a house that does
all my decisions for me. It does not in any
(01:27:44):
way diminish the fact that I'm still paying too much
money for skincare products.
Speaker 2 (01:27:47):
Right, you know.
Speaker 1 (01:27:47):
And I guess like this is something that uh, your
Mia was saying last night about the idea of like
all of these sort of like individual solutions to broader
structural problems, and that like the idea of just like
continuing to throw the idea of being like, well, this
is like a better way of getting around, and it's
like it might even be a better way of getting around.
(01:28:07):
And yet like when the systems themselves are sort of
not working in your favor. It doesn't matter that much
how like down to the last decimal point efficient your
experience of it is because you're still gonna be stuck
on the road with everybody else.
Speaker 2 (01:28:24):
It kind of reminds me of what I was saying
about fintech and removing friction. Oh, you'll be able to
trade better and do this. The real structural problem there
is it's very, very hard to accumulate wealth as a
regular person. The actual that is more of a symptom
of a problem than it is.
Speaker 1 (01:28:39):
And also, as I understand it, all of these things, like, there's one that was either Andreessen or Thiel or one of the fucking super friends, but a thing that was basically like not FDIC backed, like that was the whole thing. They were like, finally, you can bank without big government being involved. And then it's like, and it collapsed right away.
Speaker 3 (01:28:58):
Goodness, there's no insurance to Yeah, but.
Speaker 1 (01:29:00):
That's like one of those things where understanding the idea that like this is a part of a bigger system is like, it's just not something that computes for them, and so what you wind up with instead is like the worst invention of all time. Yes, like a bank that ruins you.
Speaker 4 (01:29:15):
Yeah, something that might have been invented by like a rich dilettante in the eighteen thirties.
Speaker 1 (01:29:21):
Yes, like at a time when there wasn't really a system,
and so they're sort of yeah, like just the rude golder.
Speaker 2 (01:29:26):
I guess there's so many of the crypto things that do feel like an insane thing that a guy would do in a cult. Yeah. Yeah, it's a coin about this woman who talked about what she used to do with a willy, mm hmm. And you should invest in the Hawk Tuah coin, who has a podcast for some reason. This is your god now.
Speaker 4 (01:29:41):
I have a few friends who have made a lot of money in crypto, and the way that they approach it is like, they're like, I am deeply cynical about it, but everyone here is an idiot. And if you make enough money to begin with, then you get plugged into the networks of people who are actively manipulating people.
Speaker 2 (01:29:58):
So what you mean is that they believe, it's every grift ever, which is: I'm being scammed, but what they don't know is I'm scamming them better.
Speaker 1 (01:30:09):
Well, yes, that's the bit that I always am curious
about with like because this is I think it's been
like demonstrated that like absent any other of the like
sort of many factors that could make somebody become like
that that like made people vote for Trump, or that
like push people to the right, that like crypto is
the single most powerful indicator there that it is like
(01:30:30):
the thing that like owning crypto is associated with voting
like often to the very hard to the right in
a way that nothing else is. And I think that
there's an aspect of that where like the idea of
being like, all right, it's a scam, I'm scamming somebody.
I'm probably being scammed, but I know enough to get
out that seems to be true. Not just that the
people like you were saying, like your buddies who are
(01:30:51):
like plugged into the people that are actually, you know, like in the whale community and like know when it's time to bail on Hawk Tuah coin or whatever.
Speaker 3 (01:31:01):
Yeah, if it's fast, you know whatever the fuck right, Yeah.
Speaker 1 (01:31:04):
It's like all of these things that basically are like
the most obvious like do not buy this. This is
a gag sort of thing. And yet like I think
everybody that is involved with that at every level, right
down to like just some like manosphere shut in nineteen
year old on a gaming computer right doing all the stuff.
All of those guys somehow still believe that. They're like, yeah,
(01:31:27):
I know it's bullshit, but like, I'll know when it's
time to get out.
Speaker 2 (01:31:29):
But that's the ultimate con, which is to pretend to give people agency, to give people hope, to give people a feeling of more control while controlling them, and probably being controlled by someone else. It's, yeah, deeply.
Speaker 4 (01:31:43):
So I've tried to talk with them about this because
I'm like, I really do think for y'all, it comes
down to luck. You know, it's like like you said,
it's literally everyone believes that, and everyone believes.
Speaker 3 (01:31:53):
That in so many industries, right.
Speaker 4 (01:31:56):
The difference between you and some other bloke is like
maybe you heard about it an hour earlier, or you
just like there's like something that you got for no
other reason other than.
Speaker 7 (01:32:06):
It's important to note that we're saying this inside of
a casino.
Speaker 2 (01:32:10):
Yeah yeah, right now, And honest, every.
Speaker 7 (01:32:13):
Everyone thinks they have a system to play, and the true beauty of sitting in a casino is to sit there, cross your arms, watch, and try to figure out what system they think they're using, and watch it fail.
Speaker 3 (01:32:29):
So we gambling tonight, right, boys?
Speaker 2 (01:32:31):
Yeah, not for me.
Speaker 1 (01:32:32):
I'm trying to lose five hundred dollars in twenty minutes.
Speaker 2 (01:32:34):
So we're wrapping this episode up, but I will say
the Friday episode. The show floor closes today, by the way, guys,
So there's nothing tomorrow, No just podcasting. Oh owly zool.
Speaker 1 (01:32:46):
So we're gonna wrap this one up, but tomorrow we can talk about, get into, the Eureka Vault. You're going to the Eureka Vault immediately following this. Will, that's the one you'll save art.
Speaker 2 (01:32:59):
I am sending you to the rat nest.
Speaker 3 (01:33:01):
Oh my god.
Speaker 2 (01:33:02):
We'll talk gambling tomorrow though, of course, how dare you
how dare you ever think to compare the dishonest crooks
of cryptocurrency.
Speaker 7 (01:33:10):
I'm sorry, Las Vegas, very sorry. Sorry to the Nevada Gaming Commission.
Speaker 3 (01:33:17):
I'm very sorry for anything.
Speaker 2 (01:33:19):
I have swindlers in the sports book, but you know
that we love our odds there. But you know the
truth is our beautiful slot machines and our honest tables
that have the odds on the table. How dare you compare it to cryptocurrency? But we have to wrap up. I apologize,
No six waists.
Speaker 1 (01:33:38):
You make big money.
Speaker 3 (01:33:40):
Table.
Speaker 1 (01:33:43):
Phil. Where can people find you? Sorry?
Speaker 3 (01:33:47):
Phil Broughton?
Speaker 7 (01:33:48):
You can find me on Blue Sky at Funranium, and you can find me at my blog, Funranium Labs dot.
Speaker 1 (01:33:54):
Com. David? Defector dot com, uh, is the website, Distraction is the podcast, and It's Christmastown is the Hallmark podcast. You messed up.
Speaker 2 (01:34:03):
You said you can do it every other time, so
you have to do it every time.
Speaker 4 (01:34:06):
Corblamo. And Ed? Newsletter, The Tech Bubble dot substack dot com, uh, podcast This Machine Kills, and, uh, everything else, site and Blue Sky, Big Black Jacobin.
Speaker 2 (01:34:19):
You can find me on the new social network Hawk Tuah Social and everywhere else. Yeah. And if you're gonna complain after this, you can say, Ed, it's the thing that you were meant to rerecord, and Mattosowski told you a month ago, and then actually two months ago, and Ian Johnson also told you, and you need to be sorry for those people. Anyway, I'm gonna re-record the
bit at the end. You've got one more episode today
and then tomorrow another two episodes, and then Saturday there's
(01:34:43):
just one. These are gonna be the real magic ones.
These are gonna be where people are really deteriorating mentally.
Speaker 7 (01:34:49):
Is this where I get to be mad about regulators?
Speaker 1 (01:34:52):
I'm gonna make you mad somehow, I believe in it.
Speaker 2 (01:34:54):
I'm gonna be I'm just going to be googling annoying things,
not even about the show.
Speaker 3 (01:34:59):
You don't need to google, you just do it.
Speaker 2 (01:35:00):
Yeah, it's a natural thing. I'm generative. Anyway, Thanks for
listening to this episode. More to come from the Consumer
Electronics Show. Thank you for listening to Better Offline. The
editor and composer of the Better Offline theme song is Mattosowski.
You can check out more of his music and audio
(01:35:22):
projects at Mattosowski dot com, M A T T O S O W S K I dot com. You can email me
at easy at Better Offline dot com, or visit Better
Offline dot com to find more podcast links and of course,
my newsletter. I also really recommend you go to chat
dot wheresyoured dot at to visit the Discord, and go to r slash betteroffline to check out our subreddit.
(01:35:45):
Thank you so much for listening.
Speaker 1 (01:35:47):
Better Offline is a production of cool Zone Media. For
more from cool Zone Media, visit our website cool
Speaker 2 (01:35:52):
Zonemedia dot com, or check us out on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.