
March 19, 2025 86 mins

Welcome to Radio Better Offline, a tech talk radio show recorded out of iHeartRadio's studio in New York City.

In this episode, Ed Zitron is joined by tech reporters Alex Cranz, Cherlynn Low and Victoria Song to talk about the broken promises of generative AI, and how useless things get funded far more than useful ones.

Alex Cranz: 
https://bsky.app/profile/cranz.bsky.social

Cherlynn Low:
https://www.engadget.com/about/editors/cherlynn-low/
https://x.com/cherlynnlow
https://bsky.app/profile/cherlynn.bsky.social

Victoria Song:
https://bsky.app/profile/vicmsong.bsky.social 
https://www.theverge.com/authors/victoria-song 

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Also media, Hello and welcome to Better Offline. We're in
beautiful New York City, Nevada. We're on Fifty-Fifth Street,
and I am Ed Zitron, of course, and I'm surrounded
by incredible people.

Speaker 2 (00:26):
So am I right?

Speaker 1 (00:26):
It's Cherlynn Low of Engadget. How you doing, Cherlynn? And
we've got Alex Cranz, reporter and critic extraordinaire.

Speaker 3 (00:33):
Now, yeah, I just like to be really happy about Nevada.

Speaker 2 (00:36):
It's beautiful here in New York City, Nevada.

Speaker 3 (00:38):
Yeah, sunny, it's warm.

Speaker 2 (00:40):
Yeah, I love it. And of course Victoria Song from
The Verge.

Speaker 4 (00:43):
Hello, Hello sunny New York, New York and Nevada.

Speaker 1 (00:47):
Yeah, that's the thing. People think I do this accidentally.
No one knows what I do deliberately, and that's because
neither do I. So we were all just talking on
the way in here about possibly the best tech journalism
we've ever read.

Speaker 2 (00:57):
And it's a story about a guy called Kevin Roose.

Speaker 1 (00:59):
He appears to be a child who was allowed
to write for The New York Times. The article's called Powerful

Speaker 2 (01:04):
AI is coming and I'm going to read you a
few sentences. We're going to discuss this.

Speaker 5 (01:09):
Here are some things I believe about artificial intelligence. I believe
that over the past several years, AI systems have started
surpassing humans in a number of domains: math, coding.

Speaker 2 (01:16):
Wait, math? Math? Are you fucking... anyway.

Speaker 1 (01:22):
There is an article that was in The New York
Times, the AGI thing that Kevin Roose wrote, a child in
a man's body like Big, except Tom Hanks, he can
act and actually complete his job. And Kevin Roose appears
to have a gas leak. He has said that possibly
very soon, probably in twenty twenty six or twenty twenty seven,
but possibly as soon as this year, one or more
AI companies will claim they've created an artificial general intelligence

(01:43):
or AGI, which is usually defined as something like a
general purpose AI system that can do almost all cognitive
tasks a human can do. If you believe this, you're
a moron. I'm sorry, what do you think? Like, I
don't know, I just don't.

Speaker 3 (01:56):
I think he's like gassing him up because he wants
to flirt with more AI and so like he needs an.

Speaker 6 (02:02):
AGI... hang on, hang on, hang on. Flirt with AI,
like, like, like your girlfriend?

Speaker 4 (02:07):
The man who got AI

Speaker 2 (02:08):
to fall in love with him, yeah. Bing was like,
leave your wife.

Speaker 7 (02:13):
Yeah, and so like like just so funny. This guy's
so funny. Yeah, like like this guy.

Speaker 3 (02:24):
You know, I've been using a lot of AI. I've
been talking with these ladies about it a whole bunch,
and the thing of it is, I love using AI because
it'll be like, you're the best writer.

Speaker 4 (02:32):
We were just talking about this. Guess who told
her to do this. Guess who was just validating
her decision to do this.

Speaker 3 (02:38):
But you know, you use it. You love it because
it tells you you're beautiful and pretty and smart,
and then none of it is true.

Speaker 8 (02:46):
It told me I could win a Pulitzer Prize
within a one to three year period, and

Speaker 4 (02:52):
Why not, geograph, they will win the Pulitzers.

Speaker 8 (02:56):
It will win the Pulitzers. But it was just like, yeah,
you could win a Pulitzer. And I was like, based
on what criteria are you telling me this?

Speaker 4 (03:04):
And uh, you know, are you using Claude as well?

Speaker 8 (03:07):
Yeah?

Speaker 4 (03:08):
Claude and ChatGPT.

Speaker 3 (03:10):
It's like horoscopes. When you get a cool horoscope, you're like, yes,
this is awesome, you understand I'm a great Gemini. Yes.
But then you're like, no, but it's fucking not real,
because it's a horoscope.

Speaker 4 (03:19):
You are out.

Speaker 2 (03:21):
There being like, I don't know any horoscope stuff, but
this mean.

Speaker 3 (03:25):
is out there like, you know what, I really believe
that I'm a Leo and that this one understands me.

Speaker 1 (03:31):
Well, let me read you one passage as well.
I believe that hardened... sorry.

Speaker 5 (03:35):
I believe that hardened AI skeptics, who insist that the
progress is all smoke and mirrors, and who dismiss AGI
as a delusional fantasy, not only are wrong on the merits,
but are giving people a false sense of security.

Speaker 1 (03:45):
And I just want to say, that is one of
the dumbest fucking things I've heard in my life. The
people who criticize this are not... they're giving people
a false sense of security, as opposed to the guy
being like, the computer will wake up in two years?
Because he's

Speaker 3 (03:58):
saying... he's saying the danger of AI is AGI, when
in fact, the danger of AI is the destruction of jobs.

Speaker 1 (04:04):
Yes, and even then it's not even doing that.
It's taking away jobs from people who are already
having trouble getting them.

Speaker 3 (04:11):
He's like, oh, Skynet's gonna do it. No, Skynet's
not gonna do it. Elon Musk deploying whatever little
ChatGPT bot he does is gonna do it.

Speaker 4 (04:18):
But AI is not infallible.

Speaker 8 (04:22):
It makes up a lot of stuff, gets things wrong, and it
just doesn't understand how to be human in a
lot of ways.

Speaker 6 (04:31):
From that one sentence, though, I can tell who
his audience is. He's talking about a false sense of
security for whom? For the people that are developing the AI, right?
Because none of us out there who have
jobs that might be taken over by AI have
any sense of security around the idea of an AGI
that will be competent enough to take over jobs. So
his audience: he's not talking to us. He's not talking

(04:51):
to people who have jobs that might be replaced. He's
talking to people developing the AI, the ones who want
to make money from it.

Speaker 4 (04:56):
Right.

Speaker 1 (04:56):
I also don't know what the false sense of security
would be. Oh, don't worry, this isn't going to fuck
them up? As opposed to the fact that you've got
one company, SoftBank, borrowing money to put money into
OpenAI, which burns billions of dollars to make a
pretty shit product. It's not that you
can't find some cool things with it, fine, but for the
most part it's the same thing this week.

Speaker 3 (05:17):
If you're using any AI as a search engine, you
should stop using it. Yeah, do not. I know, Sam Altman,
you're listening to this right now, I wish, and
you are just so ecstatic about ChatGPT. You love it
so much. It is a terrible search engine.

Speaker 2 (05:32):
It really is.

Speaker 3 (05:32):
What is it, fifty? I think at this point,
like, there was a recent study: fifty-one percent of
the responses are false. Sick. Something like that, like, this stuff

Speaker 1 (05:41):
is bad. Like, actually bad. That's what drives me
insane. You get an article in The New York
Times being like, the computer is so smart, the computer
is so strong.

Speaker 2 (05:49):
I love the computer.

Speaker 3 (05:50):
Because it told you, because it gasses us up.
It's been at it all this time, being like, you're smart
and strong, so I must be strong.

Speaker 8 (05:58):
If you talk to it too long, though, it just
starts negging you, and then you're like, oh, I'm not smart,
I'm self-loathing.

Speaker 2 (06:04):
It's just... and it's like, you're
so good. I'm like, I refuse to believe... between you and
my therapist, this has never worked.

Speaker 1 (06:14):
Nice try, you can't swindle me, computer.

Speaker 2 (06:17):
It's just so strange as well, because ostensibly you.

Speaker 1 (06:20):
Three are wonderful reporters, like I've ready for years, and
it's like you're not cynics, but you are willing to
be critical.

Speaker 6 (06:27):
Well, I am hardened as they come. When it comes
to the hardened, people like the.

Speaker 1 (06:30):
shit. Still, like, you're excited for it. This is not
even excited for it. What's weird about this piece, other
than the fact it's just completely wrong in almost
everything it says and clearly supporting billion dollar companies, is it
doesn't even seem that excited. It's not like he's like,
I can't wait until this happens, because imagine the cool
shit that could happen, and, like, I don't know, come
up with an idea. Perhaps that's like, oh, if I

(06:52):
had an autonomous intelligence in my phone, it could plan
my day for me. How delightful, despite there being seventeen
different companies that claim they do this already on Instagram,
and it does not exist. But he could come up with an idea
or something cool. But it's like, actually, AGI is coming.
What is it?

Speaker 2 (07:06):
I don't know. It's inevitable though.

Speaker 1 (07:09):
The thing I can't describe is inevitable, and I don't
know what it will do, but it will be good
or bad.

Speaker 3 (07:15):
He's refusing to define AGI through the entire piece,
and he's like, and it's going to keep being redefined.
And I'm like, if you cannot define the word... the word, sorry,
so we take that word back. Kevin, you're not
allowed to use AGI until you can actually
define it.

Speaker 1 (07:30):
Yeah, I don't think you should be allowed to use
beat for a minute, go outside the theme.

Speaker 3 (07:36):
So use the touch grass app and step away touch
dot Grass.

Speaker 1 (07:40):
So, Victoria, you had a delightful review this week of a
device called the Bee.

Speaker 2 (07:44):
Please tell us about the Bee.

Speaker 4 (07:46):
Okay, so this review which is B by.

Speaker 3 (07:48):
V by V.

Speaker 4 (07:51):
We do not like those two letters together without the
buie though.

Speaker 8 (07:54):
Okay, anyway, yes, I killed Ed. Yeah, you did kill him. Okay,
so you just, like, toppled him for a second. But
so, you know, Bee is kind of the latest in
the AI gadgets that claim to be your memory.

Speaker 4 (08:13):
You know, you wear it.

Speaker 8 (08:14):
It records everything you say, and when I say it
records everything you say, it records everything you say. Okay,
so it records everything you say, listens to all of
your conversations. It doesn't record the audio, but it
then processes everything into a transcript, so you have
a transcript of your life. And then from that transcript,
you can use the chatbot to search the history of

(08:34):
your life, or you can get these daily summaries that
just be like, oh, this is what you
did today, these are the conversations you've had, these are
the places that you went to, and then it'll also
suggest to-dos based on your conversations. Right. So I
wore it for about a month and it wrote some
really great fan fiction about my life, and it was

(08:57):
it was also.

Speaker 1 (08:59):
I have one of my favorites, which was when she's
listening to TV Off by Kendrick Lamar. Yes, yes. In
one of the moments, Victoria instructed Mustard to turn
off the TV, reminding them both to avoid getting sick
again and mentioning leftover charcuterie.

Speaker 2 (09:12):
Yes, this fucking rules.

Speaker 8 (09:15):
It's because, you know, like, obviously around the time when
I started testing this, the Super Bowl halftime show
was just, like, in everyone's mind, and I was, like,
blasting TV Off everywhere, because, yes, and also Mustard.

Speaker 4 (09:28):
So that was just going in my house like a lot.

Speaker 8 (09:31):
And so the Bee's picking this up and mistaking it
as, like, things that I'm saying. It would be like,
someone is very sure of themselves, and I'm
watching TikToks and they're bragging about themselves. It also, like...
so there's this bit that I call fact Tinder.

Speaker 4 (09:50):
They call it fact review.

Speaker 8 (09:51):
So in the app, based on your conversations, you'll get
this window and there's just like facts about you, and
you swipe yes if it's true, Yeah, that's absolutely.

Speaker 4 (10:00):
One of my fact.

Speaker 8 (10:00):
Tinders was like Victoria knows someone named ken Kendra Monticia
who enjoys mustard and turning TVs off, which is.

Speaker 4 (10:10):
Just like Jesus, you know, okay.

Speaker 8 (10:14):
And in my review, I made this little carousel of
fact tinders that I was given that you can just
go through and try and.

Speaker 4 (10:22):
Guess which one of these are true.

Speaker 8 (10:25):
Uh. There was one that was just like Victoria has
dietary specific like specific dietary needs and.

Speaker 4 (10:30):
She can't eat lollipops.

Speaker 8 (10:31):
And I'm like, I don't even know how. For
your information, I don't recall talking about lollipops. I
don't recall eating a lollipop. I don't know where you
would get this information from.

Speaker 4 (10:45):
Damn, that's nuts. What was the LLM behind it again?
Was it their own?

Speaker 8 (10:50):
It's a mix of available lll ms such as I
think anthropics and open aiyes, as well as their own,
so like they're not giving mix. But what was the
point your memory, Alex, Alex, I can tell you the point.

Speaker 6 (11:07):
I want one of these things for me, because I'm
the sort of person that talks a lot and then
does nothing at the end of the day with it.
And then I want it to be like, here's
all your to-dos based on all the crap you said today.

Speaker 8 (11:16):
So there is a glimmer of a good idea. Yes,
because I have ADHD. I think everyone consistently
has conversations with people in their lives and they'll be like, yes,
we should follow up, exactly, and you don't follow up
on that because you didn't write it down or you
didn't put a name there. So the idea that, gleaning
from your conversations, it would do that, it's not

(11:39):
a bad idea. There are a lot of people with
memory problems who might like that. Unfortunately, you also have
to have a pretty good sense of self in order
to fact-check this AI, because otherwise you are going
to be gaslit constantly. So, like, the way
I wrote this review is, like, I kind of wanted
to take people into what it's like to actually use it,

(12:01):
how it changes your behaviors and all that. So, like,
I included my day one. I commuted on day one
into the office, I went and I took a briefing
for Bold Hue, which is a foundation printer.

Speaker 4 (12:13):
Very cool. I saw that story by you too.

Speaker 8 (12:15):
Thank you. But so, like, I found that I
took that briefing and I went to the office that day,
I had dinner with a friend and I went home.
So a day where I had a lot of conversations. Yeah.
And of the five to-dos that it generated that day,
one of them was like, follow up on thoughts that

(12:37):
were shared but not, like, deeply dived into. I'm like,
what the... thank you?

Speaker 2 (12:43):
Sure?

Speaker 4 (12:44):
The most shared me?

Speaker 2 (12:46):
Check out?

Speaker 4 (12:48):
I don't fuck what does that mean?

Speaker 8 (12:50):
The second was like, urgently check on your patient in
Louisiana as they are in danger of self-harming or
harming someone else, and I'm like, what the... where did
that come from? Where did that actually come from? And
the other one was, check your car because it's making
rumbly noises. So let me tell you. The car that
it told me to check was probably the

(13:12):
the NJ Transit train, because that was a
bumpy ride coming in. The Louisiana patient, I'm guessing, was
someone on my commute talking about.

Speaker 2 (13:26):
What's the battery life like?

Speaker 8 (13:29):
And it depends on how often you mute it. And
the longer I wore it, the more I muted it,
because it started feeling like surveillance state.

Speaker 4 (13:37):
Yeah sure, but you.

Speaker 8 (13:39):
Know, around anywhere between three or four days and a week,
depending on how often you're muting on the charge on
a charge.

Speaker 3 (13:45):
So is the actual plan of the Bee to just
gather so many random conversations that it can, like, better
train its

Speaker 2 (13:54):
Ll M even that because it doesn't seem to know
ship from fuck.

Speaker 8 (13:57):
It doesn't... Do you train... You do, when you set it up,
you do train it on your voice, and then it did a poor job
of that, yes, because it often thought my husband was
me, and my husband has a much deeper voice. So, you know,
you can label speakers, and that also doesn't

(14:21):
always work. I tried labeling speakers and it wouldn't save.
And then the one time it labeled my friend, all
women from there on forth who I talked to
were this friend. So it's like, this friend did this
for you? Did this friend...

Speaker 3 (14:39):
I didn't do that.

Speaker 4 (14:40):
They didn't do that whatsoever.

Speaker 6 (14:41):
I have a question, So when you took your briefing,
did you have a conundrum of whether you wanted to
mute it?

Speaker 4 (14:46):
Did you ask tell.

Speaker 8 (14:47):
people? It's a briefing. It's a briefing. Yes,
generally you usually have it recorded. I was actually recording on
my phone. I totally forgot that I was wearing that thing, so,
you know, like, I was fully expecting to just use
my phone recording of it. And then I was like, oh,
it has processed my conversation. And to be fair, the

(15:09):
summary I got of that meeting.

Speaker 4 (15:11):
Was actually quite good.

Speaker 8 (15:12):
It can be. Yeah, it was quite good. It was
similar to what Otter does. It was summarizing the key
takeaways from this very structured conversation. It also memorialized forever,
so that I know this is true, in multiple different
outlets, that Sir John, Beyoncé's makeup artist, said that I
have great skin, everything, and it's like, memorialized. I'm not

(15:38):
wearing foundation right now, fact, yeah it is. I have great skin,
and I have spent so much money ensuring that I
have great skin with many Korean skincare products. We will
talk about this, but just, you know,
so, like, ah, this is memorialized.

Speaker 4 (15:53):
I love this. But the thing is so it got all.

Speaker 8 (15:55):
The facts about the thing right, including pricing. All of
that was correct. Launch got that correct. Got the name
of the product completely wrong.

Speaker 2 (16:04):
Sick.

Speaker 8 (16:04):
It said it was called Formule.
It's Bold Hue, not remotely the same.

Speaker 1 (16:12):
This just sounds like it's just generic. Like, because Otter
does this, I'm guessing they use similar models. I'm guessing
it's just the same models that everyone else uses to
transcribe voice. Like, Rev.com does this as well.
Speaker 3 (16:24):
Yeah, yep.

Speaker 1 (16:24):
So, like, the most impressive thing is the thing that
large language models do already. I have to wonder if
its inability to tell certain people apart is kind of
almost going back to, like, one of the core problems
of AI-type stuff, when the Kinect could not
see Black people.

Speaker 2 (16:38):
I don't know if you remember that.

Speaker 1 (16:39):
Yeah, I have to wonder if they've done much training
on voices that are not from white women or men.

Speaker 4 (16:45):
So they do offer forty languages.

Speaker 3 (16:48):
Well, so do these other things. None of these
transcription services work very well.

Speaker 4 (16:54):
Yep, I know that. I'm just telling you.

Speaker 3 (16:56):
I'm just telling... the audience doesn't know this side. They're not journalists
like us. They don't have to.

Speaker 8 (17:05):
I'm just saying they offer forty languages. I did not
test whether the other forty languages work.

Speaker 2 (17:13):
I cannot speak anything other than English.

Speaker 4 (17:15):
And even we've got about four languages. We
do have about four. Two people in this room

Speaker 3 (17:19):
Have four languages and not have white quite.

Speaker 8 (17:23):
The not-white people in this room have about four languages
between us.

Speaker 2 (17:27):
It's like, forty languages does not cover the race problem.

Speaker 6 (17:30):
So, no. So I think what you're getting at,
Ed, is that, like, technology frequently does this because the
info it's trained on doesn't cover a wide
enough set of people, the broad spectrum of
people in our world. I will say that, like, talking speed,
exactly, talking speed, pitch, talking cadences
and patterns. I do think, though, that what both Alex
and V were getting at too is that this

(17:52):
this particular product is so far behind in its
algorithms and the machine learning side of things, because Otter, Rev,
even Google Recorder or Apple Voice Memos can better
identify voices and tell them apart and then attribute them
very accurately.

Speaker 4 (18:08):
They also, like, struggle, but they're so much better.

Speaker 8 (18:10):
So, like, here's the other thing. This thing, because it's
listening to things all the time, also struggles to
tell between broadcasts and live conversation. Oh yeah, yes. So you
know, I'm watching an episode of Abbott Elementary, and, like,
I just have to, like, emphasize and underscore that it
takes up about ten percent of your active brain at
all times to be like, should I be muting myself,
bro? Like, you know, you do that... whatever

(18:33):
you do in a Zoom call currently, imagine doing that
twenty-four-seven. It's a lot of
your brain that gets devoted to that

Speaker 3 (18:41):
process. When you go to the bathroom...?

Speaker 8 (18:42):
I can tell you didn't read my review, because
that is in my actual review.

Speaker 2 (18:48):
Actually didn't remember whether it did. There was a related thing.

Speaker 8 (18:51):
There're so... okay, so anyway, we're going to address all
these points. We're going to address all these points. Okay,
I need to know if that was Lactaid, like,
Lactaid for dairy.

Speaker 4 (19:04):
I heard the other Kay, we're.

Speaker 8 (19:04):
We're gonna go... we're gonna start with the broadcast thing.
I was watching an episode of Abbott Elementary and the
suggested to-do it gave me was: monitor union strikes,
because students may have a hard time coming to your
class due to SEPTA strikes. I am not a public
school teacher in Philadelphia.

Speaker 4 (19:23):
One.

Speaker 2 (19:24):
Two.

Speaker 8 (19:25):
It can also read your emails if you give it access.
It told me to basically take this
unique identifier and check this ParkMobile
claim settlement and file by March fifth. I checked
all four of my inboxes.
Speaker 4 (19:42):
I cannot find this email.

Speaker 8 (19:43):
I have no idea why they told me to do
this to two blah. I you know, one morning after
a particularly five US dinner, I went to the bathroom
and I committed crimes.

Speaker 4 (19:56):
Okay, I committed crimes. I was wearing this stupid thing.

Speaker 8 (20:00):
Uh, not stupid, but, you know... I was
wearing this device that I was testing. I turn around
and I go, oh damn, that was a shit. And
then a second later, a second later, I went, oh shit,
this thing is listening to me. And then I muted
it, far too late. I look at the summary it
gives me, and it goes, Victoria humorously vocalized her
(20:22):
bowel movements, and I was like, Jesus fucking Christ. And
then I had a one on one with my editor
and the summary, because it was within an hour of
this happening, the summary that it transcribed was todd. Victoria's
editor humorously brought up that he saw on a shared
platform that they had a fact that she had.

Speaker 4 (20:45):
vocalized her bowel movements. Oh my. They laughed about it and
it was funny.

Speaker 8 (20:50):
And I was like, do you think that I would
commit such an HR violation as to tell my
freaking editor that I had a particularly impressive dump that morning?

Speaker 4 (21:03):
No, I would not tell anybody this.

Speaker 8 (21:07):
I would take this to my grave unless it was
to demonstrate the fallibility of AI. This is a human reason,
a human good, for me to be humiliated. So I
shared this, and my conclusion is that death, sex,
and bowel movements are things that AI should butt the
fuck out of, yeah, because I got some real interesting

(21:30):
notifications from this, and I was like, I didn't need
you to know that. I didn't need to be humiliated
by that. But the suggested to-do was to start
carrying Lactaid.

Speaker 4 (21:39):
From the poop because I had.

Speaker 8 (21:43):
a conversation. I'm lactose intolerant, and it said start carrying Lactaid again
because lactose intolerance symptoms are coming back. And I was like,
this is fucking rude. Hopeful... helpful, helpful.

Speaker 1 (21:57):
Do not have mustard, that would not be good
for your bowels. Turn the TV off.

Speaker 8 (22:01):
But so, I do want to say that it
does learn, broadly, things

Speaker 4 (22:07):
About you that are accurate.

Speaker 8 (22:09):
So, by the end of my month testing this,
I had fallen down into several existential crises. But there's
a chatbot that you can talk to, and so I
asked it things like, am I a good person? Am
I a bad person?

Speaker 4 (22:22):
One type of person?

Speaker 3 (22:24):
Am I one who needs it?

Speaker 8 (22:32):
Yeah. And I was like, what is my communication style?
How would you describe the themes of... It could tell you if

Speaker 6 (22:38):
You're an n f P or an I.

Speaker 8 (22:39):
T like, well, what's my relationship like with my spouse?

Speaker 4 (22:44):
And so yeah, yeah, yeah, yeah yeah.

Speaker 8 (22:46):
So the funny thing is, I was very touched,
because it was like, you are definitively a good person.
Here are, like, six reasons why, with transcripts, things like,
you constantly show up for your friends, you
stand up against wrongs, you do this, this, this
and this. And I was like, oh my god.

Speaker 6 (23:05):
This is the validation we were talking about.

Speaker 8 (23:09):
It's just like, you prefer direct and honest communication, you
don't play games. And I was like, yes, because I
am an Aries. I'm Aries, I'm Aries sun, Leo moon...
Aries, no, Aries sun, Leo rising... Leo, no, Aries sun,
Leo rising, Aries moon and Aries Mercury. And it's basically
like astrology and AI both agree that I don't

(23:32):
fuck around. Basically, I am very direct.
What you see is what you get, super honest.

Speaker 4 (23:37):
I was like, oh, thank you, that's so nice. And
they're like, you advocate for your colleagues. Oh, that's
so... it's true.

Speaker 2 (23:45):
I am.

Speaker 4 (23:49):
This demon. It's six feet under.

Speaker 8 (23:51):
I need an AI with transcripts to have definitive proof that
I'm a good person.

Speaker 3 (23:57):
You don't like it when the computer says you're pretty?

Speaker 2 (23:59):
I don't.

Speaker 3 (24:00):
I don't love it.

Speaker 2 (24:01):
I don't care.

Speaker 3 (24:02):
I love it. The computer was like, you are such
a good writer, and I was like, thank you.

Speaker 4 (24:07):
Getting in the Severance zone, are we?

Speaker 1 (24:08):
Like?

Speaker 8 (24:09):
Oh, you should have seen the to-dos it gave
after I watched an episode of Severance.

Speaker 1 (24:12):
It makes... watch a more interesting episode, like the one... okay,
I've loved the season, but the one where she goes
out into the cold.

Speaker 4 (24:21):
But the last episode that was not the episode that
I watched.

Speaker 2 (24:25):
It was the I've loved the season okay, because I.

Speaker 8 (24:28):
was getting... The episode that I was getting to-dos
from was the one where it's Gemma.

Speaker 4 (24:34):
It was all about Gemma.

Speaker 8 (24:35):
It was all about... that was a beautiful episode.
But when I looked at the to-dos, like, I was

Speaker 2 (24:40):
Like, Jesus Christ, this is Jesus.

Speaker 6 (24:43):
This doesn't make any sense whatsoever. Going to the dentist,
donate blood.

Speaker 4 (24:47):
Yeah, it's just like not like that.

Speaker 8 (24:50):
It was just kind of closer to the or. It
was like follow up on thoughts of cold.

Speaker 4 (24:57):
And I was like, this is yeah, honestly.

Speaker 1 (25:01):
I have some other practical questions. So it's fifty dollars
with no subscription. Yes, yes. How the fuck does this
make sense? Numerically? Like... so they claimed, in
the review you said, they're going to do it
on device, which is a thing that everyone says, and
I've never seen one of these companies actually execute.

Speaker 2 (25:17):
So fifty bucks.

Speaker 1 (25:18):
And then, so, I'm pulling up... Victoria values honesty
and does not like dishonest behavior.

Speaker 4 (25:24):
I agree with that.

Speaker 2 (25:25):
From what I agree.

Speaker 3 (25:26):
It says that's one of your hobbies.

Speaker 4 (25:28):
I know.

Speaker 3 (25:31):
Hell yeah, Victoria, your hobbies are being a good person.

Speaker 4 (25:36):
Wait, hang on, hang on, though. It says

Speaker 8 (25:37):
Victoria believes in leading with love and kindness and calmness,
and family.

Speaker 6 (25:42):
With interactions with family, and it's tagged at
the bottom as hobbies. Probably good.

Speaker 4 (25:48):
But Victoria understands Korean, to some extent.

Speaker 8 (25:53):
Jesus. Victoria has a partner named Very. Also a hobby.

Speaker 4 (25:57):
Also a hobby.

Speaker 8 (25:59):
This is a health one: Victoria has lost both of her
parents. Yes, this is true. Victoria values family
relationships and wishes to maintain them.

Speaker 4 (26:10):
Also true.

Speaker 2 (26:11):
I would love it if it's said otherwise. It's like Victoria's.

Speaker 8 (26:13):
This is politics, which is also true. Victoria has been
involved in discussions about family inheritance and estates tagged as politics.

Speaker 2 (26:21):
That's true, very good.

Speaker 4 (26:23):
Victoria has an interest in visiting family in Korea.

Speaker 2 (26:26):
True Hobbies.

Speaker 8 (26:28):
Victoria has collaborated on a gadget-related podcast. Wow, it's true.

Speaker 6 (26:35):
Which so it's very gamified, almost very very obvious.

Speaker 1 (26:39):
Yeah. Like, these are things... I have to wonder... see,
the problem is, I would love this company to burn. I
think the idea of this concept is horrifying. I
think it's upsetting that it exists. However, I would also
love to see, like, ten people's worth of data
on this, just to see if it tells everyone very
similar things, because how

Speaker 2 (26:58):
The fuck do you define Like is it willing to
be like you sound like a dickhead, you do not
care about your family?

Speaker 6 (27:04):
I was going to say, I think the flip side
of it validating you is, will it pick up on
behaviors that are negative in people that have, like, these
traits? Like, you know, in social media, in popular
culture now, we're calling everyone a narcissist.

Speaker 4 (27:15):
Right, watch any episode of Love is Blind.

Speaker 6 (27:16):
It's like, that's a narcissism that's as.

Speaker 4 (27:19):
Based on Love is Blind.

Speaker 8 (27:21):
Wow. And one of my fact Tinders in the review
is saying that Victoria had an incident where Madison
left the room, and she did leave the room.

Speaker 6 (27:42):
That's where my point is: right now, the people
who are using it and testing it are getting these
positive responses and feedback. I'm curious to see what it
would do to any number of CEOs or billionaires that

Speaker 8 (27:52):
also listened to me bitch to friends about things and
has recorded those things. Those were not included in the
review because I would, I would, I would get canceled.
It did also listen to me talk for about an
hour and a half about Blake Lively and Justin Baldoni.

(28:13):
My opinions... that your friends... uh, yeah. Actually, it said
Victoria's Team Baldoni, in a Victoria's Team Baldoni gossip context, and
I was like... It also gave me facts such as,
Victoria has a living room and a kitchen.

Speaker 4 (28:31):
Thank you, thank you, thank you for that.

Speaker 8 (28:34):
One of my favorite things that it did was
it summarized a conversation between... So what happened
is that we were having some late-night cookies, Oreos, and
I dropped one into the cat's empty food bowl and
I went, three-second rule, and my husband goes, that's disgusting,
and I say, in an apocalypse situation, you would eat

(28:56):
that Oreo And he went, I have a heat gun.

Speaker 4 (28:59):
I would just disinfect it.

Speaker 8 (29:01):
It took this conversation and went Victoria and Gate have
have disagreements about how to handle smelly things in an apocalypse.

Speaker 4 (29:11):
That was a disagreement. It was a disagreement. I'm just like,
what what is the point of any of it?

Speaker 1 (29:18):
Always looking at this and going, like, ah, finally... well,
I'd forgotten I had the smelly Oreo.

Speaker 2 (29:22):
Actually, the Oreo was left out. How would you possibly not?

Speaker 6 (29:25):
It feels like the Bee was this, like, little toddler
robot in your house, looking at you, going through your
life and then taking notes, right? And we don't know
what the notes are for, but it's taking notes.

Speaker 1 (29:33):
It just sounds like it gives you an update on
your... Every guy a, like, twenty-five-year-old woman
has dated in New York is just, like, on his
phone going, yeah, yeah, yeah, she said something about a smelly Oreo
or some shit.

Speaker 2 (29:46):
Uh huh uh huh yeah, I will so yeah an argument.

Speaker 8 (29:49):
Every once in a while you do get a really
good to-do, though. Like, it was like, follow up
with your video team about making a social for the
Bold Hue review, which I actually did. Call the plumber,
which I did, to fix the HOA violation that
you left festering for

Speaker 4 (30:05):
Eight months, which I did. So you know, like some
of them are good, and how frequently are they good like.

Speaker 8 (30:13):
Other I would say it depends on what you use
it for. So if you're using it primarily for work conversations,
which is.

Speaker 4 (30:21):
So you mute it when it's not.

Speaker 8 (30:22):
When you mute it and you only use it for
work conversations, your batting average is quite high for takeaways.
Like, sixty percent is pretty good. And it
depends on how many people are in the conversation with you.
So it listened to a staff meeting and the takeaways
(30:45):
were not so particularly good for me, but they would
have been great for someone in the particular thing. But then,
you know, if you're useful friends, like it's it's difficult
because you're talking with friends, you have to go like, hey,
so just so you know, I have this thing on
me and it's listening and it's going to have some
AI insights and some of your friends, some of my

(31:06):
friends are techy, so they're like, oh man, what what
LLLM does it use like they don't mind. My bestie
is just like another one of your fucking just sure,
go ahead whatever.

Speaker 4 (31:16):
My friends... Yeah, one of your doodads, go off.

Speaker 8 (31:21):
My husband, however, was very much like, it is not
useful enough to invade.

Speaker 4 (31:25):
my privacy or that of the people around you.

Speaker 3 (31:28):
Yeah, that's what I was going to ask: is, like,
fifty dollars worth it, fifty dollars to invade your
privacy forever? To give all of your... not just,
like, your bowel movements, yeah, everything, to a company you
don't know. Is it worth it?

Speaker 8 (31:42):
I mean they Okay, I have to come on here
and say I did ask about the privacy. It's not
like I didn't ask that.

Speaker 4 (31:49):
I was going to say that they are working on a
local, phone-only mode.

Speaker 2 (31:54):
Working working on that.

Speaker 4 (31:57):
I'm just saying criticizing, Yeah, I know, I'm just saying.

Speaker 8 (32:03):
The other thing is that everything is encrypted in transit
and at rest. Good. They have a third party
auditing their privacy protocols every so often, and no
audio is stored. It's just processed, and then you get transcripts,
which, you know... sometimes I was like, I would actually

(32:23):
like audio proof that my cousin made me cry so
that I could shove it in their face in the future.

Speaker 3 (32:29):
But but it's still doing a transcript, so.

Speaker 4 (32:32):
It's still... the transcript is not always correct. It's not
always...

Speaker 1 (32:35):
What's crazy is this company raised seven million dollars a
year ago.

Speaker 4 (32:38):
I can't believe, and it.

Speaker 2 (32:39):
Took them this long to come up with a product.

Speaker 6 (32:40):
The kind of work that money could be doing, like,
saving so many lives.

Speaker 4 (32:44):
I'm sorry, I'm saying burn it.

Speaker 2 (32:45):
You feel warm at least you know.

Speaker 3 (32:48):
No, I will say, I think, for seven million dollars, they
produced hardware. Yes, that's the thing they did for
seven million dollars. I mean, that's almost impressive.

Speaker 1 (33:00):
It's almost impressive. But that's kind of the review of
AI, though. Yeah, but no, I think it's almost impressive.

Speaker 3 (33:05):
This seems the most... no, no disrespect, a pointless,
deeply flawed product, a product only two steps above the Rabbit.

Speaker 4 (33:13):
This feels like some shit above the Rabbit.

Speaker 2 (33:16):
Very interesting. This is like fifty This is some indie fraud.

Speaker 3 (33:20):
Yes, exactly, that's worth the.

Speaker 8 (33:22):
Five, because it worked better than Humane at what it said
it was going to do.

Speaker 2 (33:26):
Okay, that's fair. It's just fifty bucks.

Speaker 4 (33:28):
It's exactly.

Speaker 6 (33:29):
Well, it's also because, yeah, that, and also Humane made
way greater claims and promises. Rabbit also made very great claims.
I want to point out that, like, I find
it funny that, like, there's a lot of concern about
invasions of privacy, and I get that for people
around me there is an invasion of privacy. I was
intrigued by the fifty dollars because, like Victoria, sometimes I
do want, like, the audio recording of things happening in

(33:49):
my life.

Speaker 4 (33:50):
There is a receipts, Like I said, there, but Alter,
what do you have.

Speaker 3 (33:55):
In your pocket? What are two on the right here
in front of us?

Speaker 6 (33:59):
I have nothing but good luck and stars in my eyes.
There is, yeah, something that's always recording. Alex, you don't think,
like, maybe myself and Victoria... I'll speak for myself...
do you? Which is that I feel as if everywhere
I go every day, I am more at risk of,
like, needing to back up my experience, to defend myself.
And that's why that sort of device always

(34:21):
felt appealing to me.

Speaker 8 (34:22):
And maybe, because I think, in an enterprise sense,
it does make sense if you're a lecturer, if you're
a lawyer, if you're, you know... and you need someone...
no lawyer in their right... okay, not a lawyer
in their right mind, a lawyer in their wrong mind.
But just someone who
takes a lot of meetings and needs a lot of
notes from those meetings. Like, there is, there is, like,

(34:46):
a use case there, yeah, with.

Speaker 1 (34:48):
But the problem is, with those... I understand what
you're getting at, like doctors and lawyers. Here's the
thing they need: they need, well, HIPAA, accuracy. If
you get a nuanced thing wrong in the law, that
tends to be, like, a lawyer's whole problem.

Speaker 8 (35:02):
So that's why I'm saying, there's like a glimmer of
an idea there, Like I think there's the glimmer.

Speaker 1 (35:07):
Often it's just sucking there's an idea.

Speaker 2 (35:11):
Glimmer of an idea.

Speaker 6 (35:12):
Is the suggestion of a good thing.

Speaker 2 (35:13):
It's not the good thing, the concept of a plan.

Speaker 8 (35:16):
Yeah, it's just like, if you spend a month constantly
reviewing your own life and fact-checking, how

Speaker 4 (35:22):
Does that leave you feeling not well?

Speaker 8 (35:25):
It was like, I felt insane a little bit, because
I was constantly fact-checking, and I was constantly... like,
at one point I actually thought it was reading my text
messages, because, like, I was like, these
are private conversations I had in text messages on encrypted platforms.

Speaker 4 (35:39):
How does it even know to generate...?

Speaker 8 (35:45):
Well, this did really affect my behavior over a month,
because I realized that, from an offhand comment, it can
just glean so much about me. That was just frightening
to me, because I would say, like, a thing in
passing to my husband, and it would be like, oh, here's an
update on that, and it would just... and I was

Speaker 4 (36:01):
Like, oh shit. And so I've actually been very.

Speaker 8 (36:04):
quiet the last month, like, I don't speak anymore, to
myself. After the poop one, after
Poopgate and bathroom crimes, I was like, I do not
have to say anything out loud.

Speaker 3 (36:15):
Oh my god.

Speaker 6 (36:16):
I meant to ask if, in the encryption process,
the sounds of plops are getting encrypted too? But I guess,
you know, I would love an audio file of that
for myself.

Speaker 2 (36:25):
That's horrifying.

Speaker 1 (36:26):
Like I think that there is a larger thing here though,
that this is a classic tech guy idea.

Speaker 2 (36:32):
It's like, what is an idea.

Speaker 1 (36:34):
I'd have? Oh, I want to memorize everything that's happening
around me and be able to analyze it and then
draw these insights. It's actually a very cruel way to
live, because we say things offhandedly.

Speaker 4 (36:42):
We... forgetting is

Speaker 6 (36:45):
like a very important part of being fallible.

Speaker 1 (36:48):
Our existence is not something that is written down in
its entirety. And indeed, I don't know how useful it
is having everything written down.

Speaker 3 (36:55):
It's not, at all, unless you're, like, a researcher. Yeah,
a researcher one hundred years from now is absolutely gonna, you know, love
to know how we all responded to, like... the anthropologist
two hundred years from now is gonna be like,
just like, thank you for giving me all of this.

Speaker 4 (37:13):
They've listened to this, and we need to dig up
Victoria's body now.

Speaker 3 (37:16):
And that's the only reason.

Speaker 8 (37:17):
Like, if you're an archivist... I'm sorry. Listen, okay. No,
I didn't, the Bee did.

Speaker 4 (37:24):
Anyway anyway, I would.

Speaker 8 (37:27):
just like to note that I am an avid diarist,
a journaler... like, journaler would be another way of saying...
I write diarrhea? Fuck you guys. Anyway, I write a
diary daily, I journal regularly.

Speaker 4 (37:45):
Journal regularly. So this past month.

Speaker 8 (37:47):
I have journaled every single day, and the things
that I write down and that I remember as memorable
are not always

Speaker 4 (37:54):
The same thing that this thing picked up.

Speaker 8 (37:56):
Because a lot of my thoughts, and things that are
important to me, and memories that are, like, meaningful
to me, are completely silent. So my philosophical
question is, if it doesn't happen out loud, if
you have a society where we're all wearing these devices,
if you say nothing out loud, does it count as
your memory?

Speaker 2 (38:15):
Right?

Speaker 4 (38:16):
And this device would say no, it can't know that
from you.

Speaker 1 (38:19):
It's also a very crude analysis of memory itself. Human
memory is insane. Yeah, it is. Like, the things
we remember are done in chunks. The experience of being alive,
at least for me, is... it's not like my thoughts
go, and now I'm on the podcast, now I'm doing
the thing. It's like... apparently there are people who... those people
are... No, because you like to be normal.

Speaker 4 (38:41):
What's it like to.

Speaker 1 (38:44):
It's just, I don't have an internal monologue. It's more adjacent to
Cornholio for me. But you actually mentioned something earlier, Cherlynn,
that I want to go back to, which is, you
said this thing about feeling the need to document things
around you. Can you go into, like, what is
the thing pushing you toward that?

Speaker 2 (39:01):
I mean, it's a witnesses.

Speaker 6 (39:02):
It's because I feel as if I maybe am part
of my life.

Speaker 4 (39:05):
I've I've talked to both of you about this before.

Speaker 6 (39:07):
where I encounter things and then I feel as if
I get questioned about them afterwards and people don't believe me, right?
So, like, journalism brain. It's journalism brain. It's microaggressions being
thrown your way daily, being a lady, being a woman.
It's, like, the things that we have to think about
all the time. So, like, I feel as if, if
I had a wearable microphone or camera on me all
the time... This is why I reviewed the Humane with
great enthusiasm.

Speaker 2 (39:28):
At first.

Speaker 6 (39:29):
It's... I feel as if, if one of those people
who catcalls me across the street... if I had, like,
documented evidence of that and I could bring it up,
like, how many times it happens a week, I could
have, like, valuable knowledge on my hands, so that
when people question if it really happens that much,
I can be like, look at all these instances. So
for that reason, I have a lot of things that
document parts of my life. To your point, Alex,

(39:51):
I have phones that record a lot of my conversations
when I feel they're important, right? There's my security camera outside.
Whatever it captures... motion. I have it triggered
on a motion sensor and it records everything. And I'm like, oh,
I don't pay right now for the backup that, like, allows
me to go back in time and look at them
all day.

Speaker 4 (40:08):
But if I did, I would sit there and look
all the time.

Speaker 6 (40:11):
I think that experience is not every woman or every
person who feels that way.

Speaker 4 (40:16):
But I don't know.

Speaker 6 (40:17):
Something in my upbringing or my culture like has led
me to.

Speaker 8 (40:19):
feel... There's a fine line, because I do think
that when you do that, when you are constantly reviewing
the documents of your life, it can be not so
great for your mental health.

Speaker 4 (40:32):
Absolutely, it's super like.

Speaker 8 (40:34):
My conclusion is that like part of being a human
is understanding when you need to forget things right.

Speaker 4 (40:40):
Move on, and instead you're inviting an AI to do
that for me.

Speaker 3 (40:44):
A really poorly implemented one, honestly, because of how... yeah, how
badly it fails you. You're having a poorly implemented AI, right,
intending to profit off you in ways we still
don't know, because at fifty dollars, to have that much AI,
like, they're burning cash. So where are they making their

(41:05):
money that I think?

Speaker 4 (41:06):
I think there are plans down the line for a subscription. A subscription, yeah,
of course. Right now...

Speaker 3 (41:12):
That is the way of every single one of these products.
We see this again and again: there's something that, like,
really moves Silicon Valley and they get really, really excited.
In this case, it's AI, and they're like, how can
we commodify this? How can we make money off of this?
How can I go and make money off of this shit? I think that
all the time. I'm unemployed right now. Awesome. But it's

(41:32):
this constant thing. And then they release this product that
is pretty terrible. You give it a five. It is
clearly not finished, and it is for such a small
group of the population. How on earth do they expect
anyone in the normal world, who's just out there existing,
who goes to Walmart all the time and doesn't listen
to podcasts all the time, to be like, yeah, I want
to take this little fifty-dollar thing that's going to
charge me money monthly and wear this on my and
record everything I do?

Speaker 2 (41:58):
The normal post.

Speaker 4 (41:59):
No one would no, no normal person.

Speaker 3 (42:01):
It is such brain worms that this

Speaker 8 (42:04):
is. Like, but you have that a lot, and, like,
they only think about the positive of it, and, like, no,
they don't. Like, they're not thinking about the negative.

Speaker 2 (42:12):
They're not positive.

Speaker 3 (42:14):
Let's be really clear. They're not thinking about the positive
of it. They're thinking about how can I make money? Yes, growth,
how can I grow?

Speaker 2 (42:19):
That is?

Speaker 3 (42:19):
It is this growth-at-all-costs thing that is

Speaker 4 (42:21):
And it's ruining this world.

Speaker 8 (42:23):
But, I mean, when they market it, they tell you
all the good things it can do, but they never,
like, really get into the fact that... Like, I mean,
it listened to me cry, like, pretty heavily after I got
into a fight. I'm sorry. And then I had to
review the transcript and then look at it and then
just, like, have it analyze how I was feeling, because
it was, like, the one sad one, for when Amazon...

Speaker 3 (42:44):
As someone who's recorded one of her own firings before.
You don't want to listen to that.

Speaker 4 (42:49):
Oh, I've recorded a breakup before.

Speaker 2 (42:51):
That was great.

Speaker 3 (42:52):
You don't you know, don't I did it for legal reasons.

Speaker 4 (42:55):
That's unhinged. You know me, I am hid.

Speaker 6 (42:58):
I want to like, you're lucky you recorded all our conversations.

Speaker 4 (43:01):
Honey. No, no, here's the thing. I will clarify that when I record,
I don't go and review it every day.

Speaker 6 (43:05):
I think that that's really detrimental to your... I do
it for, like, the receipts, and, yeah, in that breakup
example, I don't think I actually recorded it.
I wrote down every single word.

Speaker 2 (43:14):
And that's different. Just to be clear, Yes, yes, it's
just that it's a huge invasion.

Speaker 3 (43:19):
I think... what I'm really frustrated by is, this
is another example of a totally unfinished product. We're seeing
this again and again and again. We saw it with Apple
this last week, where we've got Gruber and Gurman and
a lot of these other people coming out there saying, hey,
something's wrong. When you've got Gruber being like... but exactly,
you've lost Gruber, that means you've failed. And

(43:42):
we're seeing that, like, if Apple is out there rushing
a product when they know it's not ready, especially all
of the AI stuff, they don't have the use cases. It's
like what we saw with VR. I spent years covering
VR waiting for that moment, and everybody said it's just
around the corner. Is it here yet?

Speaker 2 (43:59):
No, I just wanted to chat thank you.

Speaker 4 (44:01):
Is it because V is in her name?

Speaker 3 (44:03):
Because she does a lot of.

Speaker 6 (44:04):
VR coverage. It's Victoria Reality.

Speaker 3 (44:08):
I've seen her wear a lot of different glasses.

Speaker 4 (44:10):
A lot of smart glass.

Speaker 1 (44:11):
Twitter... and also, the company that made this, by the way,
the founding team came from Twitter. Oh, sorry. They helped
ship Twitter Spaces, you know, that beloved product, and some
sort of video chat app called Squad. Okay: Squad,
the anti-bro startup creating a safe space for
teenage girls online. And, uh, wait a second, let's see

(44:33):
who founded this?

Speaker 2 (44:36):
Esther Crawford.

Speaker 1 (44:37):
I think she's the woman that helped Elon Musk. Anyway,
the point is, these fucking people don't care at all.
They're not sitting there being like, how would this be helpful?
Because the most helpful thing would make sure it
was really focused, that it was able to discern a
professional... but they probably can't do it, because it
isn't possible.

Speaker 2 (44:56):
Ah, It's it's.

Speaker 6 (44:58):
It's just... if people cared about people, if people with money
cared about other people, we would be spending this money
on cancer research, where the NIH grants are being culled
and stalled right now, or they would focus it on
improving the

Speaker 4 (45:10):
Healthcare system of this country. But people do not care.

Speaker 6 (45:12):
They care about taking their money and growing it for
more money for themselves.

Speaker 4 (45:15):
That which I hate.

Speaker 8 (45:16):
But I don't expect people to suddenly become good people overnight,
especially if they have money. But there is just like
one thing where it's like you have to at the
very least create a good product experience for people. And
the thing that frustrates me the most when I review
products is that I feel like there's just this Pollyanna,

(45:36):
Pollyannaish view of what life is from these people
in Silicon Valley. Right.

Speaker 4 (45:42):
It's just like I I.

Speaker 9 (45:44):
have had a life of great suffering, and they think...
it's like, uh, you know, I've been through a lot
of shit. Like, both my parents died, my dog died,
my entire immediate family died in the last five years.

Speaker 2 (45:55):
Oh my god, I'm so sorry.

Speaker 4 (45:57):
It's fine. It's not but it's fine. I have like
a lot of.

Speaker 8 (46:02):
Traumatic stuff that has happened, and it's kind of changed
my perspective of these kinds of devices because life is
not pretty.

Speaker 4 (46:09):
There are just moments.

Speaker 8 (46:10):
Where there are moments where it's very hard and where
these sorts of to dos. I can imagine a time where,
like when I was in the height of my grief,
where I would have loved a thing to remind me
to do stuff, to listen to the conversations that I
was not processing, to give me those to dos, Like,
there are moments, and I think there is a desire
for something like that that AI could ostensibly at some

(46:33):
point when it actually works a lot of asterisks here
that could help people. And I think if people want
to pursue that, that's great, But you can't do it
without acknowledging the fact that not every moment in your
life is going to be pleasant. There are going to
be embarrassing moments, there's going to be heartbreak, there's going to
be these negative experiences that you don't necessarily want summarized

(46:56):
to you in a way that you're going to have
to review and feel. Did your ex gaslight you?
Guess what, the AI is going to gaslight you about
how that horrible douche gaslit you. And,
like, that kind of thing is just not necessarily great.
But you get it across so many different...

Speaker 3 (47:16):
Put on my businessperson hat. It's not just bad...
I literally... it's a bad hat. It's a little top
hat. Let me get my monocle out. It's not just that
it's bad for people, it's a bad product. Like,
let's be clear, a lot of these products we've seen
again and again and again trying to do AI, trying

(47:36):
to make it. Google's done some decent work in this
space, but most of these products
continue to be bad. They continue to

Speaker 1 (47:44):
Be searching that like what with Google, I'm sorry.

Speaker 3 (47:51):
Alone, there was that thing where you could, like,
erase a person out with Magic Eraser. Yeah,
that's not it, though. Yeah, it's generative.

Speaker 4 (48:01):
Is that generative?

Speaker 6 (48:04):
It's not in the era of so Google's done that
before the rise of.

Speaker 3 (48:08):
What else did it do?

Speaker 1 (48:10):
This is, by the way, this is also the industry
that the entire economy is built on top of,
just to be clear. And these are some of the
most well-credentialed tech people, and we're all just going...

Speaker 6 (48:22):
Hang on, pause, pause. I want to pivot to all
of that by saying there is a good product
that's built to help people feel better.

Speaker 4 (48:28):
You're pointing at me for for AI, No, No, that's
the thing that's no.

Speaker 3 (48:32):
I want to talk about.

Speaker 8 (48:33):
You want to talk Mexican standoff of pointing at each.

Speaker 3 (48:38):
Other so hard.

Speaker 1 (48:39):
It's rare that... it's the Spider-Man pointing at
Spider-Man meme, but they're not similar anyway.

Speaker 2 (48:45):
Yeah, wait, who's going to go?

Speaker 6 (48:48):
I have been using this app. I encountered it through
a coworker who wrote about it for us. It's called Finch.
It's, like, a self-care app, right? Yeah,
so by taking care of yourself, you're taking care of
this pet, you're helping it grow. And what it is
is the most gentle suggestions of to-dos in
a day, whether it is as simple as get out
of bed, brush your teeth, drink water. You complete them,

(49:12):
you get points, you grow your pet, whatever. It is a
gentle reminder of things you want to do throughout the day,
and you can reward yourself. And it is developed by,
and here's the crucial difference, two independent developers that were
college buddies that were just trying to help motivate each other.
And I think that is an example I can point
to of tech being able

Speaker 4 (49:30):
To help people. Because I also read the separated.

Speaker 3 (49:33):
I mean, that is a great example,
and there are a lot of different things there. It's two people.
It's not seeking growth capitalism, and
it doesn't sound like it's using a lot
of generative AI.

Speaker 6 (49:45):
It has a point. It is a very focused product with
a core mission in mind of helping people, and that is
something I can stand by.

Speaker 3 (49:53):
Yeah, but I think most of these products we're seeing,
particularly in this space,

Speaker 4 (49:56):
Especially from the big companies and the VCs.

Speaker 3 (49:59):
They are... everybody is out chasing this money. Everybody's
just saying, how do I grab it?

Speaker 6 (50:03):
Because their motivation is different exactly.

Speaker 3 (50:07):
Of course they're asking that, and what are they all doing?
Why are they all doing this? They're all trying to
chase the iPhone. They're all trying to chase something that
happened in two thousand and seven. And let's point out,
even in its shittiest state, the two thousand and seven
iPhone was a way more well thought out, well produced product
than any of these products.

Speaker 2 (50:26):
So different, and it also fixed really big problems. Okay.

Speaker 1 (50:29):
Exactly. I got the original iPhone, little nerd.
I didn't have friends, but I did have money saved
up from writing, and I bought it, and I remember
showing people visual voicemail and people going, holy shit, that's amazing,
because voicemail at the time was this defunct product where
you had to, like, dial in and hit a button.

Speaker 6 (50:44):
I do miss recording my voicemail message, though. I do.

Speaker 2 (50:48):
It's good. You can barely understand me half the fucking time.

Speaker 1 (50:50):
So unless I'm on this show, I hope. But like,
you could send text messages and it didn't involve you
doing, like, the weird hitting-the-numbers thing. The benefits were obvious.

Speaker 8 (51:00):
Love the T9. Sorry, okay. Yeah, it's just intention, right? Yeah, intentional.

Speaker 2 (51:04):
It solved real problems experienced by

Speaker 3 (51:06):
Humans, exactly, exactly. Right now, all we are doing is
we are creating things, and everybody's sitting around saying, wow,
I wish... you know, a universal problem: everybody wants
to chronicle every moment of their life. Cherlynn, do you?

Speaker 2 (51:20):
You actually don't, do you? Yeah, you actually don't.

Speaker 1 (51:23):
You want to chronicle important moments at some point, you
don't want to chronicle every moment.

Speaker 3 (51:29):
Now. And also, no offense, Cherlynn is one person.

Speaker 4 (51:35):
Yeah, I am the most important, but I am one person.
You are one person.

Speaker 3 (51:38):
You are like that is a very small group of people,
and they're like everybody wants to do that.

Speaker 1 (51:42):
There was a much higher level problem, though, which is
they are not being like, well, you know what, people
have problems with remembering things. They're like, yeah, we can't
fucking fix that. We actually can't fix that, Like, we
can't we can't fix that? Well, people have problems with communication? Yeah,
well we really also can't fix that.

Speaker 2 (51:58):
What can we do?

Speaker 1 (51:59):
Record everything and hope something fucking comes out. I
don't know. These worms with their money, give it to me.

Speaker 3 (52:05):
I really.

Speaker 6 (52:05):
They took the nugget of a good idea, which is
what you have been saying, the glimmer of a good idea,
and then stuffed it. They needed money to make the good
idea work, and the only way to get that
money is to say AI is the way.

Speaker 2 (52:15):
I agree.

Speaker 1 (52:16):
I think there's also another step, which is they actually
can't do the good idea.

Speaker 2 (52:20):
I don't think they're capable of doing it, because the
good idea.

Speaker 1 (52:22):
Would be like, hey, listen to what's going on and
just tell me which things you think I might forget. It
can't do that, because the only thing they can do
is have a big sludge of information, chuck it at
a large language model, and hope something comes out other
than, like, I don't want to watch the Garbo movie,
which is a quote from your article. Yes, the

Speaker 4 (52:38):
Garbo movie, yes. That came from my, like,
ninety-minute-long or so recording.

Speaker 2 (52:42):
I worry. With my cat, Babu,
I talk to him like a person. It's gonna be like, eh,
you told Babu, I

Speaker 8 (52:49):
I also talk to my cat as a person, and it
said that I had a rough and tumble ride with
them discussing childhood memories, because it mentioned my cat, Petie, right,
and I'm like, Petie's my cat, and I was talking
to the cat, and I say

Speaker 4 (53:02):
Oh, who should we your babyish? No, it just said
that I have a dog named Edie.

Speaker 1 (53:11):
Oh great, nailed it, Yeah, like how saying that like
eds talked to how who he calls mister beautiful.

Speaker 2 (53:17):
And he really is he acts like the most beautiful
cat he is.

Speaker 4 (53:21):
I would imagine your fact tinder would say ed knows
someone named mister beautiful and he is fluffy or something.

Speaker 1 (53:27):
Like, Mister Perfect, Mister Beautiful, and someone... It's just,
and I think that's what it really is. We are all
coming up with, like, distinct problems that could be solved,
and this is not what large language models do, and
it's very difficult to get them to do specific things,
as proven by the fact that none of the products exist.

Speaker 4 (53:43):
You can't solve being human. Yes, that's the thing, is
like we're all human.

Speaker 6 (54:00):
I said this in a previous episode with you, the
CES episode, where I said I don't even think the robotics
need to be humanoid in nature yet.

Speaker 2 (54:05):
Okay, it's not an efficient form.

Speaker 6 (54:07):
Exactly, body exact or a repetitive movement.

Speaker 4 (54:09):
That's it.

Speaker 1 (54:11):
Also, they're not even making the... Look, so much
of this, what frustrates me, even with the Kevin Roose thing,
it's like, no real problem is solved. A conscious,
an AGI, could theoretically listen without recording and go, okay,
I think you might need a reminder on this, because
I know you. These motherfuckers... Actually, that is the ultimate
problem with this.

Speaker 2 (54:28):
It doesn't know you at all. Yeah, it actually doesn't
know you.

Speaker 8 (54:32):
That actually reminds me of when I was testing the
Meta Ray-Ban live AI. I was testing that,
and so it's a live AI, it's multimodal. It can
look at what you're looking at, and you can ask
it questions and it'll see what you're seeing, et cetera,
et cetera, et cetera.

Speaker 4 (54:51):
There are actual use cases for that.

Speaker 8 (54:52):
I've had many many blind and low vision people reach
out to me saying that could change their lives, and yes,
which is cool. Create something good, that is good.

Speaker 4 (55:00):
I love that.

Speaker 8 (55:01):
But you know the demos that they were, like, suggesting
that I try or whatnot. I looked at my room.
I was like, tell me how I could make this
room look less sad, basically, and it was like, oh,
you could put art up, you could have a plant,
you could have some rugs.

Speaker 4 (55:16):
And I was like, Jesus christy a.

Speaker 2 (55:21):
Random white guy, and they'd be like, oh, fucking yeah,
what art.

Speaker 4 (55:25):
Should I should I use it?

Speaker 8 (55:28):
I had to ask, like, five or six
different questions, just digging and digging and digging for this

Speaker 4 (55:33):
AI.

Speaker 8 (55:33):
Well, it couldn't even suggest an artist. I asked my
best friend. She said, you would love X, Y, Z.
I loved X, Y, Z. I bought it immediately.

Speaker 2 (55:42):
Actually intelligent.

Speaker 1 (55:43):
It doesn't... it doesn't look around and go, okay,
looking at the format of this room. It's just frustrating
because it's the info slop. It
is just collecting data and going, see.

Speaker 3 (55:53):
I actually, I was telling Cherlynn about this earlier. I
had a great moment where somebody was like, oh yeah,
I use Claude AI as my therapist, and I was like,
you're unhinged, I love you. And so I was like,
I'm gonna talk to Claude AI, and I was talking
to it, and, you know, I was telling
it about how things were going and everything, and I
was like, I would really like it if you would
recommend a book to me that kind of speaks

(56:13):
to the experiences I'm having right now. And immediately it
was like, The Goldfinch, and I was

Speaker 2 (56:17):
Like, Goldfinch, I'm not familiar with.

Speaker 4 (56:20):
I read that book?

Speaker 8 (56:21):
Why that book? This book is a bildungsroman from
Donna Tartt, author of The Secret History. It's about this
little boy named Theo whose mom, like, dies in a
terrorist attack at the Met, and he, like, steals the
painting, and it goes through his life

Speaker 4 (56:42):
Of just being like.

Speaker 8 (56:46):
And I'm obsessed with a manic pixie dream girl and
the thing, and then I do a bunch of crime,
and at the end of the book he's just like, well,
I've made life decisions.

Speaker 2 (56:55):
Anyway.

Speaker 3 (56:59):
It was like... I read, I looked at that and
I was like, well, this is a terrible fucking suggestion.
And so I said, okay, suggest a movie to me,
and it goes, Silver Linings Playbook.

Speaker 2 (57:08):
I was like kids, And honestly it was the.

Speaker 3 (57:11):
Best moment in my entire experience with this thing, because
before that I'd been like, wow, this thing starts to get me,
and I started to almost, like, believe the AI understood me,
even though I know for a fact it's stupid and
it doesn't. And then it said The Goldfinch and Silver Linings Playbook,
and I was like, oh, right.

Speaker 4 (57:24):
You're still wrong.

Speaker 3 (57:25):
You are a stupid robot. You are incapable of making
the connections that humans make, and you are so far
from it that you would recommend these two, because all
you did was you saw on the internet someone said
The Goldfinch and self-actualization or some shit, and so clearly
it's gonna... when

Speaker 1 (57:41):
You victorize every fucking fact and you go just this
is just like human based. You're a relational database with legs.
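To make the "relational database with legs" point concrete, here is a minimal toy sketch of similarity-based recommendation. The titles, the three-dimensional vectors, and the scores are all invented for illustration; this is not how Claude or any real model is actually built, but the mechanic, suggesting whatever sits nearest in vector space with no sense of whether it fits emotionally, is the one being described.

```python
# Toy sketch: recommendation by vector similarity. The numbers are made up;
# real systems use learned embeddings with thousands of dimensions, but the
# mechanic is the same: whatever sits closest to the query gets suggested,
# with no notion of whether it is emotionally appropriate.
import numpy as np

# Hypothetical embeddings, dimensions loosely meaning
# [grief, self-actualization, cozy-comfort].
catalog = {
    "The Goldfinch": np.array([0.9, 0.8, 0.1]),
    "Silver Linings Playbook": np.array([0.7, 0.9, 0.2]),
    "Howl's Moving Castle": np.array([0.2, 0.4, 0.95]),
}


def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


def recommend(query_vec):
    # Rank the catalog by similarity to the conversation's embedding.
    return sorted(catalog, key=lambda t: cosine(catalog[t], query_vec), reverse=True)


# A conversation about loss and "figuring things out" lands near grief and
# self-actualization, so the statistically adjacent titles win, even if what
# the person actually wanted was comfort.
conversation = np.array([0.85, 0.85, 0.15])
print(recommend(conversation))  # ['The Goldfinch', 'Silver Linings Playbook', ...]
```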

Speaker 3 (57:50):
Yeah, And I love that moment because it immediately said, oh, yeah,
this is a fucking robot.

Speaker 4 (57:54):
You're going close to something that's not real.

Speaker 3 (57:55):
Yeah, and immediately pulled me out of it. I was like,
this is hysterical. I love this and I stopped using
it for a minute because, like, immediately it broke you
out of the illusion. And I love that moment because
it reminded me these things suck. They're not human.

Speaker 6 (58:12):
Well, well, hang on, hey, there was a point up
until that where you really thought, this doesn't suck, though.

Speaker 3 (58:16):
Oh no, yeah, I love it. Like I said, I
love this thing because it is a big... it's like
a validation machine. Yeah, it's a big validation machine. And
just like a horoscope, it makes me feel pretty. And
then it recommends Silver Linings Playbook or The Goldfinch, and I'm
reminded that it's a horoscope.

Speaker 2 (58:29):
Yeah, yeah, no, no, you were saying something.

Speaker 4 (58:32):
I was just going to say that.

Speaker 6 (58:33):
This was the point in the conversation we were having
just now, Alex, where you and I were both like,
it is all still logic programming.

Speaker 4 (58:37):
It is all still input output. It is as much and.

Speaker 6 (58:40):
As sophisticated as the output seems, it is still input-output,
which is the basis of most computing, right? And
it's like, until it can think for itself, that
AGI question isn't there. It's not, it's never
going to be, because it takes the human brain to
be capable.

Speaker 1 (58:57):
Also, it's not doing what... we don't even understand
intelligence. But it is relational data, it's very,
very complex, and I realize someone's gonna listen to this
and tell me to go outside, but it's like, it's relational things.
It's just looking, going, well, based on what you've said,
I think this. And as a thing
with no experiences, it's gonna say, fucking, sure, it's

(59:21):
The Goldfinch.

Speaker 6 (59:22):
About growth, about life. Therefore assign it to alex.

Speaker 4 (59:26):
Like emotions, like we don't even understand emotions.

Speaker 1 (59:28):
How... we don't even... It's not even smart enough to say,
like, Howl's Moving Castle, which would cheer me up anytime.

Speaker 3 (59:34):
Yeah, if it had said that, I would have been like, damn.
I wouldn't be here right now, because I'd still be
just talking to Claude AI. We would be getting married. Well,
we'd have a date.

Speaker 8 (59:41):
She keeps sending me screenshots of, like, the Claude chats, and I'm
just like, ha, I love that for you, girl, but
it's also like, I'm monitoring, because if you get
too crazy and do it

Speaker 4 (59:54):
I'm just gonna be like, it's not real.

Speaker 6 (59:56):
I know of people who use it, and I've read
of cases on Reddit where people use it as their
mental health replacement.

Speaker 4 (01:00:02):
Like the standard, and it's not. I think we are
aware it's not.

Speaker 6 (01:00:07):
I worry for the people out there who are not,
who are talking to ChatGPT like it is, definitely.

Speaker 2 (01:00:12):
I was just thinking it's.

Speaker 4 (01:00:15):
That's the New York Times audience.

Speaker 1 (01:00:16):
This actually connects to why I'm so pissed off
at Kevin. He should know better, but a lot of
people don't. The reason that I think a lot of
right-wing YouTubers have taken over, like, how men think is
because a lot of people are just looking for something to
tell them what they want to hear, and this is
the biggest machine to do that.

Speaker 6 (01:00:31):
The validation you were talking about will validate them and.

Speaker 2 (01:00:33):
Not challenging at all. It's not. It might challenge you
in the most gentle way.

Speaker 4 (01:00:39):
It never challenged me. It just told me I was great,
although it told me, like, areas I could
slightly improve. Oh, like a performance review.

Speaker 3 (01:00:46):
It told me to actually get rid of my typos,
and then I'll win a Pulitzer.

Speaker 4 (01:00:49):
Oh my god, I tell you that all the time.

Speaker 2 (01:00:51):
Mothers love to tell people.

Speaker 10 (01:00:53):
I know.

Speaker 1 (01:00:56):
I have tried, by the way, to use it like,
like, a mental health thing, and I've had to just be
like, be ruder, push back, and it can barely do it.
It cannot fucking keep up with my swag. I just
demolish it. I just brutalize it. It's got nothing.

Speaker 8 (01:01:09):
There's, like, this Korean term called nunchi, and it means,
it means, like, you should be able

Speaker 3 (01:01:16):
To read a room emotional intelligence.

Speaker 8 (01:01:18):
Emotional intelligence. It's like, nunchi is like... I'm
not good at translating, but it's like eye power

Speaker 4 (01:01:23):
Or something like that.

Speaker 8 (01:01:24):
Cool. And so, like, Korean parents will tell you,
like, what, you don't have nunchi? Because you
should be able to look at the room and see
what's going on, see what people are saying, read between
the lines, and make conclusions based on that. Yeah.

Speaker 1 (01:01:38):
I can't do that. But that's exactly the problem
with large language models. It literally cannot
read the room. All it can do is read this
blob of text, this thing, and go, based on all
of the training I've done from stealing the internet, I
think that

Speaker 2 (01:01:52):
You will win a Pulitzer or you.

Speaker 8 (01:01:54):
Or, syntactically, your writing is similar to someone who
has won a Pulitzer.

Speaker 2 (01:02:00):
So it's comparing, and that's all that it is.

Speaker 1 (01:02:03):
It's just, each year the Pulitzers look for, like,
the most similar thing to the last year.

Speaker 2 (01:02:08):
Yeah, because that's how human beings make.

Speaker 3 (01:02:10):
And it's only read a chapter, and it's like, that
chapter? There's your Pulitzer.

Speaker 2 (01:02:13):
Yeah, but when you ask it, it's like I never
stole anything.

Speaker 1 (01:02:16):
Cherlynn, so you covered the... I actually have somehow missed
this, because I didn't want to know: the Alexa AI,
what are they fucking doing?

Speaker 2 (01:02:23):
The Alexa man, one of these people... I read an
interview with Nilay Patel and I didn't get any information.
So if you could tell me.

Speaker 6 (01:02:34):
With Panos Panay, I'm assuming, the new hardware chief at,
or senior vice president, I think, of devices and services
at Amazon.

Speaker 4 (01:02:41):
Anyway, God, you got to speak with him.

Speaker 6 (01:02:43):
And he also did an interview where I basically learned nothing,
because Panos is that kind of an enigma, by

Speaker 4 (01:02:47):
The way he just talks. Oh yeah, he is charming.

Speaker 1 (01:02:49):
I think enigma is a very kind way of saying
a guy who doesn't say anything despite his paycheck, but fair enough.

Speaker 6 (01:02:57):
That's media training, and I have to admit that I
tend to be sucked in by that. I'm the sort
of person who will fall for it. So, Alexa, and apologies to
everyone that has an Echo speaker, please go mute your
speakers right now. But Alexa Plus is the redesigned version
of the assistant that Amazon has been promising forever, right?
So it's been like, we're going to use LLMs, we're

(01:03:20):
going to use generative AI to make a more conversational language
flow between you and the assistant. And you can pause
yourself in between talking or speaking your commands, or, like,
correct yourself mid-sentence, or say a lot of things
and not lose context, with, like, follow-up questions and that
sort of thing, and it should parse it. It is
supposed to, like, handle very complex tasks, where you
can stack tasks. You can also, what I find so, like,

(01:03:42):
uh, oh, look at my Ring camera feed and see
if any, like, huskies showed up, or if anybody walked
my dog today, or something.

Speaker 4 (01:03:50):
That sort of thing. Still simplistic in nature.

Speaker 3 (01:03:53):
Set multiple timers.

Speaker 4 (01:03:55):
You know what, here's the thing.

Speaker 6 (01:03:57):
At that event, we had a hands-on, quote, in
air quotes, because we didn't actually get hands-on. We
were just shown a lot of demos, and it
felt very rehearsed. I don't like that sort of event.
I don't call it hands-on because we didn't use
it ourselves. So anyway, eyes-on. I was most intrigued, however,
by Amazon's promise of how it's going to integrate with

(01:04:18):
third-party services, which is one of the things that
these assistants have historically been

Speaker 4 (01:04:21):
Kind of like the agentic exactly.

Speaker 2 (01:04:25):
That word.

Speaker 4 (01:04:26):
I don't like. It's not the same web forms.

Speaker 6 (01:04:32):
So let's set aside the agentic question for a moment
and talk about the third party integration.

Speaker 3 (01:04:36):
Yes, Panos, what does it do?

Speaker 6 (01:04:38):
But I hope, I'm... yes, sorry, Panos. There are
three parts that Daniel Rausch, their VP,

Speaker 4 (01:04:48):
I apologize for the last name thing mentioned.

Speaker 6 (01:04:51):
So one was the APIs, right? It will work
with the people that it knows it's partners with, Uber, Spotify,
et cetera, to API into its assistant, so that you
can say something like, oh, give me an Uber for
whatever, or get me... just be very natural with how
you talk to your assistant. The second is, uh, agentic,
so it's gonna have, and this was funny to me,

Speaker 4 (01:05:12):
Alexa, a chatbot, talking to other chatbots

Speaker 6 (01:05:14):
On the internet and they will just talk to each
other on your behalf.
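For readers who want the first integration mode spelled out, here is a minimal, hypothetical sketch of routing a natural-language request to a partner API. This is not Amazon's actual Alexa Plus architecture; the intent schema, partner names, and functions are invented purely to illustrate the idea of mapping free-form speech onto a registry of partner handlers.

```python
# Hypothetical sketch of "API into the assistant": parse an utterance into an
# intent, then dispatch to a registered partner handler. Everything here is
# invented for illustration; no real Uber or Spotify API is being called.
from dataclasses import dataclass
from typing import Callable


@dataclass
class Intent:
    action: str            # e.g. "book_ride", "play_music"
    slots: dict[str, str]  # extracted parameters, e.g. {"destination": "JFK"}


def parse_request(utterance: str) -> Intent:
    # Stand-in for the LLM step: in reality a model would extract the intent
    # and slots from free-form speech. Here it is a crude keyword check.
    if "uber" in utterance.lower():
        return Intent("book_ride", {"destination": "JFK"})
    return Intent("play_music", {"query": utterance})


# Registry of partner handlers the assistant is allowed to call.
PARTNERS: dict[str, Callable[[dict[str, str]], str]] = {
    "book_ride": lambda slots: f"(pretend ride API) ride booked to {slots['destination']}",
    "play_music": lambda slots: f"(pretend music API) playing {slots['query']}",
}


def handle(utterance: str) -> str:
    intent = parse_request(utterance)
    return PARTNERS[intent.action](intent.slots)


print(handle("Get me an Uber to JFK"))
```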

Speaker 3 (01:05:18):
Why would you want them to?

Speaker 1 (01:05:19):
I respect the shit out of you, but I must say,
you just said the magic words of, it will do.

Speaker 6 (01:05:24):
It's supposed to, well, it claims listen, you get out
of my face because I write carefully.

Speaker 4 (01:05:35):
Speak when I speak. I don't edit after anyway. It's
supposed to.

Speaker 2 (01:05:43):
I do sound like that.

Speaker 4 (01:05:45):
Thank you.

Speaker 6 (01:05:45):
It's supposed to, I don't do it British, have the assistants
talk to each other. We've seen no demonstrations of this. Well,
I mean, we saw an on-stage demo ordering an
Uber, or... you can order food through Amazon?

Speaker 3 (01:06:03):
Who wants to do who wants to take exactly?

Speaker 4 (01:06:05):
Who wants to do this?

Speaker 3 (01:06:06):
Because if I'm going to ask it to order an uber,
is it going to be like Siri where I asked
it to call my mom and it called my aunt
that I haven't talked to in fifteen years, and then
it's totally different.

Speaker 4 (01:06:15):
They supposed to be.

Speaker 1 (01:06:16):
Yeah, but how can you even call an Uber on it.

Speaker 6 (01:06:20):
So here's, here's the demo they gave, and I'm sure,
I think that was what Victoria was going to get to. The
demo they gave was like, oh, what was that restaurant
I liked? And then, like, Alexa Plus will be like, oh,
make me a reservation at that second one. So
instead of having to, like, what

Speaker 2 (01:06:33):
That's almost useful.

Speaker 6 (01:06:35):
Yeah, a lot of, a lot of assistants are supposed
to get close to doing that.

Speaker 3 (01:06:39):
Except you're expected to put your trust in it.

Speaker 4 (01:06:41):
Yes, then it will get it correct every single.

Speaker 3 (01:06:43):
Point of that, and it is not capable of that.

Speaker 10 (01:06:47):
True.

Speaker 2 (01:06:47):
Are they charging for this, yes?

Speaker 6 (01:06:49):
Well no no, no, Prime Prime subscribers get it for free.

Speaker 8 (01:06:53):
You have to pay if you're not a Prime subscriber,
which, like, I wondered about. But one of the demos, so
I wasn't there, but I was listening through, uh, audio,
I was streaming. So basically there was, like,
one demo where it was like, oh, can you find
me tickets to this Red Sox game? Oh, that's
a little too pricey.

Speaker 4 (01:07:13):
Can you set an alert?

Speaker 8 (01:07:14):
And basically David was like, I already found tickets for
that exact game for fifty six dollars.

Speaker 6 (01:07:19):
Online, right. You can do it faster by Googling it yourself.

Speaker 8 (01:07:22):
And then there's also just, like, oh, you know, tell
me what time so-and-so's flight is coming in, okay,
send an Uber to them. This kind of feels like... that's not
how Uber at JFK works, right, right, right, exactly.

Speaker 3 (01:07:33):
It kind of feels like this was designed for mid
level executives who can't afford their own assistant.

Speaker 1 (01:07:38):
Yes, it sounds like it was designed to do this
fucking event.

Speaker 4 (01:07:41):
I guess.

Speaker 1 (01:07:41):
Yeah, this just sounds like they designed enough
things to demo at an event.

Speaker 4 (01:07:46):
I agree with you.

Speaker 2 (01:07:47):
I'm not criticizing you. I don't know.

Speaker 6 (01:07:49):
I want to point out that I was getting to
a point, which is that none of this was interesting
to me because it's been done Slash, been attempted to
be done.

Speaker 3 (01:07:55):
Slash.

Speaker 4 (01:07:55):
You can do it better. Google's been doing this for
five years, exactly, big, big Google, and it didn't work.

Speaker 3 (01:08:01):
Google was notably a big phony when it did it,
when it had the AI that called

Speaker 6 (01:08:08):
The restaurant reservation people were fake? Well, I don't know,
because I haven't looked at it myself. I just find
that a lot of the... I have used it myself
to get, um, the Mala Project.

Speaker 4 (01:08:21):
Yeah, great restaurant to make a reservation, just.

Speaker 1 (01:08:24):
Like I keep coming back to this thought of who
is this actually for, because I don't know. The way
I live my life is not like I'm like, what
was that restaurant?

Speaker 2 (01:08:32):
I like?

Speaker 1 (01:08:33):
And I need to make a reservation at the restaurant
I like enough to make a reservation, but not enough
to remember it.

Speaker 8 (01:08:39):
You're getting to a point, a problem that I've had
with AI, which is that, you know, you have to know
how to prompt it.

Speaker 4 (01:08:46):
Yes, but that's what you want.

Speaker 6 (01:08:48):
That's what Amazon's trying to solve with the new Alexa thing,
because trying to yes exactly like.

Speaker 4 (01:08:52):
Which I I'm sorry, I had a long conversation with
someone else.

Speaker 3 (01:08:55):
I invented that type of yes exactly.

Speaker 6 (01:08:57):
So you got us to learn how to talk to
our assistants in a very specific way. Now you want
us to talk naturally to it, as if that's not
learning a whole other type of behavior. It's a whole... as if
it's going to work like I'm talking to my friends.

Speaker 5 (01:09:08):
It's not.

Speaker 6 (01:09:08):
I can get, like, all kinds of incensed about this.

Speaker 8 (01:09:11):
I was testing what was it. I got a demo
of Project Astra back yeah, yeah, and I was just
like talking to it and they're like, no, no, you can
talk normal and.

Speaker 6 (01:09:19):
Project Astra, yes. You remember, I've had, from the one team at
Google, freaking, freaking years of going, hey Google, sorry, hey Google,
Alexa, you

Speaker 8 (01:09:30):
Know like that exactly very I.

Speaker 3 (01:09:35):
Am so excited for a bunch of your listeners every day.

Speaker 6 (01:09:39):
We need a warning at the start of this episode
to nobody.

Speaker 4 (01:09:44):
Okay, all its fault better production.

Speaker 3 (01:09:49):
It's on your listeners for listening to you without I will.

Speaker 2 (01:09:52):
Get if I remember which I want.

Speaker 4 (01:09:56):
Sorry.

Speaker 6 (01:09:57):
At the same time, I was gonna say, yeah, Project Astra
is interesting. I don't know if it will work,
but Astra is Google's, like, experimental implementation of Gemini, where
you can use your phone's camera to point it around
the room and then it'll remember things that it saw. Even
if... I want to be really clear about

Speaker 8 (01:10:11):
You know, like, if I had a dollar for
every single time a tech exec said multimodal, I'm just like.

Speaker 3 (01:10:18):
I want to be really clear on something. They have
been They've been promising this stuff for fifteen years at
this point. Alexa came out in what twenty fourteen or yeah,
twenty fourteen, that's when they first announced this. There's a
big reason we haven't seen huge changes even with fairly
good generative AI and fairly good large language models that
can listen to us and really understand our speech. And

(01:10:39):
that is because computers still can't process all the stuff.
Because if you and I are both in the same
room and you're asking Alexa for that restaurant, yeah, and
I'm saying no, no, no, don't forget the restaurant.

Speaker 4 (01:10:49):
It maybe can only do one stream of data.

Speaker 2 (01:10:52):
Okay? From the Verge.

Speaker 1 (01:10:55):
June thirteenth, twenty sixteen: Apple opens up Siri to app developers.
This shit has been here for nearly a decade.

Speaker 6 (01:11:02):
I mean, look, I will, I will take issue with
that, like nothing has changed, because I think the Echo
speaker alone has changed the way a lot of... you disagree? Do
I speak to my speaker a lot? I do. Like, it's hands-free
control, smart home control for me. Yes, elderly people,
people with accessibility issues, mobility issues, they can

(01:11:27):
use that and it works well. And they're working with
voices to make it better for people with speech impairments.

Speaker 2 (01:11:30):
And I agree that sounds important, but.

Speaker 3 (01:11:32):
It's, it's... no, I think, I think voice control, I
don't mean Alexa. I will say all of these products
currently are not as good as what they suggest, exactly.
They are not.

Speaker 4 (01:11:42):
They can't make promises like there's that they that.

Speaker 1 (01:11:46):
They hang their hats on things like that, and you're
right to say that, by the way. But they're like, oh,
you can't hate this because of the elderly.

Speaker 8 (01:11:53):
No, no, no, no, you can. You absolutely can. They're making
it universal, right? It's like smart glasses: when they first
came out, everyone was like

Speaker 6 (01:12:01):
Fuck that. Google Glass was cute, though. No, it's still my
LinkedIn picture, anyway.

Speaker 8 (01:12:08):
Just but it had use cases in enterprise for a
long time. And that's because it had a specific use
case for a specific time, and it's too expensive.

Speaker 1 (01:12:20):
And how often was it actually used for that? Because
that's the thing they always say, it's got play in the.

Speaker 2 (01:12:25):
Enterprise, How much play? How much revenue? How much do you.

Speaker 3 (01:12:29):
Know how many people are putting on a HoloLens right
now to turn some sort of wrench on a pipe
like zeros zero.

Speaker 4 (01:12:39):
That's a good minute there.

Speaker 8 (01:12:41):
But, like, there's nothing wrong with some of this tech
just being very use-case specific, and just... if
they were selling it on that, if they were being
honest about it. But they're making it something for everybody.

Speaker 4 (01:12:54):
That's how they make the money universe want to.

Speaker 2 (01:12:55):
They don't even make the money now. They've just lost billions
of dollars.

Speaker 3 (01:12:59):
But they don't even like it's not that it's to
get a higher valuation from the stock market. It's all
just for investors to be like, well, they're going to
do something in twenty years. I mean, that's the entire.

Speaker 8 (01:13:11):
It's in our... So, like, what was the other
thing that came out this week? They're gonna add,
like, live translations to the AirPods, possibly, and all of that sort

Speaker 1 (01:13:20):
Of which I think is already on Google.

Speaker 8 (01:13:24):
They talk about how, like, AI can enable translation. I've
tested a bunch of live translation stuff, and it's only
good for, like, where is the bathroom, that kind of
business. It's not supposed to have real conversations. And I wrote about
this when the Humane came out, and it absolutely could not,
for its life, translate a single fucking thing.

Speaker 4 (01:13:45):
I said, as like, it's.

Speaker 2 (01:13:48):
Wrong, who don't speak the fucking language?

Speaker 3 (01:13:51):
That too?

Speaker 8 (01:13:52):
But then, like, you know, the other thing is, it's
just like, there are times where I wish that I
had some sort of, like, AI translation in my
life that I could trust. Like when my mom
was sick. The thing that they don't tell you about
neurodegenerative diseases is that you will lose your second
language as it happens. So my mom lost her ability
to speak English. My father lost his ability to speak

(01:14:13):
English as well. They raised me so that I did
not speak Korean. Well, I can understand it, but I
can only say, like, baegopa, I'm hungry.

Speaker 6 (01:14:21):
So they can understand you, but you can't get them
to understand you.

Speaker 8 (01:14:24):
So it was like my family, I mean, we operate
as if I am Chewbacca, the English speaking Chewbacca, and
everyone else speaks Korean and we understand each other. I
am Korean Chewbacca in my family, but in a case,
just give me.

Speaker 6 (01:14:36):
A phrase in Korean Chewbacca that would be.

Speaker 8 (01:14:40):
Like I I'll just be like so, actually, I like
my Korean is such that in my brain that Like
one time I was in a taxi cab with non
Korean speakers and he was asking us a question and
I understood him and I told him where to go.
I don't know the words I use, but I told
him where to go accurately and my friends were like,
you lying bitch, you speak Korean, and I was like.

Speaker 4 (01:14:58):
No, I really can't.

Speaker 3 (01:15:00):
Just blacked out.

Speaker 6 (01:15:01):
I blacked out and I just told you. Well, it's
like when Ron used Parseltongue in the Chamber of Secrets.

Speaker 4 (01:15:06):
Yeah, it's like that's my Korean.

Speaker 6 (01:15:07):
No, no, no, Ron did it in the final one,
Deathly Hallows.

Speaker 4 (01:15:10):
Anyway, but like me speaking.

Speaker 8 (01:15:12):
Korean, I agree, but yeah no, so like when you
get to the translation thing, it can't do slang.

Speaker 4 (01:15:20):
It can't of course.

Speaker 6 (01:15:21):
Yeah, that... like, I will admit, when I did talk
to the Humane AI in Cantonese, it did do
colloquial better than any other language translator I've used, which
is hilarious.

Speaker 1 (01:15:29):
It feels like translation, though, is the most obvious one
where it can fail, because of the nuances of language. And
so, I have a coordination disability, dyspraxia, it affects spatial awareness.
It really affects my ability to learn languages, because structural
concepts, not so good. So learning... I couldn't learn French.
I failed French, Latin, German, Spanish. I'm like... the school

(01:15:50):
kept throwing them at me.

Speaker 2 (01:15:51):
It's like a fuck you. I really got good at failing.

Speaker 1 (01:15:54):
I'm like the Fail Master, and it was because, like,
the way English works is very different. Also, I didn't
care, I was very depressed. But it really was the structural stuff,
like trying to learn another language when you don't get
those concepts. And I imagine large language models also have
this problem.

Speaker 8 (01:16:09):
And add to that the fact that not all languages are
alike. English is a very low-context language, and
Asian languages are high-context.

Speaker 2 (01:16:17):
I don't even know what that means.

Speaker 6 (01:16:18):
So what that means is a lot of these areas.

Speaker 4 (01:16:21):
A lot of things are unsaid.

Speaker 8 (01:16:22):
A lot of things are like how you talk to
a person in Japanese and Korean, which are my languages
that I studied, depends on are they older than you.

Speaker 4 (01:16:32):
You're going to talk to.

Speaker 1 (01:16:33):
Them differently, and there's no way for the language to
express that.

Speaker 8 (01:16:38):
You're going to be using things in the language to express that.
Or, like, if I say something to you
in Japanese... the classic business example
that they give for Japanese is, you ask a yes
or no question and they go, like, ooh, that might
be difficult. That means no, okay.

Speaker 6 (01:16:54):
Singlish is like this. Singlish has a saying: can
is can, and cannot means it can't be done. Like,
can you, can you do this?

Speaker 1 (01:17:00):
Yeah you can?

Speaker 4 (01:17:02):
Yeah, again, that's the thing. How is a machine meant
to pick that up? How is it supposed to know that?

Speaker 3 (01:17:09):
Well, this is why not everyone at the UN is
wearing a Bee at the moment, or the Google headphones.
This is why they have real human translators. We'll see, we'll
see if this translating stuff works.

Speaker 2 (01:17:20):
Like you're paying for the trust.

Speaker 3 (01:17:22):
Yeah, yeah, you have to pay for the trust because
Google Translate is really really great if I need to
go double check something. But if there was ever a
time where I needed to be like, hey I need
to know what this is in Japanese, I am not
asking Google Translate.

Speaker 8 (01:17:36):
I'm texting very basic things, like if you're saying, like,
oh, I have... oh, so, another thing was, like, I
would have friends visit me when I lived in Japan,
and they would be like, oh, I can't eat this
because I'm a vegan. I was like, we
can't say that you're a vegan. You want to say
that you have an allergy to meat, and then they'll
take it out.

Speaker 3 (01:17:54):
That is a psychological allergy.

Speaker 4 (01:17:56):
Yeah.

Speaker 1 (01:17:57):
See, this is the thing that drives me insane about
all of this, because it loops right back to hating
Kevin Roose's work.

Speaker 2 (01:18:03):
But it's like these people vigorously beating off about AGI.

Speaker 1 (01:18:07):
It's like based on an entire industry of people saying, yeah,
it sort of works, but it doesn't. And by the way,
it burns billions of dollars and by the way.

Speaker 2 (01:18:16):
Will it get better?

Speaker 1 (01:18:17):
Yeah, probably not, but it will. My company is so
powerful and strong, and so weak and sick. It's the
best thing ever. Also, it's dying.

Speaker 3 (01:18:25):
And this is exactly what they said with VR. They said,
you know what, give it a minute. If we just
if we just keep going, everybody is going to everybody
is going to be doing VR. Everybody is going to
be moving on to AR and it doesn't happen because
they're trying to force something. It's not the.

Speaker 1 (01:18:38):
Phone. But it's kind of, but it's kind of like
VR, in the sense that, like, for VR to work,
in the Ready Player One thing, which is fucking insane
if you've read Ready Player One

Speaker 4 (01:18:45):
To be real touch point.

Speaker 1 (01:18:47):
Also a terrible film, worse book, simply to remember all
the things, that you know... but also, that film was a dystopia.
The other thing is, it's an extrasensory psychological experience.
You would need things that do not even exist, not even
the beginnings of. You don't have something that can
take over your senses and let you move in a

(01:19:10):
vastly different space.

Speaker 4 (01:19:12):
Accessibility in VR and augmented reality is like, really.

Speaker 2 (01:19:15):
Tough, We're not even there.

Speaker 1 (01:19:17):
And in the same way with AI, we don't have
a thinking computer. This shit can't do anything on its own.
You try and, like, oh, agents here... fuck you, agents
are nowhere. Stop using that word. Every time you use
the word agent, normal people think of other things. Anyway.

Speaker 2 (01:19:33):
For reference to people.

Speaker 4 (01:19:35):
Loves voices.

Speaker 2 (01:19:36):
I love doing voices. I love voices.

Speaker 1 (01:19:39):
It's just every time I hear from these fucking companies
and how much money they have, and they're like, here
is something that sucks.

Speaker 2 (01:19:47):
It stinks. You should be so excited. You need to
be excited for this. It sucks. It doesn't do the
thing you want it to do. Can it do this?

Speaker 1 (01:19:54):
No?

Speaker 2 (01:19:55):
Will it do this?

Speaker 3 (01:19:56):
Trust me, well, if you get us to our next
round of funding.

Speaker 2 (01:19:59):
But even if you're, like, Amazon or Google.

Speaker 6 (01:20:01):
Yeah, the thing I hate most about all of this,
beyond everything we've already said, and I think we've
already kind of alluded to this with what you just said, Ed,
which is, we're spending a lot of money on things
that don't work, but in the process we're generating a
lot of waste, generating a lot of, like, energy loss,
all the server farms that are just going to take
over this world just to back up and store all

(01:20:22):
of that data that they're scooping in.

Speaker 3 (01:20:24):
Like, that's the only reason I love AI: every time
I'm like, am I pretty, I know it is costing
someone so much money. Little shitty thing, like, you're the
cutest... burning the environment

Speaker 2 (01:20:37):
In a furnace.

Speaker 3 (01:20:38):
Yeah, yeah, you're destroying the environment.

Speaker 4 (01:20:41):
It's like how many validation.

Speaker 6 (01:20:43):
From how many bees have they sold that.

Speaker 4 (01:20:46):
I don't know.

Speaker 8 (01:20:47):
They're like they're in like a beta thing technically an
Apple Watch app too.

Speaker 4 (01:20:51):
You don't actually need to buy their hardware.

Speaker 2 (01:20:55):
This company rules. I fucking love this.

Speaker 6 (01:20:58):
So how many Humane AI Pins have caught on
fire and are now in the trash because they turned
the company off? They said ten thousand units, they said something
like that.

Speaker 1 (01:21:08):
Humane was the biggest, like, dunce moment for members of
the media I'm not going to name. So many people
were like, this is going to be great.

Speaker 2 (01:21:13):
Even though it was like, yeah, it's a seven.

Speaker 8 (01:21:16):
They were mad at me when I wrote my translation piece,
and they're like, we need to talk, and then I was like,
let's talk.

Speaker 4 (01:21:21):
They reached out or yeah they ghosted me.

Speaker 2 (01:21:24):
Well they go to management consultants both like in dignity.

Speaker 1 (01:21:28):
Anyway, we have to bring this to an end, because
we are running out of time, and otherwise I'm just
going to start reading the Kevin Roose bit again, that
motherfucker. I just...

Speaker 3 (01:21:38):
It's an irresponsible piece, too.

Speaker 1 (01:21:39):
It is irresponsible, and I think that that is a
good kind of place to get some final thoughts where
it's like, it's not just that these things don't work
and they stink and they're expensive, and they burn billions
of dollars, they destroy the environment, they steal from everyone.

Speaker 4 (01:21:52):
Not just that.

Speaker 2 (01:21:53):
Sure, yeah, other than.

Speaker 1 (01:21:54):
Those things, by representing them as imminently useful and helpful,
the only thing you are doing is empowering the powerful
and creating more cycles where useful things don't get funded
and useless things get more money than ever.

Speaker 2 (01:22:08):
And it's disgusting.

Speaker 1 (01:22:10):
I actually... I know it sounds a little direct to
be like, the guy said the computer was too smart,
but you're speaking in the New York Times. And
it's not just him, it's Ezra Klein as well, another
fucking moron. Jesus fucking Christ. These guys... put
anyone in this room in their place, I think they'd
do a much fucking better job. But it's just like,
it's empowering people who do not have anyone's best interests

(01:22:33):
in mind or actually fixing anyone's real problems.

Speaker 4 (01:22:36):
And or are out of touch.

Speaker 2 (01:22:37):
Yes, I.

Speaker 3 (01:22:40):
I still maintain, exactly, past a certain income you're not
allowed to be online. Like, you can't talk online, because you're
so out of touch from the rest of us. I
don't want to hear it.

Speaker 4 (01:22:51):
What's the number.

Speaker 3 (01:22:53):
Way more than I make?

Speaker 4 (01:22:54):
Well, yeah, or one hundred mili.

Speaker 3 (01:22:58):
It's like a couple of million, right? Yeah. Like, if
you're like, oh, I can go spend twelve K on
a first-class ticket, you have too much money, so
you can't talk about anything.

Speaker 1 (01:23:06):
No. It reminds me of the Sam Altman moment on
that my-favorite-notebook podcast, where he's

Speaker 10 (01:23:10):
Like, yeah, so I write my little notes and I
crumple them up and I throw them behind me, and
that's how really good acts. The house can keeper comes
by and g I don't know where she come which
means she's there all the time. And it's just like,
first of all, you said in front of a person
you have a housekeeper. Second of all, you could not
remember when they're there, which means they're always there. A
third of all, you're just creating Like your fucking AI,
You're just creating trash information you throw around and expect

(01:23:31):
someone else to clean up.

Speaker 2 (01:23:33):
Loathsome little fucks. Anyway, great place to end it. Cherlynn,
where can people find you?

Speaker 6 (01:23:39):
Engadget dot com, and I guess, uh, I don't know
what social media platform. Just shout out Bluesky, cherlynn
dot bsky dot social. Alex?

Speaker 8 (01:23:48):
Uh.

Speaker 3 (01:23:49):
Yeah, I'm about to start... start, yeah, sure, I'm gonna
start this over again.

Speaker 2 (01:23:53):
That's enough.

Speaker 3 (01:23:54):
Yeah, yeah, I am about to start a four-week
special engagement at Gizmodo. I'm gonna be hanging out with
those folks and saying all sorts of things that really
need to be said about technology, because, oh boy, are
we in a moment. And you can also find me
on all social media platforms until I make enough
money where I don't have to use them.

Speaker 8 (01:24:14):
Beautiful. Victoria? You can find me at vicmsong on
every social media platform, and I write at The Verge.

Speaker 1 (01:24:21):
You can find me in the New York Times. My
name is Kevin Roose, and I will be saying multiple...
I am Kevin Roose.

Speaker 2 (01:24:29):
Now you can find me on all my social media platforms.
Exit and there isn't another one, thankfully, because I think they.

Speaker 3 (01:24:33):
Weren't you in the New Yorker, though? Aren't you in
the New Yorker recently?

Speaker 2 (01:24:38):
Yeah. They caught the too-spicy me.

Speaker 1 (01:24:41):
They're, like, right at the beginning of it. I didn't
think that he was recording yet, and he was like...
I was like, yeah, it's kung pao chicken spicy, and
they're like, yeah. Thankfully he kept the bit. It was
like, is the kung pao chicken spicy? And the bit where
I was, like, eating white rice, and me going on
Hot Ones and just drinking the milk.

Speaker 2 (01:25:04):
Please don't kill me. But yeah, thank you for listening.
At one.

Speaker 1 (01:25:07):
It's been another Radio Better Offline, and Alexa, play Tool,
Forty Six and 2.

Speaker 4 (01:25:13):
That's cruel some banks.

Speaker 2 (01:25:16):
I don't care. It's one of the greatest songs of all time.
Thank you for listening to Better Offline.

Speaker 11 (01:25:30):
The editor and composer of the Better Offline theme song
is Matt Osowski. You can check out more of his music
and audio projects at mattosowski dot com. M

Speaker 1 (01:25:39):
A T T O s O W s ki dot com.

Speaker 11 (01:25:43):
You can email me at ez at betteroffline dot
com, or visit betteroffline dot com to find more
podcast links and, of course, my newsletter. I also really
recommend you go to chat dot wheresyoured dot at
to visit the Discord, and go to r slash

Speaker 2 (01:25:56):
BetterOffline to check out our Reddit. Thank you
so much for listening.

Speaker 4 (01:26:01):
Better Offline is a production of Cool Zone Media. For
more from Cool Zone Media,

Speaker 7 (01:26:05):
visit our website coolzonemedia dot com, or check us out
on the iHeartRadio app, Apple Podcasts, or wherever you get
your podcasts.