January 23, 2024 46 mins

At this year's CES, several trends revealed themselves among the exhibitors. But none were more prevalent than the incorporation of artificial intelligence. How are companies using, or at least claiming to use, AI in their products this year?

See omnystudio.com/listener for privacy information.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Welcome to TechStuff, a production from iHeartRadio. Hey there,
and welcome to TechStuff. I'm your host, Jonathan Strickland.
I'm an executive producer with iHeart Podcasts, and how the
tech are you? So, long time no podcast, y'all. The last

(00:25):
time I actually recorded a new episode, it was about
CT scanning machines, and I talked about how I had
just had my first CT scan. That was on December
thirtieth of twenty twenty three when that happened, and it
was all part of a medical emergency I experienced over
that last weekend of twenty twenty three. But guess what,

(00:48):
The very same day that I was recording that episode,
I ended up having another emergency and I had to
go right back to the hospital. My blood pressure had
spiked again. It got to the point where I was
completely incoherent, more so than usual. You know, I realize
my coherence is a spectrum. Well this was off of it,

(01:10):
and I honestly only remember the second half of what
happened at the hospital once I was moved into the
Intensive Care Unit, or ICU. Before that, apparently I spent
several hours in the emergency room, but I have no
memory of it. I was told I even received a lumbar puncture, which, as I understand it, those are pretty painful,

(01:33):
but none of that is in my mind. But a
group of doctors and nurses took very good care of me,
both in the ER and the ICU. And as I am recording this now, I'm just back from the hospital. I had to go in for a surgical

(01:54):
procedure that is connected to all the emergency stuff. The
surgery went pretty well. The recovery has been rough, but
hopefully this will get me toward being on the right track.
It has been a heck of a way to start
off twenty twenty four. And as I mentioned in that

(02:16):
last new episode from me, I am scaling back on
the number of shows that I do per week. One
contributing factor to the blood pressure issue is stress, and
we figured that reducing my stress should be a little
bit helpful. Now, that's just one part of what I'm
doing to get better. It's not like cutting back is

(02:39):
going to magically make me healthy. I have to do
a lot of other stuff. I'm on medication. I've got
a new approach to my diet and exercise, which is
well overdue. So it's just one piece of a bigger
puzzle anyway. Enough of all that. One of the many
things that I missed out on while I was in
the hospital was CES, which sometimes is also referred to

(03:04):
as the Consumer Electronics Show, although I think now they
just prefer to have it called CES. And in case
you're not familiar with this event, CES is a huge
trade industry conference in the United States in which companies
come together in Las Vegas, Nevada, every January to focus
on all things in the consumer electronics space and beyond.

(03:27):
If we're being honest, consumer electronics is really just one
part of it, which might be why they prefer to
call it just CES these days. It's at CES where
manufacturers and retailers will show off products intended to launch
within the next year or so. Sometimes they just show
off things like prototypes that are never going to see
the light of day, but might show off some features

(03:50):
that could find their way into products in the future.
Various members of the media attend in an effort to
get really cool stories. And you know, you've got other
industry professionals who go for all sorts of reasons, you know,
analysts and such. Most of them are really looking for
ways to find free food. At least in my experience,
that's kind of what the main priority seems to be.

(04:10):
And every year at CES, trends emerge. Sometimes these trends
will establish themselves and eventually become a pillar in technology.
Like think about flat screen televisions. I mean, that's the
common form factor now, but once upon a time it wasn't,
and when those started to pop up, they eventually did

(04:33):
become the standard. Or things like Bluetooth connectivity and smart
home technology those fall into these categories too. Now. Other times,
various industries will push really hard to get a technology
kind of established in order to make it a trend,
and it ends up going nowhere. So the one I
always cite is three D television. When I first started

(04:55):
going to CES, which is back in the mid two thousands,
all the major companies in the television space were gung ho
on three D televisions. There was a ton of support
behind three DTV within the film and TV industries, you know,
because three D formats would be a lot harder for
people to pirate than your typical visual media. So the

(05:16):
studios were really gung ho on three D TV as
well because they were like, well, if this means people
can't just steal our stuff and they have to buy
it from us, then that means more money. So let's
definitely make three D television a thing. I mean, what
better way to crack down on those pesky pirates than
to convince them that they all have to watch the
stuff in three D. And you know a lot of

(05:38):
pirates can't even watch stuff in three D because they
got eye patches, which means they have a lack of
depth perception. That's a parallax joke. Anyway, the whole gambit
for three D television didn't work out. Consumers rejected three DTV.
Most folks decided they didn't want to have to wear
glasses just to watch television in their homes, or that
having yet another component that could go missing was a

(05:59):
real hassle. Like if you've ever been one of those
folks who's like, where the heck did the remote control go,
imagine doing that with glasses as well. Or they were
arguing that there just wasn't enough compelling material that was
in three D to justify the investment of purchasing one
of these televisions. So after a couple of valiant years of trying to make three D happen, it followed in

(06:20):
the footsteps of Fetch and didn't happen. This year was
no exception. There were other trends that came out this year.
A few different ones popped up, like transparent OLED television displays. There were a few of those, and I'll
probably do an episode talking about that in the future.
But undeniably, one type of tech really dominated conversation at

(06:41):
the show floor this year, and that was artificial intelligence.
And we all know that AI is a big deal.
Companies like Microsoft and Google and Amazon and Apple are
all struggling to find ways to incorporate AI into their
business models in a way that benefits them, and that
might also mean that these companies will make cuts to

(07:03):
actual human staff in the process if they find that
the AI can take on some of the load that
people would normally carry. And if you read up on
any business conference, any conference that has happened in the
last year and a half, you're going to see a lot of discussion devoted to AI and how
it's going to change everything. And often at these exact

(07:25):
same conferences, at these exact same speeches, you're going to
find hardly any detail as to how we're going to
get there or what strategies companies should employ while forging the path. It just becomes, like, this will change everything. How? It becomes like the underpants gnomes in South Park.
You just have a bunch of question marks and then

(07:45):
at the end it says profit. So essentially, everyone knows
that AI is powerful and it's important, but we don't
have wide agreement on how it should be developed or deployed. Also,
it's not like this was the first time that AI was part of the conversation at CES. In recent years,
lots of companies have leaned on AI for all sorts

(08:05):
of things, from voice assistants to image recognition to robot navigation.
If I'm being honest, I would say the AI on
display this year more often seemed to lean on the
large language model and generative AI versions of artificial intelligence,
the kind of stuff that we've been seeing so much
about from companies like open AI and Google. And I

(08:28):
get it. They're very flashy and they are impressive when
they're working properly, but it gives a very narrow view
of what artificial intelligence is, and AI is so much
more than just large language models and generative AI. But
unfortunately that's harder to sell. So it gets easier if

(08:48):
you just kind of reduce it all to one form
and say, like, this is what AI is. I don't
think that's very wise because it's misleading, but you know,
I'm one voice in a big crowd. Now, there were
a couple of major approaches that I saw while reading
up on how various companies were positioning AI in their

(09:10):
pitches to the media over at CES. The big companies,
generally speaking, were actually a little less bullish. You know.
They didn't position AI as being a definitive feature in
their technologies; that did not seem to be the big thing. Like, it wasn't a big flashing neon sign saying this has AI in it. So in their products,

(09:31):
AI was often a component that they might mention as
contributing to the functionality, but it wasn't positioned as being
the main event. Now, some smaller companies went in the
opposite direction. They developed products that put artificial intelligence front
and center, like this is AI. And we're still seeing
some examples of companies that are shoving AI into their

(09:53):
marketing message, even if it seems like they still don't
quite have a handle on how AI adds value or functionality,
or in some cases, I'm not entirely convinced that AI
is actually part of the whole thing in the first place.
Sometimes it just seems like we need to put AI
in there because that's a buzzword that if we don't

(10:14):
put it in there, it's going to seem like we're
falling behind. But I would argue that, at least in
some of the cases I was looking at, AI really
was a grandiose way of saying whatever it was the
product was supposed to do. So let's get started. Let's
talk about an implementation that I actually think is a
pretty good idea, and it's BMW's use of a technique
called retrieval augmented generation. So BMW has a voice assistant

(10:39):
that you can get in certain BMW vehicles, and it uses generative AI to respond to your requests. But this voice assistant can't just chat about anything at all. It's not granted access to a limitless selection of topics,
So you can't just be like on a long road
trip and you're like, let's start talking about Sartre, or

(11:00):
can we have a deep discussion about Lord of the Rings.
It can't do that. Instead, this voice assistant can really
only provide information about the vehicle itself. So this restriction
means that the voice assistant isn't prone to hallucinations or confabulations.
So that's the tendency for generative AI to just plane
make stuff up on occasion. Right, Sometimes AI, in the

(11:24):
lack of information, will make something up and it sounds
like it's reliable, but it turns out it's completely false.
This is why AI critics warn that without strong guidelines,
AI could manufacture and distribute misinformation in such a way
that the misinformation seems like it's reliable. And it's not malicious.
It's not that the AI is trying to mislead. It's

(11:46):
just trying to answer a question and doesn't have the answer,
and like some other people I know, it's too scared
to say I don't know the answer to that question.
So BMW built a barrier around their voice assistant's knowledge
base to prevent this from happening. They said, well, it's
only going to be restricted to matters that involve the

(12:08):
vehicle itself. So the assistant draws upon the power of
Amazon's Alexa large language model, and it can interpret what
you mean when you ask questions. So that way, even
if you are not a car person and you don't
really know how to frame a question properly, like you
don't know what you're asking about, you're just trying to

(12:29):
find an answer to something you don't know the answer to, well,
this assistant can still try to help you get to
where you need to go, like what information do you
need to know? It can actually ask follow up questions
to you in order to get a better understanding of
what it is you're asking about, So like if you
don't know what it is you don't know, it can
at least ask follow up questions to try and narrow

(12:51):
down what the matter is. And then it can simplify explanations.
If you find something too technical to follow, you could say,
can you explain that to me in plain English? And it can reduce the complexity and do it. Now, the beauty of this is that the voice assistant becomes
sort of an interactive owner's manual that's capable of rephrasing

(13:12):
passages if they just don't make sense to you, which
is great if you've ever flipped through a car owner's manual and encountered stuff where you're like, this doesn't seem
like a human being wrote this. I don't understand what
they're actually getting at. Well, imagine that it would be
able to rephrase it so that it could convey that
meaning to you. Like, that to me is incredibly valuable.
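To make that retrieval augmented generation idea a bit more concrete, here is a minimal Python sketch of the general pattern. To be clear, this is a hypothetical illustration, not BMW's or Amazon's actual system: the tiny manual corpus, the keyword-based retrieve function, and the refusal message are all invented. The point is just the shape of the approach: answers have to be grounded in passages retrieved from the owner's manual, and anything that can't be grounded gets a polite refusal instead of a made-up answer.

```python
# Minimal retrieval augmented generation (RAG) sketch.
# Hypothetical illustration only; not BMW's or Amazon's implementation.
from dataclasses import dataclass

@dataclass
class Passage:
    topic: str
    text: str

# Tiny stand-in for an owner's-manual knowledge base.
MANUAL = [
    Passage("drive modes", "Sport mode sharpens throttle response. Eco Pro mode reduces fuel consumption."),
    Passage("tire pressure", "Check tire pressure monthly. Recommended values are printed on the door jamb sticker."),
]

STOPWORDS = {"the", "a", "an", "is", "to", "me", "my", "i", "about", "tell", "which", "what", "do", "can", "you"}

def tokens(text: str) -> set[str]:
    """Lowercase words with punctuation stripped, minus common filler words."""
    return {w.strip(".,;?!").lower() for w in text.split()} - STOPWORDS

def retrieve(question: str, top_k: int = 1) -> list[Passage]:
    """Naive keyword-overlap retrieval; a production system would use vector embeddings."""
    q = tokens(question)
    ranked = sorted(MANUAL, key=lambda p: len(q & tokens(p.topic + " " + p.text)), reverse=True)
    return [p for p in ranked[:top_k] if q & tokens(p.topic + " " + p.text)]

def answer(question: str) -> str:
    passages = retrieve(question)
    if not passages:
        # The guardrail: if nothing in the manual grounds the question,
        # decline instead of letting a model confabulate an answer.
        return "I can only answer questions about this vehicle. Could you rephrase?"
    context = " ".join(p.text for p in passages)
    # A real assistant would hand `context` plus the question to an LLM here;
    # for the sketch we just return the grounded passage.
    return f"From the manual: {context}"

print(answer("Which drive mode saves fuel?"))
print(answer("Can we talk about Sartre?"))
```

A production version would swap the keyword matching for embeddings and feed the retrieved context to a large language model, but the guardrail logic stays the same.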

(13:33):
So for example, if your vehicle has multiple drive modes, right,
and you have no clue which drive mode should be
used in any given situation, the assistant can help. You
could ask it and you might find out that if
you were to switch to a different driving mode, you're
going to get way better performance on the roads that
you're currently driving on and your drive is going to

(13:54):
be far more pleasant. Or maybe that by making a
few other changes you can dramatically decrease the amount of
fuel consumption you're going through and save some money. I
think that's a pretty neat idea. Now, I should say
that the description I read about this tool was done
as part of a non interactive demonstration. So the reporter

(14:15):
whose report I read is Tim Stevens. He wrote a
piece for Ours Technica, and he was not allowed to
ask any questions of the voice assistant himself, so he
could not interact with it. Instead he had to just
witness a kind of a performance, almost. So BMW representatives ran the whole demo. And it's hard to say, you know, how far along this product is, or how

(14:36):
reliable it is, or whether or not you could prompt
it to make a mistake, because often that becomes a
thing where people will try to get AI things to
mess up to see if, in fact it's possible, because
it's better to find out through testing than to find
out through actual use. But I think this is an
AI implementation that really makes sense, and I could easily

(14:56):
see other companies in other industries making similar use of
this technology to make it easier to navigate increasingly complex systems,
not just vehicles, but all sorts of stuff like I
can see businesses that are incorporating generative AI to do
it in this kind of way where they are effectively
geo fencing the AI so that it's only focused on

(15:18):
the things that are pertinent to the business. That just
makes sense to me. Okay, we're going to take a
quick break. When we come back, I'm going to talk
a lot more about AI at CES twenty twenty four. Okay,

(15:38):
we're back. So we talked about BMW's approach, which I
thought was really cool. I don't know if the tool
itself is cool. I just think the strategy makes sense.
But let's contrast BMW's rather focused application with a handheld
device that is completely dependent upon AI and large language
models or large models, I probably should say, and that

(16:02):
is the Rabbit R one. So this gadget got a ton of attention at CES, at least from the media. As I understand it, it did not really go viral
over on Twitter. I'm not on Twitter anymore, so I
just have to take other people's word for it. But
like whenever I went to any outlet, if I were
looking at Wired or The Verge or Ars Technica or anything,

(16:28):
there were articles upon articles about this thing, the Rabbit R one. This kind of happens at the show occasionally, right? If you go to CES, sometimes a
particular thing will stand out, and it's usually something that's
not from a big company. It's usually something that's kind
of a surprise, and it often can be a small thing,

(16:49):
but it just captures everyone's attention and then it generates
tons of articles. So like in past years, there was
a vibrating fork which was a real thing. It was
a fork that had a little haptic motor inside of it,
and if it detected that you were eating too quickly,
it would vibrate so that the food would fall off
your fork, forcing you to eat more slowly. That was

(17:11):
the whole concept. That was a big deal one year.
The pebble smart watch was a huge deal one year,
and that doesn't even exist anymore. They'll fitbit purchase that
and then kind of killed it off. So the pebble,
it shows that these little things that can become viral
and get everyone's attention and get a lot of excitement,

(17:32):
that doesn't mean that they are guaranteed to stick around.
They're not guaranteed to be a success. The Pebble was
huge and doesn't exist anymore, so that can raise the question: will the Rabbit R one have better luck than, say, the Pebble? Well, let's talk about what it is first.
So it's a handheld device. It's kind of square ish

(17:53):
in shape. It's about half the height of your typical smartphone,
so it's not as big as a smartphone. Most of
the front face of this device is covered with a
touch screen. That touch screen typically has a sort of
cyber rabbit icon face on it. To the right of
the screen is a little physical scroll wheel. It's mounted

(18:17):
so that you can scroll up or down with it.
Above that is a forward facing camera. It has a
SIM card port, it's got Wi-Fi and Bluetooth connectivity. It's
got a speaker, it's got a microphone. It's got a
few more things. But the physical stuff isn't really where
the story is with this device. See, the company behind the Rabbit R one, Rabbit in other words, says that

(18:40):
it wants to create the next generation of computers. But
these computers should be able to interface with all the
functions we rely on today without us having to deal
with the actual applications that provide us those functions. So,
for example, rather than having to open up an application or

(19:02):
a program and run a function, you could just tell
the computer what it was you wanted to have happen
and the computer would do it. So these computers should
be able to interface with all the functions we rely
on today on our behalf. They can act almost like a middleman. So for example, I could get

(19:22):
a smartphone and download and install apps on my phone
to do all sorts of stuff like to call a
cab or a ride share. I could have an app
to order food. I could have a different app so
I could listen to music, a different app so I
could watch films, or maybe I have apps that are
various games. But I have to download and install each

(19:44):
of those apps in order to do that. And obviously
lots of different companies make these apps, so these interfaces
aren't universal, right, So it might actually take me a
little while to learn how to navigate those apps properly,
because one app may use one kind of interface, another
might use a different one. And I don't know if
you've had the experience where you've opened up programs and

(20:06):
like you have that moment where you realize, oh, the
shortcut I'm trying to use doesn't work because that's actually
for a totally different program. I have it happen all
the time. But then I'm also getting old
and broken and perhaps senile. So maybe that's just a
Jonathan problem. But what if instead of doing all that,

(20:27):
my device just took those steps out and just interfaced
directly with the underlying platforms. What if all I needed
to do was to tell my computer order me a
deep dish Chicago pizza to get here by five pm,
and the computer, acting like a personal assistant, handles the
entire transaction. And what if I could do any kind
of transaction that way, not just ordering pizza. Maybe I

(20:50):
need to order a ride. Maybe I want to watch
the latest episode of True Detective. Maybe I have a
list of things I want to do, like maybe I
want to do a full fledged vacation, and I need
to do things like I need to book flights, I
need to secure a hotel room, I need to get
tickets for a walking tour I was interested in, I

(21:10):
want to make a dinner reservation for a particular restaurant.
What if I just had a gadget and I just
told it all these things I needed it to do,
and it took care all of that for me in
one go, Right. That's the kind of idea behind the
Rabbit r one. The company says the secret Sauce is
a system they call a large action model, and it's
this model that figures out how to interface with all

(21:33):
the different services out there in order to get the
result that you desire, whether that's pizza delivery or an
update on your health records or whatever it might be.
There are a lot of unanswered questions when it comes
to this actual approach, like how can the product and
system ensure privacy and data security, for example? As is

(21:53):
often the case at CES, answers to the hard questions
didn't really take center stage. Instead, folks got swept up
with this idea of a device that you know, possibly
could do all these things, And in just over a week,
Rabbit had sold out of the pre-order inventory that it had set aside, and then it sold out of the
next batch that it set aside, and then the next one,

(22:14):
and it keeps going. At one hundred and ninety nine dollars a pop, it's a pricey piece of technology, but
admittedly it's significantly cheaper than your typical smartphone. So you
could argue, well, if you wanted to free yourself of
a smartphone, then maybe you could use this. Now, granted, you can't use this thing as a smartphone; it's not intended to be a smartphone. But you could use it

(22:35):
to do all these other tasks. Now, some evangelists have
already talked about how the device lets you do lots
of stuff that we do use smartphones for today, but
it removes pesky things like notifications and distractions. So if
you're someone who's like, I'm so sick of getting text
messages and phone calls and all these little pop ups
and stuff. I just want to be able to do

(22:57):
what I need to do when I need to do it,
and then not be bothered, then I could see where you'd really find the appeal of the Rabbit R one. Maybe you go back to having just a
phone at home and an answering machine or something similar
to that, where you're not carrying your communications device with
you everywhere. I think a lot of people, at least

(23:17):
on one level, think that's attractive. But when you start
really thinking about the convenience of having a communications device
on you, it does. I mean, that's a strong use case,
right? Like, as someone who has been through a couple
of medical emergencies recently, I can tell you having a
communications device on you is invaluable at times. But you know,

(23:38):
it does seem like it would be cool to have
a gadget that could handle these things and you know,
you just tell it what you want and it handles
all the details. That does seem pretty nice. But one YouTuber, Varun Maya, and I apologize for the butchering of your name, I saw his YouTube video about the Rabbit R one, and he made some really good points, some criticisms about this. And he said that the Rabbit

(24:03):
R one's place in the pantheon of established technology is
by no means assured. He argues that the use case
for the r one is pretty limited. And if you
were to actually look at how much time the typical
person spends on their smartphone and then take another step
and say, all right, let's break that down, what are
you using that time to do? Like, what are

(24:25):
you mostly doing on your smartphone? He argues that you
would only see a tiny fraction of the use time
that would go to interfacing with apps in this way
for doing things like ordering, say, a ride hailing service
or food or something. He says, yeah, people do that,
but that's like five or six minutes a day. If
you were to look at how much time they're spending

(24:46):
on their phone, most of their time is spent consuming content, right,
watching videos, listening to podcasts, you know, that sort of stuff,
but not booking travel or ordering food or whatever. So
his argument is that there's actually not enough of a
use case to justify purchasing the R one. Yeah, it might work, if it does work. We don't know

(25:07):
because we only got to see some demonstrations and stuff.
But if it's only taking, like, five minutes of work a day off your plate, that's hardly valuable, right? It's not like that's a huge load
off of you. It's not a big enough departure from
the smartphone to necessitate it. Plus, and this is a
really good argument, I think it stands to reason that

(25:29):
smartphone manufacturers are going to experiment with their own approach
to things like large action models, which means you're likely
to see rabbit R one functionality finding its way into
established smartphone lines in the future, right, whether it's Apple's
Siri or Google's Assistant, that kind of thing. You would
expect to see these companies start to build those capabilities

(25:53):
into their own tools, which means, well, you don't need
the R one anyway. You're going to still want a phone,
and your phone is going to end up doing the
same thing the R one does, so why would you
need a separate device to do those things. It's kind
of like why most people, not everyone, but most people
don't bother having their own like separate MP three player anymore. Right,

(26:16):
you don't need an iPod or anything like that. You
can just use your smartphone to access music. So for
that reason, most people don't carry around iPods or MP three players. Why would you carry around a separate AI device? Still,
it was neat seeing something that was different from everything
else that showed up in Las Vegas. Right, it was

(26:38):
nice to see something that did not look like everything else.
The design is also kind of it's got the sort
of retro cool thing to it. It looks a little clunky.
It actually makes me think of the seventies because it
was in bright orange, and I associate the seventies with
like oranges and browns and it, you know, it was
like a little square box and I was like, well,
this feels like it was like technology was imagined in

(27:00):
the seventies. It's got this little scroll wheel thing, and yeah,
that was cool, a neat idea. And I think the
idea of having AI that can interface with these different platforms,
I think that's interesting too. I do think that there
is some value in that, like, especially if you're on
a vacation or something and you want to be able
to arrange a bunch of stuff, but you don't want

(27:22):
to take up time on your vacation to do it.
I could see that being really valuable, so I can
definitely see this evolving from here. I don't know that
the Rabbit R one is going to stick around. There's
also the question of what happens if Rabbit were to encounter financial difficulties. Like, the company is really young. It launched in, like,

(27:43):
October of last year. It has not been around for
very long at all, which should also raise some questions
about things like how likely is this device to work?
But if it doesn't stick around, then does that mean
the R one just becomes a paperweight, right? A two hundred dollar paperweight? Because if it doesn't have the connectivity

(28:04):
to any underlying services, presumably this device doesn't have a
massive amount of processing power. If it's two hundred bucks,
there's no way it's got really high end components inside
of it. It must be relying upon cloud and edge
based computing. If the company that's behind all that goes under,

(28:27):
then presumably that functionality goes away. So I don't want
to see them fail. I think it would be really
cool if they succeed, but I would warn people to
consider it carefully before plunking down two hundred bucks that
they might not ever see again for something that ends
up being interesting, but then ultimately fuels other companies to

(28:49):
create similar tools and the R one gets lost in
the shuffle. But who knows, maybe like a year from now,
Rabbit's going to become like one of the big names
in consumer AI. All right, we're gonna take another quick break.
When we come back, I'm gonna talk about some other
AI components and consumer goods that were shown off at

(29:11):
CES and talk a little bit about whether or not
I think they're interesting. But first let's take this quick break. Okay,
so we're back now. I got a question for you.

(29:31):
Would you sleep better if you knew that artificial intelligence
was in your pillow? Because the Motion Pillow is a
pillow that is outfitted with AI according to the company
behind it. So you might say, why why is there
AI in a pillow? Well, the idea is that this
pillow can monitor your sleep and detect when you do

(29:55):
things like snore, and snoring indicates that your breathing pathway is obstructed and so you're not getting as much air as your body wants. So if the pillow, you know, if it detects that you're starting to snore,
it will activate pumps to inflate little balloons within the

(30:18):
pillow to help raise your head in an effort to
clear your airway. And it also includes other stuff inside
it as well, like sensors that are meant to keep
track of how well you sleep, right? You know, are you staying nice and still? Are you moving around a lot? Are you getting up in the middle of the night?
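Just to make that behavior concrete, here is a tiny hypothetical Python sketch of the kind of sense-and-respond loop being described. The threshold, the readings, and the AirCell class are all made up for illustration; I have no visibility into how the actual product is implemented.

```python
# Hypothetical sketch of a snore-detecting smart pillow's control loop.
# Thresholds and the AirCell interface are invented for illustration only.
SNORE_DB_THRESHOLD = 50.0   # assumed sound level that counts as snoring

class AirCell:
    """Stand-in for one inflatable bladder inside the pillow."""
    def __init__(self) -> None:
        self.level = 0  # 0 = flat, 3 = fully inflated

    def inflate(self) -> None:
        if self.level < 3:
            self.level += 1

    def deflate(self) -> None:
        if self.level > 0:
            self.level -= 1

def control_step(sound_db: float, cell: AirCell) -> str:
    """One pass of the loop: raise the head a little while snoring persists,
    settle back down once breathing sounds normal again."""
    if sound_db >= SNORE_DB_THRESHOLD:
        cell.inflate()
        return f"snoring detected ({sound_db} dB) -> inflate to level {cell.level}"
    cell.deflate()
    return f"quiet ({sound_db} dB) -> relax to level {cell.level}"

cell = AirCell()
for reading in [42.0, 55.0, 58.0, 44.0]:   # fake overnight microphone samples
    print(control_step(reading, cell))
```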

(30:39):
And it can give you sort of a score in
the morning indicating how well you slept the night before. Now,
as someone who personally, very soon, is going to have to do a sleep study, because that's another part of the follow up and they want to check me for sleep apnea, I understand the appeal of a device like this, right,

(31:02):
the idea of, oh, here's something that can help me
deal with sleep apnea, and I'll get better rest and
I don't have to worry about all that other stuff,
like I don't have to worry about getting a CPAP
machine or anything like that. I actually worry that some
folks might lean on technology like this in an effort
to bypass the need to consult with physicians about this

(31:26):
kind of thing. I do think technology can help. Don't
get me wrong. I think that having technology to help
support a healthy lifestyle is a good idea. I just
worry that people will rely on the tech rather than
rely on medical assistance, and they won't get the quality
and quantity of help that they really need. But still

(31:47):
I think it's a neat idea. I don't know how
I would feel about plunking down nearly seven hundred bucks
for a pillow, though, because that's how much it costs.
At that point, I think I might be thinking more of the CPAP machine, because as much as I don't
want to look like a Star Wars alien when I
go to bed, I would like to have something that
has a good track record, and potentially my insurance could

(32:11):
help me pay for it. But you know, I don't know.
Maybe the pillow is a really good idea. Now, I did mention that the pillow has a lot of components in it that mirror things that you find in fitness trackers, right? Like, there are a lot of fitness trackers out
there that you're supposed to wear when you go to sleep,
and they'll track your sleep. Well, we saw lots of

(32:33):
other fitness trackers at CES twenty twenty four, no big surprise.
They have been a huge thing there for years now,
and this year we saw a couple that were meant
for our four legged friends. Again, not totally new, I've seen this sort of stuff before. But Invoxia introduced a dog collar it calls Minitailz, and it includes stuff like a GPS tracker, very useful if your dog happens to get away. It also has health sensors to monitor things
like a GPS tracker, very useful if your dog happens
to get away. So has health sensors to monitor things
like heart rate and breathing and that kind of stuff.
And it also includes an AI element, so that if
you wanted to check in and see what your puppers
was doing while you were, say, off at work, you
could open up an app and see if they
were sacked out or if maybe they had a serious

(33:17):
case of the zoomies. So the AI's purpose, from what
I was reading on this description is to interpret what
your dog is actually doing based upon the data it's gathering.
So if your dog's heart rate is elevated, and its breathing is faster, and GPS is giving indications of movement, then it could say, oh, well, your dog's running around, right,

(33:37):
because it's not like there's, you know, a clear running around indicator on there; the AI is making an interpretation.
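For what it's worth, here is a toy Python sketch of the kind of interpretation layer being described: combine heart rate, breathing rate, and GPS movement into a guess at what the dog is doing. The thresholds and labels are entirely invented; a real product would presumably rely on models trained on actual sensor data rather than hand-written rules like these.

```python
# Toy sketch: inferring a dog's activity from collar sensor readings.
# Thresholds are invented; a real product would use trained models.
def interpret(heart_rate_bpm: float, breaths_per_min: float, gps_moving: bool) -> str:
    if gps_moving and heart_rate_bpm > 120 and breaths_per_min > 40:
        return "zoomies: running hard"
    if gps_moving:
        return "out for a walk"
    if heart_rate_bpm < 70 and breaths_per_min < 25:
        return "sacked out (resting or asleep)"
    return "awake but lounging around"

print(interpret(heart_rate_bpm=140, breaths_per_min=55, gps_moving=True))
print(interpret(heart_rate_bpm=60, breaths_per_min=18, gps_moving=False))
```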
So I think that's kind of an interesting concept. It's
probably one of the fuzzier AI things that we're talking about today. This collar's ninety nine bucks, which for
a fitness tracker isn't terrible, but it also requires a

(33:59):
monthly subscription of twenty five bucks, and that does start
to pile up, and just like I was saying with
the R one, it makes you worried. If you're paying
a subscription, that usually indicates that the functionality of the
device is dependent upon ongoing support from a back end,
and if that means that the company, for whatever reason,

(34:21):
has trouble, then you may no longer get that support,
which means your smart collar just becomes a regular old
collar, and it becomes a one hundred dollar collar that does
the same thing as just a regular, you know, fabric
based collar would do, or leather or whatever. So yeah, again,
buyer beware. If you really want to buy these things, I don't think there's anything

(34:42):
wrong with it. I just think knowing ahead of time,
the risks involved in purchasing something that could eventually become
kind of a dumb version of what it once was
due to a company dropping support. I think that's a
good thing to know. The gamers out there already know
this really well, because there are a lot of games

(35:02):
that over years have lost features due to companies either
going under or just stopping support. Now, there were a
couple of smart mirror devices at CES twenty twenty four that leaned a little bit on AI. One of them, the BMind Smart Mirror, claimed it could help users
practice mental wellness activities to improve their state of mind.

(35:24):
So imagine you're the evil queen in Snow White, and you walk up to the magic mirror and you demand that it proclaim you the fairest in the land,
and instead the mirror adjusts the lighting to a more
soothing color and brightness, and it encourages you to find
value within yourself rather than to seek validation from others

(35:44):
or something. I've only read about this mirror, so I'm
not entirely sure how it works or how extensive this
AI actually is. I mean, it could be that it's
a relatively small library of responses that it can provide
based upon your prompts, but I don't have any idea
if it's something that can make more, you know, extensive interpretations. That would be really interesting, but I don't know. Another mirror

(36:07):
like gadget is the Anura Magic Mirror, which looked to
me more like a screen than a mirror, sort of
like how some people use the forward facing camera on
their phone to do stuff like check their makeup and
that kind of thing. Anyway, this gadget can perform a
full face scan. Apparently it takes like half a minute,
and then it analyzes your face to determine a bunch

(36:27):
of stuff like what your blood pressure and heart rate
happen to be, that kind of thing. And maybe I
should get one of these, because you know, I need
to keep track of that sort of stuff. Or maybe
I should just keep using my blood pressure cuff and
just use that. Anyway, according to the company, the
planned customer base isn't the average person. Instead, it's
for stuff like doctor's offices and gyms and that kind

(36:49):
of thing. We also got more than a few robots
at CES twenty twenty four. That's to be expected. Every
year that I've attended, I've seen tons of robots. Some
end up being fairly simple, like there might be a
robot that's essentially kind of like a smart shopping cart
that will follow behind a specific person and act as

(37:09):
kind of like a little robotic valet and carry their stuff.
Others can get much more complicated, or at the very
least can fulfill more complicated tasks. So this year there
were a couple that I saw lots of folks mentioned
in particular. I mean there were tons of robots, don't
get me wrong, like hundreds of different types of robots,
but two that a lot of people specifically wrote features on.

(37:34):
One came from LG, but I couldn't find like a
real specific official name for it, although the one I
saw referenced it as the Smart Home AI Agent. That just doesn't seem very snappy to me. But it's part of LG's zero labor home concept, and it's a very

(37:55):
cute little robot. It's got two stubby little legs that end in wheels, and its body looks like a horizontal cylinder,
so like imagine like a canister on its side, but
it's being held up by these two legs with wheels
on the end, and it looks like it's wearing a
little pair of headphones, and it's got a little digital
screen with digital eyes in it. I really wish I

(38:18):
could have seen this thing in person. It's so adorable
in pictures. So, according to LG, it's essentially a moving
smart home hub. So it's meant to interface with other
smart home appliances and such so that you can control
your home by talking to your plastic pal who's fun
to be with. Shout out if you happen to get
that reference. So, this device has facial recognition capabilities, which

(38:41):
means it can learn to recognize the various members of
the household. It can monitor the home. It has a
built in camera so it can patrol and keep an
eye on things. It can also check on various factors
like air temperature, humidity, and air quality within the home
and alert you if any of those are in ranges

(39:01):
that are perhaps unhealthy. And it's even supposed to be
able to figure out if you're in a good mood
or not. So the idea is that you get home,
this little robot rolls up and looks adoringly into your
face and then tries to figure out if you're happy
or if you're grouchy or whatever. Then immediately it begins
to select settings and content to help you out with

(39:22):
the various smart home appliances in your home. So maybe
you come home, it looks at you and it can
tell that you're all stressed out, so immediately starts to
set the lights to a kind of calm lower level
and plays soothing music on a smart speaker and puts
a silence on notifications for the time being so that
you don't flip out. I have no idea if this

(39:43):
thing is ever actually going to be an actual product,
but I can definitely see the appeal of it. Samsung
also got some buzz by showing off an update to
its Ballie robot, or Bally, depending on how you pronounce it. I watched a video from Samsung that had both pronunciations in there. But it's spelled B-A-L-L-I-E.

(40:04):
I would think it's Bally because it is shaped like a ball. Now, Samsung first introduced Ballie back in twenty twenty,
but this twenty twenty four version has a few extra
bells and whistles. Now, like I said, it's rather ball shaped,
but it uses again a pair of thin wheels, one
on either side of the ball that helps it move
around the environment. And this new version of Ballie has

(40:27):
a projector built into it, which allows it to project
images on floors or walls or ceilings, essentially being able
to turn any surface into a video screen. So promotional
video showed folks using Bali to create an impromptu video
calling screen, or to turn a wall into a television,
or even project stuff on a floor in an effort

(40:48):
to entertain a Golden retriever, which I think is unnecessary
because we all know golden retrievers have like two brain cells,
so you really don't need to work that hard anyway.
Like LG's robot, Samsung showed off that Ballie is meant
to interact with smart devices and thus give the robot
control over appliances and Internet of Things gadgets throughout the home.

(41:09):
So like the LG one, you could use it to
do things like adjust the thermostat, or change what's playing
on the smart speakers, or change the lighting. It can also,
like the LG one, patrol and keep
an eye out on things that are going on back
at home base while you're out and about. So those
were some of the robots I mean, like I said,
there were tons of others. There was one that was

(41:30):
a few people mentioned that was like a combination robot
that was a couple of different appliances, with one big
appliance acting as like the docking station for the roving
robot that could mop. It was a washing machine and
mop combo, and the mop part could wander around the
house and mop and then dock with the washing machine

(41:52):
and offload the dirty water through the washing machine's drain
so that you didn't even have to empty the mop.
It could fill itself with clean water and empty the
dirty water, which I think was a pretty cool idea. So
a lot of different stuff like that, But moving on,
let's talk about Nvidia, because that was another company that's
heavily entrenched in AI, and that should come as no

(42:15):
surprise because it has been manufacturing powerful processors that have
been tweaked to support AI functionality for the past couple
of years. Powering artificial intelligence requires a whole lot of OOF,
and Nvidia has a rep for building chips that are
very much oomph centric, whether it's to provide the best
performance for a state of the art gaming PC or

(42:37):
a computer system that's running artificial intelligence applications. The company
held a special address during CES to talk about how
its products will power the tech of tomorrow, and it
can be challenging to walk away from stuff like CES
and not have a feeling that at least some companies
are still more than a little wishy washy when it

(42:59):
comes to AI. You've got companies like Nvidia that can very
firmly point at how they support AI functionality, but when
it comes to the companies that are building the actual
AI implementations, it gets a little more vague. You might
have limited implementations, you might have some very loose definitions

(43:20):
that don't take a very strong stance as to how
AI is a factor. But we do need to remember
that artificial intelligence itself is kind of on thin ice
at the moment. There are governments around the world that
are taking a very close look at AI and are
starting to consider the sorts of regulations that may be
needed to keep AI from going all terminator on us,

(43:42):
and companies need to keep that in mind too. It
may be necessary one day to walk back some AI strategies,
so diving wholeheartedly into AI tech could end up being
a costly mistake, and that might be one reason why
companies are a little slow to do so. It's not
just that it's hard to figure out how do we

(44:02):
do this in a way that makes sense. It's also
how can we do this in a way where we
don't over commit, so that if governments decide to push back hard against AI, we haven't gotten into a position where we've, you know, over invested in an area of business
that ultimately doesn't pan out. So that also opens up

(44:23):
opportunities for smaller companies like Rabbit to potentially cash in.
But I'm still not convinced that Rabbit will see much
success beyond its initial launch. Maybe I'm wrong, We'll have
to wait and see. It's a really weird situation. We
already have and use so much technology that has various
elements of AI built into it. Like, again, AI is not new. You know, your smartphone has AI components

(44:45):
built into it. It's something that's everywhere all around us.
It's clear. It's obvious AI is going to be a
big part of our technology moving forward. There's no denying it.
But at the same time, I think most of us
recognize that AI also has the potential to do amazing things,
but potentially also terrible things. So here's hoping that companies

(45:05):
make the best choices and that our refrigerators don't rise
up against us, because I'm pretty sure it could take
me. Anyway, that's kind of a roundup, an overview of what was going on with AI over at CES.
As I said, that was just one small thing that
happened at CES this year. I might do another episode
talking about some of the technologies that were shown off

(45:27):
to go into further detail, like those transparent OLED screens.
It's something we had been hearing about for a very
long time, and we had even seen some prototypes in
the past, but man, they were on display like crazy
this past year from what I've seen, and I'm kind
of sad that I missed seeing them in person. Not
so sad that, you know, I would have traded all
that wonderful time I spent going to various doctor's appointments. Anyway,

(45:52):
enough about all that. I hope you are all well.
I'm so glad to be back recording. I look forward
to doing that three times a week. I'm thinking about
news episodes on Fridays and then other just regular TechStuff
episodes on Mondays and Wednesdays. And it's a pleasure to
be back in the saddle and recording again. I hope

(46:16):
you are all well. I hope you all take very
good care of yourselves, go see your doctors on a
regular basis. Trust me on this, you don't want to
fall into the same trap I did, and I'll talk
to you again really soon. TechStuff is an iHeartRadio production.

(46:38):
For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts,
or wherever you listen to your favorite shows.
