Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Tell you what does stop, unfortunately, is Tech Tuesday with
Rich DeMuro.
Speaker 2 (00:05):
Wish we could do that forever because I have so
many questions.
Speaker 3 (00:07):
Good morning, Rich. Oh, thanks Bill. I thought you said
it's going to stop.
Speaker 1 (00:11):
I was like, oh, wow, no, no, yeah. Well, unfortunately
it's going to stop because we are limited. When we
talk about these lawsuits, they're crazy; those are not limited.
The amount of time we have together is limited. That's
where I was going. You know that anyone can sue
for anything.
Speaker 4 (00:28):
I mean, when it's that kind of high-level thing,
that's a whole other world.
Speaker 1 (00:33):
Yeah, this is the Texas Attorney General's doing. It's not
just some crazy-ass guy out there.
Speaker 2 (00:39):
But anyway, let's...
Speaker 1 (00:41):
Get into our stuff because I think this actually is
more fun. And so let's talk about fake AI receipts. Now,
what is that about?
Speaker 4 (00:55):
Yeah, this is something that's been kind
of going on for a bit. And now there's a
report from the Financial Times that businesses are getting tricked
by these AI-generated receipts. And we know that if
you do expense reports: number one, they're annoying. Number two,
they require a lot. Number three, a lot of it's
automated, because these companies have ceded control to these giant
(01:17):
Concurs of the world and things like that, and employees
are now apparently submitting ultra-realistic fake expense receipts thanks
to AI. And so one of these software companies, Ramp,
says they flagged over a million dollars in fake receipts
in just ninety days. And they look really real. The
AI can even make them look like they're wrinkled paper.
(01:41):
They can put signatures on them. And, you know,
these AI companies, they put like a watermark in them
so that you can't really submit them, because the software
will figure that out. So what do the employees do?
They just screenshot it or take a picture of it
with their phone, and they get around that. So this
is a huge issue, I think, for companies, and it's
(02:02):
something that, if you're an employee and you're doing this,
I don't know why you would, because you
don't want to get caught doing this.
Speaker 3 (02:08):
Yeah, but...
Speaker 1 (02:09):
I have a question. Why couldn't anybody use existing technology
and just put in a fake expense report? Or are
you talking about vendors putting that in there and
getting money sent to them for products they didn't
sell to the company?
Speaker 2 (02:26):
I don't know who is doing this.
Speaker 4 (02:29):
Well, I think it's the employees that are going on
company travel, and so they're padding. It's
kind of a new way of what's been going on forever,
which is padding the expense report, right? Except this time
it's ultra-realistic. And I think the idea
here is, because AI is fact-checking the expense reports
before they go for payment, the majority of the time, you know,
(02:53):
I think the AI just looks at the receipt, says, oh,
that looks good, and it passes it through.
Speaker 3 (02:58):
But even, say, Concur,
Speaker 4 (03:00):
which, you know, they scan a lot of these receipts.
One of their executives said you can't even trust your
eyes anymore, because these receipts look so good. And so
my question is, if you're an employee doing this, you're
gonna get fired. So probably not a good idea.
Speaker 2 (03:18):
Well, I mean, it's just another way of stealing money.
Speaker 1 (03:21):
I'm assuming, right? You put in a fake report, which
people do all day long until they don't. What
you're saying is AI just makes it easier for
an employee to rip off the employer, bottom line.
Speaker 3 (03:34):
It makes it very easy.
Speaker 4 (03:36):
And I remember seeing a story about this a couple
of months ago, when this was just starting to happen,
and this was before the AI image generators had gotten
so good. And I remember trying to generate one
of these receipts, and I could not believe my eyes.
I mean, it was so good that it's like, oh
my gosh. So maybe here's what I
think happens. Maybe employees tiptoe into this by saying, oh shoot,
(04:00):
I forgot to get that receipt, or I need that receipt,
and so maybe they recreate it. And then all of
a sudden, you know, some other people just realize
you can make this very simple.
Speaker 1 (04:11):
Okay, Rich, anti-recognition glasses. I don't even know what
that means, other than I don't recognize you.
Speaker 3 (04:21):
Yeah.
Speaker 4 (04:21):
So imagine you put these glasses on and you try
to use Face ID on your phone, and it does
not work. That's what we're talking about here. So this
is a company called Zenni Optical. They've come out
over the years with a bunch of glasses for things like
blocking blue light. But now they've got
this new technology called Zenni ID Guard, and it blocks
(04:44):
infrared facial recognition systems from tracking you. So I thought
this was kind of cool, kind of interesting, kind of
a sign of our times. So the coating on these glasses,
which is a subtle pink, so that, you know, you
kind of signal to other people that, hey, I'm blocking
this stuff, reflects up to eighty percent of near-infrared light,
which is what a lot of cameras and biometric systems
(05:07):
use to identify faces. And you can test this out
very simply by trying Face ID on your phone, or
something like Windows Hello on a Windows PC, and they
will not work. So the idea here is that you
wear these glasses, you can walk around town, and facial
recognition cameras that are trying to capture your biometrics are
just not going to work.
Speaker 3 (05:27):
So you can.
Speaker 4 (05:28):
I don't know if you're completely anonymous when you wear these,
but it's just one step in kind of protecting your identity.
Speaker 3 (05:35):
Yeah.
Speaker 1 (05:36):
I know there's entire industries that are anti: where technology
comes up with something, and then instantly there now is
a group, a small group, or a company that says,
oh no, no, we're going to fight that. And it
seems like it's one industry following another industry. So if
(05:56):
you don't want facial recognition, you walk around all day
with these, even if you don't need glasses, with
these plain lenses, no prescription,
and then you're theoretically fine. Do I have that right?
Speaker 3 (06:09):
Yes? Huh?
Speaker 1 (06:12):
Is it easier just to wear, like, ninja masks?
And you know, those cost six bucks.
Speaker 4 (06:19):
It's simpler than the ninja mask, because this still lets
people see your eyes. But, you know, look,
I think when it comes to what you said, to
your point, any industry that pops up, there's always
going to
Speaker 3 (06:29):
be a reactionary industry. And I think that, you know,
Speaker 4 (06:33):
is this a bit of a ploy to get people
to spend money on these glasses they may or may
not need? Is this really protecting your, you know, your
facial identification?
Speaker 3 (06:42):
Like who knows?
Speaker 4 (06:43):
But the reality is that we see this stuff happen
all the time, and, you know, it's one way that
people can feel more secure if they feel like they're
getting ID'd in too many places.
Speaker 3 (06:55):
And I think it is interesting.
Speaker 4 (06:56):
We've heard over and over, Bill, that they use this
facial recognition in places like concerts. And, because I love
technology, anywhere I go, if you look at, like, the
small print on signs at places you go, a majority
of them are now mentioning that they are using some
sort of automated facial recognition or facial tracking, at
grocery stores and different
(07:19):
retailers that you go to.
Speaker 3 (07:21):
So when you walk in.
Speaker 4 (07:21):
there, yeah, they're figuring out, like, oh, okay, this
person comes in. They may not have a name attached to
you just yet, but they know who you are based
on the fact that you come in there three times
a week. And eventually they could layer that with some
sort of third-party database and figure out, hey, this
is Rich, he's in here every Wednesday.
Speaker 2 (07:38):
Hey, are we reaching the point where, you know,
Speaker 1 (07:40):
For example, China uses face recognition basically for every single citizen.
Speaker 2 (07:47):
They know what every single one of.
Speaker 1 (07:49):
Their people looks like with technology, which I don't understand
because they look all the same to me. But you've
got an entire country doing this. Are we heading in
that direction? You think where we're all going to be
somehow surveilled.
Speaker 4 (08:04):
I think it's already happening. I think that it's already happening.
And, I mean, look, when I travel, I go from
my car to the airport, to security, to the
next country without speaking to a human being, or in
most cases without even showing any ID. I mean, when
(08:25):
you go to LAX and you park in the parking
structures there, it reads your license plate. If you've already prepaid,
it opens the gate. You go into LAX, you've
got facial recognition. You walk up to a camera, it
snaps a picture of you, it says, okay, you're Rich DeMuro.
The gate opens up, you go in. When you enter
(08:45):
the other country, you scan your passport or your face,
and it just says, hey, okay, come on in. So
we're already living in a society where the government already
knows who you are. They've got your biometrics, they've got
your face, and, you know, we've all sort of agreed
to this. And I think it's the private companies that
are catching up to this; they probably have more information
(09:07):
than the government.
Speaker 3 (09:08):
Because we've handed it to them.
Speaker 4 (09:10):
So I think, you know, we're going to see a
renaissance of people saying I want to reclaim some of
my privacy.
Speaker 3 (09:16):
I'm not sure that's going to be possible in the future.
Speaker 1 (09:18):
Now, I mean, how do you do that without legislation,
without laws that tell these companies and these organizations,
you cannot do that anymore? Short of that...
Speaker 2 (09:28):
And you're right. It just occurred to me.
Speaker 1 (09:29):
I just came back from Europe a few weeks ago,
and as I was getting off the airplane, there's
a line for American citizens and people who have
foreign passports. And literally, I have my passport in my pocket,
and as I'm taking my passport out, the TSA person,
or in this case the customs people,
(09:51):
say, no, no, just walk through here, just look at
the camera. And I was kind of floored. And there's
my picture, ugly. I mean, is there such a thing
as a good-looking picture on driver's licenses or any
of these programs?
Speaker 4 (10:08):
I mean, I don't mind mine, but they don't let
you take another one. I mean, you can't.
Speaker 1 (10:13):
They do in yours, because, yeah, you can for your
driver's license.
Speaker 2 (10:17):
You can say, well, take another one. They let you
look at it, and they will.
Speaker 4 (10:21):
Because they wouldn't let me look. They just said, that's okay,
you're good. No, actually, no.
Speaker 2 (10:26):
You ask hey, can I take a look?
Speaker 1 (10:28):
Do you mind if I take a look? Because it's
digital, you're going...
Speaker 3 (10:31):
to... you're gonna gum up the system. They've got to
get people through there. Yeah, I know.
Speaker 1 (10:35):
But then, they take the picture, because
what if someone blinks, what if someone has their eyes closed?
Speaker 3 (10:40):
They look for that.
Speaker 4 (10:41):
All they're looking for is just to make
sure that the measurements for biometrics are fine.
Speaker 3 (10:45):
They're not. They don't care if you, you know,
if you look good.
Speaker 1 (10:48):
No, no, I'm talking about... yeah, I think we're on
different pages here. I'm talking about driver's license photos.
Speaker 2 (10:55):
That's where I was going.
Speaker 1 (10:56):
Okay, okay. I'm glad we missed each other, two
ships passing in the night. Rich, we'll catch you
this Saturday, eleven a.m. to two p.m., right here
on KFI.
Speaker 2 (11:05):
You have a good one.
Speaker 3 (11:07):
Thank you, you got it.