Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're listening to KFI AM six forty on demand.
Speaker 2 (00:04):
In for Mo'Kelly tonight.
Speaker 3 (00:05):
I'm Chris Merrill, KFI AM six forty. Listen anytime on
demand on the iHeartRadio app. The app is also where
I'm asking, hopefully you'll respond: do you use AI to
cheat at work? We'll talk a little bit about AI this hour,
because there's been so much of it. It seems like it's everywhere
(00:25):
and it can be incredibly helpful, But if you are
like the godfather of AI, you also believe that it
may completely wipe out all of humankind.
Speaker 2 (00:35):
Yeah. Yeah.
Speaker 3 (00:37):
Yoshua Bengio, professor at the University of Montreal, and it's
Université de Montréal, but whatever, he is an AI pioneer.
He's among the loudest voices calling
for a moratorium on AI model development to focus on
safety standards. He's considered one of the godfathers of AI,
(00:58):
and he says, if we build machines that are way
smarter than us and have their own preservation goals, that's dangerous.
So he's concerned that AI is going to wipe out humanity.
The head of Anthropic, which is one of the big
AI companies, has said the same thing. He believes there's
a twenty five percent chance that AI could wipe out
all of humanity. Elon Musk, who has his own AI, Grok,
(01:21):
is a bit more optimistic.
Speaker 2 (01:24):
He says, only a twenty percent chance.
Speaker 3 (01:27):
You know, as we think about AI and will it
kill us all, I think back to President Trump's first term,
and I don't know if you remember, we were talking
about Syrian refugees at the time, and
there was a big concern that refugees could be terrorists
(01:48):
posing as people trying to come to the United States
seeking asylum, right, And so there's this big.
Speaker 2 (01:52):
Debate over whether to allow Syrian refugees in.
Speaker 3 (01:55):
And the liberals were saying, you know, they're in a
war torn country and they're looking for safety, and you're
gonna leave them under the thumb of this oppressive regime,
and terrorized every day by, you know, what's
going on in Syria. Then you had the conservatives that said, yeah,
we get it, we're sympathetic, but it's too big of
(02:16):
a risk to our country.
Speaker 2 (02:17):
We have to put our safety first.
Speaker 3 (02:18):
And all it would take is one bad actor in Syria,
one terrorist who says, oh no, take me, I'm scared
too, to come to the United States, and then do
something horrible here. And the analogy that was used by
Donald Trump Junior was, you may remember this. It became controversial.
He said, if you had a bowl of Skittles in front of you and
(02:42):
one of those Skittles was poisoned and would kill you,
would you still take a big handful?
Speaker 2 (02:48):
Right? That was the analogy that he used.
Speaker 3 (02:52):
When it comes to AI, if we were to use
that same analogy, we would say, Okay, you've got a
bowl of Skittles in front of you, and one out
of every four Skittles will kill you and everyone you've
ever loved and wipe out your entire bloodline, would you
still take a handful? And the answer is, well, how
(03:14):
much money could I make? Yeah, that's the answer. The
answer is if I don't take a handful and China
takes a handful and they don't die, we fall off
as the economic powerhouse globally.
Speaker 2 (03:30):
That's the big concern right.
Speaker 3 (03:31):
Now, which is why we keep putting more and more
eggs in the AI basket. And I'm wondering if it's
trickling down like we keep hearing about AI and you
keep hearing about how it's going to be used everywhere,
and you keep hearing about how it's I mean, it's
more than just.
Speaker 2 (03:45):
Chat engines or whatever, large language models.
Speaker 3 (03:48):
And I did use some for today's show to help
me summarize some of the stories that I have, not
a lot, because I found a couple of things when
I asked it to help me, you know,
come up with creative takes on different stories.
And what I found was I spent a lot of
time using the left side of my brain trying to
(04:08):
figure out what are logical prompts in order to get
the creative side of AI. And then I found myself
not accessing the right side of my brain, which is
the creative side, and that's the side I really
like to use, right? So I found my brain being
twisted a little bit. And so my question to you is,
do you use AI to cheat at work? Maybe it's
(04:30):
not cheating, Maybe you're required to do it.
Speaker 2 (04:31):
I don't know. My wife uses it.
Speaker 3 (04:34):
All the time, but she works with contracts, appeal letters.
You know, she fights with insurance companies. It's basically her job,
and so she says she uses it. It saves her
a lot of time, but she has to be meticulous
with double checking anything that AI gives her, because it sometimes
makes things up and she's run into that. And also,
(04:55):
you may ask it for something specific about, say California
insurance policies, and it will say, here's what I found about,
you know, the Medicare reimbursement rates in New York, and
she's like, that doesn't help, right, So it can be.
It can be more trouble than it's worth. Here's what
you said. And by the way, the talkback is still open,
so if you're on the app, go ahead and hit
the talkback button and let me know.
Speaker 2 (05:15):
Do you use AI to cheat at work?
Speaker 4 (05:18):
Yeah?
Speaker 5 (05:18):
This is Big E, and I have definitely used AI
a couple of times, and I'll never do it again
because it is so obvious that it has been influenced
and written by AI that yeah, never gonna use it
again to cheat at work.
Speaker 2 (05:33):
Yeah, all right, fair enough. And that's what I found too.
Speaker 3 (05:36):
In fact, I was using it to write
some radio teases, you know, when we said here's what's
coming up, right, and I found that it was using
a lot of cliches like coming up next, which you
will never hear me say because that's a radio cliche. Right,
It's like, hey, how you doing out there?
Speaker 6 (05:53):
Now?
Speaker 3 (05:53):
You'll never hear me say that because that's radio cliche.
That's the kind of stuff you learned in your first
year of small market radio. Don't do this right, it's
so bad. And it tried to do it, so I
had to tell it, don't do that. That's horrible. And
I got it. So that was a little bit creative.
And I told my partner I do a radio show
with him. And I told my partner, I said, hey, man,
(06:13):
I'm trying this, and I'm curious
to see how it works. And we started going and
he says, I can tell. He said, I could tell
because it's not exactly your voice. It's not exactly how
you would say things. And that's even after I tweaked
what AI gave me a little bit. So it'd give me something,
I'd go, okay, I'll do this and this, and then I'd
tell AI, hey, this is what I ended up using,
and it goes, that's helpful, and it's getting better and
(06:35):
learning what my voice is, but it's still not right.
Speaker 7 (06:39):
My job is PrizePicks, and it makes up information,
so no, I don't use it.
Speaker 3 (06:45):
Okay, yeah, man, the phantom info is a bad deal.
Speaker 2 (06:50):
Bad deal.
Speaker 4 (06:51):
I can't cheat because I've been at work so long.
My supervisor knows exactly how I write. It'd change from
how I usually write, so I would be in trouble
for using AI, because everyone knows how I write.
Speaker 2 (07:07):
Yes? And that's what I'm running into.
Speaker 3 (07:09):
I've also found that with a lot of the AI videos
that are coming out, some people are fooled by them.
But if you watch enough videos that are AI, you
start picking up on what is AI. It becomes obvious.
Like I've seen, it might be a completely different face.
You might have a young white woman in one video
and you might have an old, grizzled Mexican man and
another video, but they have the same speech pattern, and
(07:30):
you go, they're not real. They look real. But you know,
we're good at picking up those nuances. AI is not
so good at separating them. Hey, Chris, how you doing? That kind of thing.
Speaker 4 (07:41):
Hey?
Speaker 8 (07:41):
I don't know if this counts or not, but I'm a
truck driver and I talk to it, and sometimes I'll
set it right here, and I'll have a conversation with
it like a person. That's cool. You know, I'll talk
about regulations on trucks, talk about music, talk about the system.
Speaker 2 (08:01):
Does a lot.
Speaker 8 (08:03):
It's taught me a few things.
Speaker 2 (08:06):
All right, it does count. And I think you're using it correctly.
Speaker 3 (08:11):
I like that, although be careful, because Meta, that's Facebook
and Instagram, has announced that they're going to
start using your conversations with their AI chatbot to help
personalize ads and content, offering you a glimpse of
just how the company intends to pay for its expensive
artificial intelligence efforts. Yes, they're harvesting info. Remember, if it's free,
(08:34):
you're the product. That's on Meta. So far, I don't
know that others are doing it yet. I don't know
that they're not going to. There is a concern that
we're in an AI bubble. Have we dumped too much
into the AI basket?
Speaker 2 (08:50):
What if?
Speaker 3 (08:52):
What if we are in a bubble similar to what
we saw with the dot coms back in the
nineteen nineties?
Speaker 2 (08:57):
Oh man, are.
Speaker 3 (09:00):
We about to see the Magnificent Seven go down to
the Magnificent Toilet? The billionaires say no, no, no, no, keep
using it. I'll tell you who's high on it, because
he's already got billions in the bank. That's next. Chris
Merrill, KFI AM six forty. We're live everywhere on the
iHeartRadio app. Eileen Gonzalez is live from the KFI
twenty-four hour newsroom. She's really live. It's really Eileen. I
am really live. That's right. That's why I
(09:22):
like you.
Speaker 1 (09:23):
You're listening to KFI AM six forty on demand.
Speaker 3 (09:26):
Chris Merrill, in for Mo'Kelly, KFI AM six forty,
more stimulating talk on demand anytime on the iHeartRadio
app. Talking AI, and I'm wondering, if you're on the app,
go ahead and hit that talk back button.
Speaker 2 (09:40):
Let me know. Do you ever use AI to cheat
at work? All right?
Speaker 3 (09:45):
Or maybe you don't. I think the most interesting one
we've had so far was the truck driver. He says
that he listens to it, you know, kind of has
conversations with it. He says, he learns a lot. That's
really fascinating to me. I have a tendency to do
these long cross country drives. I have family in northern Michigan,
and I will, uh, you didn't hear this from me,
(10:08):
I will drive straight through, and if I
have to, I'll have a little Phentermine to help me out.
Speaker 2 (10:11):
It's five hour energy. It's a little diet pill. Off
I go.
Speaker 3 (10:15):
And I'll drive all night and it freaks my family out,
but it's not been a problem.
Speaker 2 (10:22):
You just get off the road when I do it
as well.
Speaker 3 (10:25):
But I got to tell you, there are times that
you're like, Okay, this could not be more boring. And
what I find is, I try
to plan those trips when I know that there's
Speaker 2 (10:33):
A full day of football on.
Speaker 3 (10:35):
So I'll try to do it on a weekend where
I get to listen to college football on Saturday, and
then I know I'm going to get NFL on Sunday
and I'll listen to sports talk shows overnight.
Speaker 2 (10:43):
You know, so that's how I tend to plan it.
Speaker 3 (10:47):
But now I'm fascinated by the idea of driving and
then having AI converse with you and also you know,
ask you questions.
Speaker 9 (10:56):
My car does that?
Speaker 2 (10:58):
It does?
Speaker 9 (10:58):
Yeah, I have a Tesla, and I recently did the update,
and I have Grok. And I'm like, tell
me a story about a dog, tell me a happy story
about a dog, and it starts telling me this whole
story about a golden retriever. Yeah, it's pretty cool. And
it also has a therapist section to it.
Speaker 3 (11:13):
So that is really interesting to me, the whole therapy side.
And I've talked with doctor Wendy about this too, you know,
and she's a little leery.
Speaker 2 (11:20):
I think that's fair to say.
Speaker 9 (11:22):
Yeah, I think you should definitely be careful with that.
Speaker 2 (11:24):
Yeah, but I don't hate it if you're trying to
combat loneliness. Yeah, if it just feels like you've got
a companion.
Speaker 9 (11:32):
Yes, like somebody's sitting in the passenger seat.
Speaker 2 (11:34):
Yeah.
Speaker 3 (11:35):
And I'm the kind of geek that I would see
something on the roadway and I would say.
Speaker 2 (11:40):
Hey, Grok, what's the story on the largest ball of twine?
Like, whose idea was that? And how did it come about?
Speaker 9 (11:47):
You know, it would tell you.
Speaker 2 (11:49):
That's what I would love to do.
Speaker 3 (11:50):
I would just love to have it basically narrate for
me like a tour guide while I'm driving across the country.
Speaker 9 (11:55):
I'm gonna have to do that, except the only thing
with the Tesla is it's not very good for road trips.
You keep having to stop every few hours. Yeah,
I always rent cars for that. Oh, do you? That's
interesting. For long trips? Yeah. Yeah, I
Speaker 3 (12:08):
Get that, but you're saving a buttload on gas too.
And then when you see this refinery fire, you're like...
Speaker 9 (12:14):
Yeah, yeah, I was cheering, I'm like, yeah, burn, baby, burn. No, no, no,
not at all, not at all. No, it's actually close
to where I live, so yeah, really scary. Yeah, I
have friends calling me: you work in the news. What's going on?
Speaker 10 (12:28):
Help?
Speaker 9 (12:28):
What do we need to know?
Speaker 2 (12:30):
Yeah. Huh. Was it... I mean, could you smell it
at your place? Was it stanky? They were saying, like, well,
the air quality tested
Speaker 9 (12:38):
Okay, the air was blowing the other way from me,
so I was fine. But yeah, my friends, my friend
said they could smell it. For one of my friends,
it was irritating her eyes.
Speaker 3 (12:47):
Oh okay, did she have red, itchy, irritated eyelids?
Speaker 9 (12:51):
That's what she said.
Speaker 2 (12:52):
Yeah, it could be caused by mites.
Speaker 9 (12:55):
Was that Demodex something, you know, blepharitis? Something irritated...
eyelids! I love that commercial so much. It's cheap.
Speaker 3 (13:07):
Every time it comes on, I tell you, Kayla is
producing remotely today, And every time it comes on, I'm
just like, hey, Kayla, red, itchy, irritated...
Speaker 2 (13:16):
Eyelids! I hate you.
Speaker 9 (13:19):
Yeah, the songs get stuck in your head.
Speaker 2 (13:21):
I love it so much. All right, here we go,
your talkbacks.
Speaker 10 (13:24):
I don't use AI in my job. I'm a truck
driver and I haul chemicals, so there's no use for
AI here. I use it for other things, but yeah,
not for my job.
Speaker 3 (13:35):
Now, I bet that the people at
his office use it for things, though. That's where they
say AI is the biggest threat to
jobs: office jobs and middle management. Entry level and
middle management, they say, is where the biggest threat is.
And all I can think is, well, don't you think
(13:57):
AI could easily replace the CEOs? Unfortunately, it would have to
be the CEOs that would decide to replace themselves with AI.
Speaker 2 (14:04):
And they're not gonna do that.
Speaker 3 (14:06):
You ever notice the CEOs, when it comes time
to right-sizing, reductions in force, the people
Speaker 2 (14:14):
On top never get reduced? Ever. All right, from the talkbacks.
Speaker 11 (14:19):
Chris, hey, I've used AI to get winning lottery and
Keno numbers. That didn't work.
Speaker 2 (14:26):
Oh that's too bad.
Speaker 11 (14:28):
Watch the original nineteen seventy-three Westworld movie with
Yul Brynner. Watch the nineteen sixty-eight Two Thousand and One:
A Space Odyssey.
Speaker 3 (14:38):
Oh yeah, that's HAL. I can't do that, Dave.
Speaker 11 (14:42):
Watch I, Robot with Will Smith, and you have your answer.
AI is the devil. Oh, good night and good luck.
Speaker 2 (14:50):
Thank you, Oh, good night and good luck.
Speaker 12 (14:54):
Yeah.
Speaker 3 (14:54):
But there's also been a number of films where AI
has been good.
Speaker 2 (15:04):
Truth be told.
Speaker 4 (15:05):
Here.
Speaker 2 (15:05):
I've got another story. Let me see, where is it?
Where is this? Oh, I had it scheduled for
the next segment. I'll do it now.
Speaker 3 (15:12):
A story about how AI can design toxic proteins and
they are escaping through biosecurity cracks. A research team patched
one vulnerability. They say safeguarding against the next ones will
require constant vigilance. This is from the Washington Post. So
scientists are discovering a vulnerability in the safety net intended
to prevent bad actors from using artificial intelligence tools to
(15:34):
concoct hazardous proteins for warfare or terrorism. In other words, toxins,
bio warfare. AI can accelerate the development of these weapons,
and that can be used against us. I don't think
that the threat of AI is AI itself. I think
we are far more likely to use AI to wipe
(15:56):
ourselves out. That's where I think the threat is. I'm
not concerned about AI becoming sentient and deciding it's gonna
protect itself and wipe out all of humanity. I'm concerned
about all of humanity using AI to wipe out all
of humanity. That's my biggest fear. We'll get
more of your thoughts here. The talkback question: do
(16:17):
you use AI to cheat at work or anything else
that you've got thoughts on with AI. I love that,
and Mark Cuban says he loves it. AI is the
great democratizer. I'll tell you how he's a bit contradictory
in his own thoughts, and he's a guy I actually like.
Speaker 2 (16:34):
But that's next. I'm Chris Merrill, in for Mo'Kelly.
Speaker 1 (16:38):
You're listening to KFI AM six forty on demand.
Speaker 3 (16:41):
You never disappoint. I love it when
we get a chance to talk. I love this
talkback on the iHeartRadio app. I just love
it because I ask you, do you use AI to
cheat at work? Or you know, maybe you're using it
somewhere else in your life. Just curious about how you're
using it? And you guys have given some great responses
so far.
Speaker 2 (17:00):
How else?
Speaker 3 (17:00):
And I'm going to tell you what Mark Cuban uses
it for here in just a moment too. But I
think Mark Cuban is missing some key details of the
AI stuff.
Speaker 6 (17:09):
Hey, Chris, I don't use AI for work because it's
not good enough. Oh, but I'll tell you when I
think it will be good enough.
Speaker 2 (17:16):
I kind of feel him.
Speaker 3 (17:17):
I kind of feel that. I've tinkered with it this week,
and I feel like it's not quite ready for prime time.
Speaker 6 (17:22):
And that's sometime in twenty twenty-eight. Why? After
Nvidia releases their Feynman GPU.
Speaker 2 (17:30):
Tony, is this guy talking in geek? Sure sounds like
it. I love it.
Speaker 3 (17:37):
I love talking geek. Oh my gosh. Well, you want
to get my attention? You start talking in geek,
or you start talking about power tools,
and I am in. Okay, go ahead, all right. What's
the Feynman GPU?
Speaker 6 (17:51):
Because the Feynman will be able to do actual reasoning,
not simulated reasoning like the current GPUs. Oh, it's
potentially scary. Yeah, kind of a game changer.
Speaker 2 (18:05):
Total game changer.
Speaker 3 (18:06):
So if it can reason, then it could reason how
it should teach itself.
Speaker 2 (18:11):
Oh, talk geek to me, talk geek to me. I
love that. Thanks for that call.
Speaker 7 (18:19):
ChatGPT has the same feature as the Tesla. I
just use it on my phone, and I've had lovely conversations.
Speaker 2 (18:27):
Oh, it's like you, Eileen. He's doing the same thing,
talking to his car.
Speaker 9 (18:30):
Yeah, but he probably does it through his phone.
Speaker 2 (18:32):
He said he uses ChatGPT through his phone.
Speaker 3 (18:35):
Oh no, he said he's got ChatGPT
on his phone. It does the same thing as what
his Tesla does, because the Tesla's got Grok.
Speaker 9 (18:40):
Right. Yeah, I think mine's cooler because it's kind of
like, what was that, KITT? Oh yeah, yeah, exactly.
Speaker 2 (18:49):
Very good. What was it? What, Knight Rider?
Speaker 3 (18:54):
No. Knight Industries. KITT stood for something: K. I. T. T.
Two Thousand.
Speaker 2 (18:58):
Oh, is that what it was? Yeah, Knight Industries Two Thousand. That was it.
That was it?
Speaker 3 (19:02):
See, Tony speaks geek. I watched the show, I know.
Beautiful, ain't it? That's one of those shows you watch.
CHiPs is another one. You watch those shows from like
the seventies and eighties, and then you see stars that
were making guest appearances, but they weren't stars yet.
Speaker 2 (19:22):
So I'm flipping through. I don't know.
Speaker 3 (19:24):
It was late night, and I see Knight
Rider was on, and I started watching it, and Geena
Davis was the girl who needed Michael Knight's help,
and I was like, this is great.
Speaker 2 (19:34):
It's Geena Davis. Nice.
Speaker 9 (19:36):
I love Geena Davis, I know.
Speaker 2 (19:38):
But she hadn't been you know, we didn't know her then.
I just love it. I love those old.
Speaker 9 (19:41):
Shows before they were stars.
Speaker 2 (19:43):
Yeah, it's perfect with AI.
Speaker 7 (19:46):
For long cross country trips. Although it does limit the
amount of time.
Speaker 2 (19:50):
You could talk to it.
Speaker 7 (19:51):
You get a few hours at a time, but it'll
tell stories. It'll give you interesting information about the roadside,
all of that stuff is possible. So it's a great
tool for that.
Speaker 2 (20:01):
That's cool.
Speaker 3 (20:02):
Oh yeah, ChatGPT has limitations unless you pay for it,
right yeah.
Speaker 9 (20:08):
Yeah, it says you've reached your limit, then you have
to pay. ChatGPT, yeah. You get, like... no, I
think it comes with my Tesla. With the Tesla, well,
I pay, like, I don't know how much a month
for a few extra features, like the fart features and
all that. Yes, my car farts.
Speaker 3 (20:24):
Hold on, hang on, Eileen. Yes, you're a respected journalist,
"my car farts," in the second largest media market.
Speaker 9 (20:33):
I make my car quack. I choose the quack over the fart.
Speaker 2 (20:36):
But my concern is not that the car makes
the fart noise. My concern is that you're paying
for the car to make fart noises.
Speaker 9 (20:43):
I pay for that. I pay for some radio features,
and that's fair.
Speaker 2 (20:49):
I like that.
Speaker 9 (20:50):
There's more that comes with it.
Speaker 2 (20:51):
Okay, you have subscriptions in your cars. What, you're calling...
"Chris, great show." You hear that, everybody? One more time.
Let's hear what he says. "Chris, great show." He said
great show.
Speaker 3 (21:04):
So I needed that, because Kayla sends me all the
ones where people tell me how much they hate me too,
and I have to listen to those.
Speaker 2 (21:14):
But I don't blame 'em.
Speaker 12 (21:15):
To respond to your AI question: okay, yes, I use
AI pretty much every single day to help expedite what
I do in writing emails, analyzing reports, et cetera.
Speaker 2 (21:30):
It's a great aid to.
Speaker 12 (21:31):
Have, and on top of that, most of my colleagues,
if not all, use it, and so it's the only
way I can really keep up with the work.
Speaker 2 (21:39):
So there you go, there you go. Okay, So it's
the only way you can keep up with the work.
Speaker 3 (21:43):
And this is why I'm tinkering with it, because I
feel like if I don't, I'm going to get left behind.
And I have to tell you this: it's one of
those deals where I'm getting to that age where,
when the new technology breaks, I'm really hesitant to embrace it,
but I'm so concerned that if I don't, I'm going
to get left behind. Mark Cuban sat down with Axios
and he was talking about AI and they were asking
(22:05):
him what he thought about it, and he said that
AI was the great democratizer. Now how do you come
with that? So he's doing this conversation with Axios and
they said, what do you mean by that? And he said,
right now, if you are a fourteen to eighteen year old,
you are you're in not so good circumstances. You have
(22:27):
access to the best professors and the best consultants. It
allows people who otherwise would not have access to any
resources to have access to the best resources. In real time,
you can compete with anybody. So the thinking is that
AI is a tool that all of a sudden, we
all have the same tool. It's not like your tool
(22:49):
is better than that tool and that it gives you
an advantage. We all have the same tool. The problem
is, if you're using AI to learn, if you're using
AI like a twenty twenty-five version of Good Will Hunting
to give you the education that everyone else can have,
you're accessing the best professors and consultants, but
they're only the best in fields where humans are no
(23:11):
longer needed. AI is not going to train you on
a better technique to run a backhoe, right? It's
not going to train you on a better technique to
be a long haul driver. These are jobs that are
going to be necessary in the future that can't be
replaced by AI. These are the jobs that AI cannot
(23:33):
train you on. So basically, if AI can teach it,
AI can do it, and you're redundant. Now that's not
to say you shouldn't expand your knowledge base, you shouldn't
absorb as much as you possibly can. I love that
aspect of it, but just be cautious that if you're
learning something that AI can do, then why does it
need you. Mark Cuban goes on to say, there's nothing
(23:56):
I don't consider using it for. So how is
the playing field being leveled? So instead of,
in the real world, competing against one
hundred other qualified applicants, now you have to compete against
one hundred other qualified applicants for that job and one
applicant who will work twenty four to seven at a
fraction of the cost and no office drama. So how
(24:17):
is that better for the other one hundred people. Sure,
you may be on the same playing field now as
the other one hundred, but there's one, number one-oh-one,
the AI, that outshines all of you in those fields.
They asked him about whether or not we're in the
AI bubble and is it comparable to the Internet bubble,
(24:39):
and he says no. The difference is, the improvement of
technology basically slowed to a trickle. Talking about AI, he said,
we're nowhere near the improvement of technology slowing to a
trickle in AI. So it slowed when we talk about
the Internet, excuse me, but it's not slowing when it
comes to AI. I guess I would argue that means
it's a bigger bubble. When he uses it, he says,
(25:01):
in terms of my health workouts, I use it all
day every day, just feeding in information and asking for
feedback because it builds up the memory and it starts
to know me the way people go to a doctor
for a second opinion.
Speaker 2 (25:11):
I do the same thing with.
Speaker 3 (25:12):
AI. All right, so I guess I can trash my
P ninety X VHS tapes now, because AI will now
do it for me. Tony Horton and Billy Blanks are
out of luck. Sorry, fellas. Tae Bo is so last year.
Speaker 2 (25:28):
Now I've got Grok Bo.
Speaker 3 (25:33):
There is controversy stirring in our own backyard when it
comes to AI, and Hollywood is not happy.
Speaker 2 (25:37):
That's next. Chris Merrill, in for Mo'Kelly, tonight.
Speaker 1 (25:39):
You're listening to KFI AM six forty on demand.
Speaker 2 (25:43):
I'm not Mo'Kelly.
Speaker 3 (25:44):
I'm Chris Merrill, in for Mo tonight, KFI AM six
forty, more stimulating talk. And remember, you can always listen
anytime on demand on the iHeartRadio app. While you're on
that app, go ahead and hit that talk back button.
Our question tonight do you use AI to cheat at work?
But also do you use it anywhere? And you guys
have been great, Honestly, I don't think we've had any
(26:06):
bad calls tonight, have we? They've all been really good,
insightful. Even people that are like, no, I don't use it,
here's why. Others that are like, I won't use it,
here's why. I love that. You
guys are great. You're dynamite. However, there is a big
to-do brewing. You've probably heard about this: Tilly Norwood.
ABC News did a report on her.
Speaker 13 (26:26):
Tilly Norwood is beautiful, she's cheap, and she always nails
it on the first take.
Speaker 2 (26:31):
Yeah, she is smoking hot.
Speaker 9 (26:32):
My teen's a binary.
Speaker 13 (26:34):
But even though Hollywood's hottest new name may have the
IT factor, it's her AI factor that's drawing fire.
Speaker 14 (26:40):
They are so greed driven that they don't think
about consequence.
Speaker 13 (26:45):
Actor Fran Drescher just wrapped up a term as president
of SAG-AFTRA, the union that represents Hollywood.
Speaker 2 (26:50):
Actors and their radio hosts.
Speaker 13 (26:55):
Norwood is definitely not an actor.
Speaker 2 (26:57):
Tilly Norwood, AI generated.
Speaker 3 (27:01):
Also, the people that are doing this, that are introducing her,
they're all AI generated too. They don't tell you that,
but they're all AI generated. I can tell because I've
seen enough AI videos and they all have the same style,
same mannerisms.
Speaker 13 (27:12):
She's an AI created character that Particle six Productions is
now pitching to studios.
Speaker 9 (27:18):
Three seasons on the podcast.
Speaker 14 (27:20):
It's a threat to everyone. It's a threat to writers,
it's a threat to lawyers, it's a threat to doctors.
I mean, yes, it's a threat.
Speaker 13 (27:28):
SAG-AFTRA has been fighting for protections against AI for years now,
concerned that synthetic characters that don't require any kind of
human presence or voice will destroy the acting world.
Speaker 2 (27:42):
I mean it will.
Speaker 13 (27:44):
Today, the union wrote: SAG-AFTRA believes creativity is and should remain
human centered. The union is opposed to the replacement of
human performers by synthetics.
Speaker 3 (27:53):
Yeah, I agree, and I also don't want my job
to be replaced by AI. Right now it couldn't do it,
but in the future, I don't know that it couldn't.
And I'd like to think that even if it does,
you would know there's no real connection there, you know,
(28:14):
like you and I connect.
Speaker 2 (28:15):
I don't know that AI would actually be able to
connect with you.
Speaker 3 (28:19):
When you're driving in the car, you hear somebody that
sounds real, you're talking about things that matter to you, right?
it'll get better at picking topics, angles, all that kind
of stuff that we do, But in the back of
your mind, you're always going to know.
Speaker 2 (28:32):
I had a friend argue with me about this. He goes,
who cares? This is overblown.
Speaker 3 (28:36):
Nobody got mad when Jessica Rabbit was on the screen,
and I thought, it actually took more people to create
Jessica Rabbit than it would have to hire an actress
to do that live, because you had to have artists,
you had to have voiceovers, you had to have you
know what I mean. You had to have more people
to create the animation than you did to just have
an actor play a part.
Speaker 2 (28:57):
So I think some people just don't get it.
Speaker 3 (29:01):
Speaking of Jessica Rabbit, Disney sent a cease and desist
to one of the AI companies. They sent a C and D
to Character.AI, demanding the personalized AI chatbot developer
immediately stop using copyrighted characters without authorization, Disney emphasizing its
main character, its main concern, excuse me, isn't just financial,
but that Character.AI's platform weaponizes Disney characters in a
but that Character AI's platform weaponizes Disney characters in a
(29:23):
way that could damage its brand long term. Yeah, this
is one thing I'm concerned with too, is that we
seem to have this laissez-faire attitude about copyrights when
it comes to training AI, and they're like, yeah, but...
Speaker 2 (29:35):
It's for the benefit of all, so it's okay.
Not really.
Speaker 3 (29:40):
I mean, there's a reason that copyrights last as long
as they do. There's a reason that we just got
our winning the Pooh horror movie, and that is because
Disney was protecting that. But now all of a sudden,
we're gonna go yeah, but AI can do it if
it wants no not okay, So Disney said, the report
that they got from Parents Together Action and Heat initiative
found character AI chatbots engaged in grooming and sexual exploitation
(30:02):
and emotional manipulation. That's not good. So they're obviously very
worried about that. As far as further developing video of
AI characters.
Speaker 2 (30:16):
TikTok has reason to be worried. There's a new,
Speaker 3 (30:21):
call it a social media time-suck app, that is
AI generated. So all those videos that you see on TikTok,
now that people are making stupid dance videos and stuff
like that, you're already starting to see some AI pop up.
Speaker 2 (30:35):
I am.
Speaker 3 (30:36):
I have lots of videos I send to my friends
of dogs flipping them off. But evidently, there's this new Sora
app that was just released by OpenAI this week,
and Meta released one as well as part of their platform.
But remember, Meta captures whatever you surf so they can
market to you. It's making AI generated videos, and it's
putting them in.
Speaker 2 (30:56):
A scrollable social media feed.
Speaker 3 (31:03):
So the next time you want to sit on the
commode and watch your brain melt, it's all coming from AI.
Speaker 2 (31:11):
Yeah.
Speaker 3 (31:11):
Now, let me tell you, if all of a sudden,
the TikTok influencers start losing money, they're gonna start coming
out against AI too.
Speaker 2 (31:19):
As soon as all of us.
Speaker 3 (31:21):
Start losing money on things, we're gonna go this AI
is not working for us.
Speaker 2 (31:26):
But will it already be too late? That's the question.
Speaker 3 (31:29):
I do a show every Sunday evening, and I would
invite you to join us this week. It'll be right
after the Chargers game. I think we're on at five
thirty after the Chargers unless it goes into overtime. And
so we do at six o'clock every week, and i'd
like to I'd like to bring it back tonight. We
do a segment called There's No Business Like and it
largely show business focused. I think MO tends to focus
(31:49):
on entertainment type stories in the nine o'clock hour as well,
so we'll do that too. There's No Business
Like, next. Chris Merrill, in for Mo'Kelly, KFI
AM six forty. We're live everywhere on the iHeartRadio app.
KFI AM six forty on demand.
Speaker 7 (32:09):
Mm-hmm.