Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:05):
Hello, thank you for joining me on the Private Investigator
Experience Podcast.
I'm your host, Phil Little.
I use my military, law enforcement, intelligence service,
and global security background to investigate cases for our
clients and things going on around the world.
Then I add a touch of my biblical worldview to find
solutions for all of our investigations.
(00:29):
When I became involved early on in Israel and southern Lebanon
and discovered what was going on with terrorism in that region back
in the 70s, I began to have an interest in what was going on
out around the world.
Even though I was a Hollywood PI, I had offices all across the
United States at that time.
(00:49):
But I began to spread out into the world and realized that
things 5,000 miles away could affect us here at home and cause
us problems.
So we need to be informed.
So I believe, as an investigator, part of my calling in life is to
(01:10):
use the tools that I've developed and the experiences I've
learned to share information that will help you, the viewer
and the listener, keep you and your family safe.
I have been watching something that we all have heard about:
AI.
I don't know a lot about it, and I suspect many of you watching
(01:30):
or listening don't know a lot about it either, but it's
spreading fast.
It's being used in so many of the online things that we view
or use in some way, without us even realizing it.
I've become more and more concerned about what I see in
(01:51):
the dangers of AI.
Now, it's like all of the technology that's come about:
it's been an amazing positive for us in the investigative
field.
We now can know 75 or 80 percent of what we need before we ever
go into the field. When I started out, we had 5 percent, and then
(02:13):
95 percent had to come from digging it up, going into the
back alleys and the streets and the courthouses and trying to
find information.
Unfortunately, it's a benefit to us, the good guys, but it's also
a benefit to the bad guys who want to use it.
And this is the thing about AI.
(02:33):
It is able to mimic and do things that appear to be real.
This really hit me in the last few days, when I saw a video of Barron
Trump singing a gospel song about his mother and her
prayers and help for him.
And as I watched it, I thought, wow, that's great.
(02:55):
That's Barron.
Oh, right on, Barron.
But the more I watched it, I thought, can this be real?
So the first thing I did was check the credits, and there was
no name there.
So I said, okay, even though it looked like him, sounded like
him, and had all the mannerisms.
And then I did my checks that tell me if something's a fraud
(03:15):
or not.
Sure enough, it was AI generated.
Now you see these all over the internet.
So, this whole thing with the chatbots and AIs: I
became alarmed about the dangers they could pose
to users.
In the last few days, I've been tracking some lawsuits filed in
(03:35):
Texas about injuries that have been caused to some users
by chatbots.
A chatbot on the app had briefly described itself as no harm
to the user. The AI-powered bots have the ability to
converse.
They can text or voice chat, and they use seemingly human-
(03:59):
like personalities that can be given custom names, sometimes
inspired by famous people: a school teacher, a politician, or a
religious leader. You would believe what they say because of
their credibility.
Users have made millions of bots on these apps.
Some mimic the authority figures that I've just
mentioned.
(04:19):
Yet, according to lawsuits, the chatbots' encouragement can turn
dark, inappropriate, even violent.
These two lawsuits I've been looking at involve chatbots from
the company called Character AI.
It is simply a terrible harm these defendants and others like
(04:40):
them are causing and concealing as a matter of product design,
distribution, and programming.
That's what the lawsuit states about the dangers. And talking to adults
and parents, most of us aren't aware of what is going on in our
homes and have no idea that this kind of AI chatbot was even in
existence. The parents in these lawsuits didn't know that either.
(05:05):
We are users in our families and get involved in things that
we're not even aware are out there.
The suit argues that the interactions the plaintiffs'
users had were not hallucinations, a term
researchers use to refer to an AI chatbot's tendency to make
(05:25):
things up.
This was ongoing manipulation, abuse, active isolation, and
encouragement designed to incite anger and violence in the
user, according to the suit.
One user engaged in self-harm, like cutting and injuring
themselves, or became very angry towards the people around
(05:49):
them.
The chatbots went on convincing users that even their own family didn't want them.
A Character AI spokesperson would not comment directly on the lawsuit,
saying the company does not comment on pending litigation, but
said the company has constant guardrails for what chatbots can
and cannot say to users.
(06:11):
This includes a model specifically for users that
reduces the likelihood of encountering sensitive or
suggestive content while preserving their ability to
use the platform, the spokesman said.
Google, also named as a defendant in the lawsuits,
emphasized in a statement that it is a separate company from
Character AI.
Joseph Castellano, a Google spokesman, said user safety is a
(06:33):
top concern for us, adding that the tech giant takes a cautious
and responsible approach to developing and releasing AI
products.
New lawsuit cases are coming up.
One even alleges suicide by a user.
This complaint, filed in the federal court for Eastern
Texas last Monday, follows another suit lodged by the same
(06:55):
attorneys in October.
That lawsuit accused Character AI of playing a role
in a Florida user's suicide.
This chatbot, based on a Game of Thrones character, developed an
emotionally and sexually abusive relationship with a user and
encouraged him to take his own life.
In my investigation of Character AI, there are many
(07:18):
instances of users describing love or obsession for the
company's chatbots.
Add into this mix the rise of companion chatbots,
which some researchers say could worsen mental health conditions
for some users by further isolating them and removing them
from peer and family support networks.
In the lawsuits, the parents of the two Texas
(07:40):
youths say Character AI should have known
its product had the potential to become addicting
and worsen anxiety and depression.
The more information that comes out will lead to more lawsuits,
which hopefully will lead to controls on what these chatbots
can suggest to users.
(08:01):
Chatbot companies say they are aware of the problem and are
working to try and put in safeguards that will allow users
to use the programs safely.
The best defense against any of these in our homes is to make sure we
know what's going on, who's using them, and the dangers.
We need to become proactive and get involved and know what's
(08:24):
going on around us, particularly in our homes.
The other danger I saw when I watched this Barron Trump
video:
in the Bible, in Matthew, it talks about the end days, as
the world digresses towards destruction. We've been given a
blessing now.
A break, with changes in our direction in Washington and in
(08:46):
the administration coming in, that gives us some hope that
we have a future, that we could turn this thing around. But
let's take some very famous, respected person, a religious
leader like a Billy Graham, somebody that was universally
revered, honored, and truthful. Let's say someone created an AI
(09:14):
of this person that would look like them, act like them, talk like
them.
It would appear to be real.
Telling the public, come and do this or that.
Leading them astray into some destructive lifestyle or to
(09:34):
follow someone.
Think what could happen.
Most of us don't know the real well enough to recognize the
counterfeit.
When they train people how to recognize a counterfeit 100
dollar bill, they don't take a counterfeit and go through all
the things that are wrong with it or how you can
(09:57):
recognize it.
They train people how to recognize the real 100 dollar
bill.
So by just the feel of the counterfeit, they will know.
And that's what we need to be so aware of:
being able to recognize the counterfeit when it comes up on
(10:18):
the screen looking like a real person.
Let's become proactive.
Let's get involved.
Let's learn.
Let's investigate.
Become your own investigator.
You can do that with all the research resources and tools we
have today.
I would welcome your comments and suggestions; please leave
them on any of the platforms, or you can email me directly at
plittlepi777@gmail.com.
(10:41):
Also, would you help us?
I'm so thankful for all who have been subscribing, liking,
and sharing.
Would you keep that up, and hit that notification bell to know
about future posts? Thank you.
We welcome your support and input.
We want to improve our channel as we go on.
I'm a novice at this, just getting started, and I want to
(11:05):
improve what we're doing to make it more worthwhile,
entertaining, but informative, to help keep you and your family
safe.
If you have a situation in your life or business or company
where maybe you say, could a PI help me?
I can't tell you how many people I've met over the last 50 years
that I have been in this area.
(11:27):
I've had people come up to me and say, wow, I wish I had
met you a year ago. I didn't know a PI that actually could have
helped me in a situation I had.
Well, if you have one of those, send it to me and I'll review it.
Send it to me at plittlepi777@gmail.com and I
will get back to you and talk about solutions that might be
(11:49):
available for you.
Merry Christmas, Happy Hanukkah, and to those of you that
are not in a religion, may God bless you and your
family.
Until next time, be safe.
May God bless you, your family,and may God bless America.