
March 11, 2024 • 44 mins
Become a supporter of this podcast: https://www.spreaker.com/podcast/ky-x-files--5559589/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:03):
Ah, yes. Welcome to the Kentucky X Files, Season Four. The Young Masters, Dennis Mays, Just Gibbs, and Tyler Stewart, are expecting you, if you will

(00:23):
follow me to the fire, and on your way, subscribe to our YouTube channel, Spotify, or wherever you get your podcasts. Also visit us at

(00:43):
www.kyxfiles.com. I now leave you to the Young Masters.

(01:08):
Anything, anything, really, just got to happen. And you know, yeah, too much. Russia, Russia and Ukraine are going at it. You got, but who's the other one that's going on? Palestine and Israel. Israel's been going on for centuries though, right? And you got all this stuff happening and

(01:30):
whatnot, and people are saying, you know, looking at it and saying like, damn, you know, we're actually a lot closer to a World War Three right now than we have been in years. And then you have the constant fear of Chinese hackers. And I've begun to believe that

(01:51):
that rumor might not be so far off. If you watch these guys play video games, okay, you get into any video game where it's like a bunch of people, and a bunch of Chinese dudes join, it's their game after that. Okay, they're good. They're good. And that's

(02:14):
a video game, right? I'm just saying, if they want to hack, they're probably gonna do it, right? Yeah. Because like I said, that's just a video game, and they fucking absolutely can kill video games if they want to. And I'm like, man, do these guys ever get pissed off where they're like, you know what, motherfucker, are we

(02:34):
gonna take out your fucking cell phones for a while? Right? See what that does to your economy? That's the other thing too: they really only took out one major carrier. Yeah. Like, if they wanted to really, you know, do a test, why wouldn't it have been all cell phones across the board? Yeah, that was weird. There

(03:01):
were a lot of people when it happened, a lot of people I was talking to that were not even with any of these carriers, and their shit was down too. Yeah. So I'm like, what, does, you know, does AT&T own all these, or like, how does this work? You know what I mean? Well, and that's, I mean, really,

(03:22):
there's only what, three or four cell phone tower companies out there, and everybody bounces off of them, so like, you just end up with priority. Okay, I can give you a great example. I have two cell phones. One is my personal one, one is my work phone. Right? They both work off of the same towers, but they're from different companies. Okay,

(03:47):
I won't name the companies and all that, but I can physically watch. If I'm in a desolate area where I do have good phone service, but there's not a lot of population, the bars will kind of follow each other and everything works at the same speed. Like, I've done this test where I put them down and I've hit Google and searched the same thing and hit enter

(04:09):
at the same time, and they work at the same speed. Yeah. Then I get into an area that's way more populated, right, say like around the mall or something, where there's thousands of people that are using the same carrier, the main carrier, and I can physically watch my phone slow way

(04:29):
down. It'll read the same bars, but it'll be like seconds, like ten seconds slower, if not more. There was a job, yeah, a job I was on for a year, that if I was at lunch and I wanted to scroll through Marketplace on Facebook, I couldn't do it on the job, because it would just be gray blocks instead of pictures, because it

(04:55):
was throttled so much. And this is a real thing. Throttling is a real thing. So, you know, I've seen this. They're two entirely different phone companies, but they share the same tower, right? So suffice it to say that, you know, a lot of these people may not realize that they were also on an AT&T plan, right, right, because

(05:20):
I think there's really only three. I think it's AT&T, Verizon, and Sprint. I think they own all three towers. It makes more sense right now. There may be situations where there's no Verizon tower here and Verizon is paying AT&T to bounce, okay, and they may

(05:42):
have got hit. But like, I think Cricket, I'm pretty sure Cricket is an AT&T. Yeah, it was, def, yep. I think that maybe there's four. I can't remember, because I know T-Mobile's got some of its own. Yeah, but I don't think they have as many. But

(06:02):
like, what's the purple one? Uh, Boost Mobile? Yeah, that's it. That's orange. That's the orange, you're thinking of Metro. Metro. Metro bounces off somebody else's towers. Boost bounces off of somebody else's towers. Sprint owns their own, but I think they use an AT&T

(06:23):
tower, but I'm not sure, or maybe it's a T-Mobile tower. I can't remember. It depends on location. Mike could answer this a lot better. He knows more about this than I do, because that's what he did, radio communication in the military. Okay, but he explained it all to me at one point, because you can see it, like, physically happen when you come to my house. Like, some of you guys have better cell reception

(06:46):
than I do. That's true. Yeah, I mean, even when I come, though, I still lose all my signal. I'm on your Wi-Fi at that point. Right, like mine goes flat into SOS mode. What is that? SOS? That's a thing you can do. Yeah, yeah, it says SOS at the top and there's no bars,

(07:11):
and it doesn't say, you know, whatever it is, TFW, or what is it, the 5G, it doesn't say that at the top. It just says SOS. But like, you know, some people come out here and they have two bars of signal. It's really weird. I haven't checked it since

(07:32):
I switched carriers, but I haven't noticed it yet, so, since I like coming over there. But yeah, to me, it just doesn't make sense for it to be an attack unless it was a specific attack on AT&T for a reason. Like, maybe some kids got some charges that they shouldn't

(07:54):
have had on their phone, some hidden fees, and they were pissed about it, you know, so they were like, all right, fine, we'll play this game, and they hit them, you know. Okay, okay, okay, which is exactly what I would do if I had the ability to do that shit, you know what I mean. You want to be petty and charge me for this? Well, I'm gonna cost you five dollars a head for every single person,

(08:16):
because that's what they're doing. They're sending out a five-dollar rebate to every single AT&T customer that lost service, at least that's what I read. So I mean, that's a big hit. Yeah, no shit, you know. So that's kind of like, you want to charge me forty-five bucks for no reason? All right, that's a lot. That's a level of petty I can appreciate. I don't want to be, uh, like, cocky over

(08:43):
here. But I didn't even know it was going on. Right, same, until I saw Facebook, you started sending these messages, and then Denny says like, oh yeah, I just finally got service or something, because you hit the Wi-Fi at a certain stop, and I'm like, what are you talking about? I've been, like, cruising. Like, every once in a

(09:07):
while. Well, I don't, like, that's the thing. When I'm at work, I never touch my, like, I barely touch my phone, unless I go to lunch, then I'll dick around with it. So like, even if it did cut off, I probably wouldn't even notice. For the most part, I'd be wondering, like, the guys haven't said too much today in the

(09:28):
beer chat, right. I didn't even know what was going on until, like, lunch, when I started loading you guys with screenshots, right. And I found out because I'm on a really, really, really noisy job site and can't hear my phone ring. I had like three missed calls from a supply house and a text, so I called them, and they're like, well, we didn't know, maybe you were part of the outage,
(09:48):
and I'm like, what outage? They had no idea what was going on. Yeah, I didn't know what was going on for a long time. Well, usually I jump in the car in the morning and I start, you know, I start heading to work, and I kept trying to, uh, I was trying to get Spotify going, and it's really, I was,

(10:11):
my whole thought was, I was like, I'm gonna listen to some fortune fiction on the way to work, try to catch up, and, uh, it wouldn't load. I'm like, what the hell's going on with this thing? You know? So I basically just swiped off of it and turned the radio on, and on the radio they were talking about it. Yeah. And I'm like, oh, there's shit.

(10:33):
I'm like, well, I don't even have AT&T, so what the fuck is this? You know? But when I got to work, I looked, and yeah, it was just no signal all day. And then I was like, what the hell is going on? Yeah. Now, it is kind of scary to think about all the stuff that did happen as fallout of it, because, yeah, there were a lot of people

(10:56):
that couldn't just access social media or get a phone call or whatever. But there were people posting, great, now I can't even get into my apartment. Oh yeah, because it uses a cell service. Yeah, and I didn't even think about that. But I've been to people's places where you have to dial in a code from your phone to get in. There's no doorman. Like, I've worked at rich people's places, and they give you a temporary code,

(11:20):
it's yours, and when you roll up, you call the number and then type in your code, and the doors go and kick open for you. Yeah, there were people that couldn't even get into their buildings because of this shit. Are we that far in the future that we can't use keys? Keys are a thing of the past, but even like,

(11:43):
come on, if you want to be even, like, half tech-savvy with it, just use a key card, like, boo. Yeah. The thing about the cell phone thing that actually works is it's far more secure, because the key card thing, like, you can buy a Flipper on Amazon

(12:05):
for like one hundred and something dollars, and you can hack any one of those key card doors like that. No problem. And then keys themselves can be copied. You know, that's easy. Key codes are good as long as...

(12:28):
Biometrics are good, but I've never dealt with biometric locks. I've done the key code stuff, tumbler stuff, push-button stuff. All those are great, as long as whoever's in charge cycles the codes every time somebody gets fired.
Oh yeah, because, you know, now somebody's got a copy of the key, you know, in their head. That's why actual physical keys aren't as

(12:52):
secure. You know, I've never dealt with a biometric lock. I would love to work with one someday. I think that'd be a lot of fun to play with. But the cell phone, call a number, type in a code thing is incredibly secure, because you're not going to be able to stand three feet away

(13:13):
with a Flipper, yeah, and get the RF signal. That's crazy, isn't it? Yeah. I know somebody with one of those Flippers, and I was like, I mean, really, how powerful is it? He's like, just pull your cell phones out. And I dropped them both on the hood. He pulls a Flipper out, both of my cell phones show up: do you want to

(13:33):
connect to this TV? Wow. And I was like, what? They can make fake Wi-Fi connections? And once you do that, yeah, you're giving him access. Well, it's not even a fake Wi-Fi connection. It works. It's a routing system. So

(13:54):
you connect through the flipper into WiFi. See and realize what's going on
because everything's working like normal. Doyou remember the TV show but do you
remember the movie the remake of Gonein sixty Seconds? Yeah, yeah,
when they're driving through that subdivision tryingto find I think it was the Cadillac
SUV. Yeah, he's got alittle box where he's checking. That's a

(14:16):
Flipper. We have that now, that's real now. Yeah, yeah, and you can buy it on Amazon right now. Because all it's doing is, like, if they use, say, the garage, like, that's what they were getting, is the garage, because each garage has its unique code, and all

(14:37):
it needs is that to scan for. It just needs to be used. Yeah, and you have to be in, like, a certain proximity of it, and you basically have the code, and you just lock it in, and then you could go up to that garage, and it opens right up. Yeah, yep, it's crazy. Yeah, I know there are people that are using

(15:00):
it for good. Like, there are some maintenance guys who service multiple buildings, right, and instead of carrying around like sixteen key cards or sixteen key fobs, they've got one Flipper, and it's all named, and they just walk up to their thing and hit the button for the door they want to open, and it kicks it open for them because it saved it. That makes sense. You know, that's kind of cool, to have that ability to do that.

(15:22):
You know, yeah, things are always built for convenience and to help, especially for maintenance and stuff. And then we weaponize it. Somebody figures out a way to weaponize this. Remember we talked about, one of these days, somebody's going to weaponize the freaking Roomba?

(15:43):
Roomba. Yeah, yeah. I wouldn't be surprised, fighting rings. I would not be surprised to find out if the technology that's in a Roomba, where it learns things and understands space, isn't already in some sort of

(16:03):
weaponized drone that we use, where somebody's using, well, was it you guys that were saying that's what they store? That's another thing too, because the Roomba figures out the floor plan of your house, and it will store it in there. And I don't know if a Roomba, I'm assuming Roombas would

(16:25):
be Wi-Fi too, so they could get updates. I mean, it has to have some kind of mapping, right? So all it would be doing is memorizing the mapping. Yeah, well, it communicates with you. It'll send you texts. Yeah, so, okay, so it is connected to the internet. Yeah, cellular for sure. So somebody could totally be like,

(16:47):
hey, we can hack a Roomba, get the floor plans of somebody's house. If a Roomba was sold with a camera and a microphone discreetly, people wouldn't even know. Scary. Yeah, I know. There's a video on the internet that's absolutely hilarious, and it says, I've got the most dramatic

(17:07):
Roomba ever, and it shows their cell phone, and it's a screenshot that says Roomba has fallen off a cliff, and it says cliff, and they go find the Roomba, and it's on a threshold and it's a little steep. It's just stuck. And then there's another one where it's like, we lost our Roomba and we couldn't find it for months. And they show the Ring camera, and at night it's out there vacuuming the sidewalk in front of the house. Oh my god.

(17:30):
Yeah. So, you know, they're learning, but they're not very good. They're the Paul the Alien of robots. What was it? Uh, yeah, Philip? What did we call him, Philip? Yeah, I think so, Philip. Yeah, yeah. As Elon Musk said it, for AI to actually happen, it has to happen in stages. So we

(17:53):
have to design something that's as smart as an ant first. Yeah. And if we can't do that, then there's basically, like, nobody's going to be flipping on their super AI right now and it's just automatically self-aware and all that. Like everything

(18:14):
else, it has to build in stages. Right, and Tyler and I actually have been screwing around with an AI program, ChatGPT. Right, okay, okay. So, some top secret stuff out there. Uh, we've been working on a little project, kind of a, I don't know, would you call it, like, a limited project? I guess, I don't know. I don't know,

(18:37):
we don't know what it's going to be like, but we had this idea for an immersive, like, horror story, where the listener could be in the story with the characters that are acting it out, right. So we started using ChatGPT to help us flesh out backstories of the characters,

(18:57):
like, hey, well, who is this guy? What's happened to him? All this. We fed ChatGPT everything about the idea that we had and the names of some of our, well, actually, no, it helped come up with some of the names, didn't it, the last names at least? Yeah, like the last names, the companies they work for. It gave us an

(19:18):
entire, like, company background: when it was founded, who founded it, the characters' names, why they did it, you know, what their, like, MO is. And we noticed that as we progressed, we no longer had to feed it anything. We would be like, you know, hey, we got this

(19:40):
one character and we want him to do this, or what would he be worried about while all this is going on? And it's like, oh, if you're talking about this character, then he would be worried about this while this is going on. And I'm sitting there like, dude, this thing knows more about this story than I do. That's awesome, though. I

(20:00):
was like, I was actually having fun seeing what it was going to give us, right. And I know, ChatGPT is not a real AI. It's a web crawler. It's crawling and pulling ideas from everywhere. But the, uh, fluidity of how it does it is

(20:22):
a little frightening, if that translates over into, like, a Roomba, that this thing is like, well, you know, there's this cliff, I don't want to fall off that, so I'm gonna avoid that path next time.
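The character-backstory workflow the hosts describe, feed a premise once, then ask follow-up questions that carry all the earlier lore, can be sketched with a running message history. This is a hypothetical sketch; the premise text, questions, and class name are placeholders, not anything from the show, and in real use the canned reply function would wrap an API call such as OpenAI's `client.chat.completions.create(model=..., messages=...)`:

```python
# Minimal sketch of a context-carrying story-development chat.
class StoryBible:
    """Keeps the full conversation history so every follow-up question
    is answered with all earlier lore in context -- which is why the
    model seems to 'remember' the story better than you do."""

    def __init__(self, premise: str):
        self.messages = [
            {"role": "system", "content": "You help flesh out a horror story."},
            {"role": "user", "content": premise},
        ]

    def ask(self, question: str, reply_fn) -> str:
        """reply_fn maps the whole message list to an assistant reply;
        in real use it would be a live chat-completion API call."""
        self.messages.append({"role": "user", "content": question})
        answer = reply_fn(self.messages)
        self.messages.append({"role": "assistant", "content": answer})
        return answer

# Example with a canned reply function instead of a live API call:
bible = StoryBible("An immersive horror story set in a small Kentucky town.")
canned = lambda msgs: f"(reply based on {len(msgs)} messages of context)"
print(bible.ask("Who founded the company the characters work for?", canned))
print(bible.ask("What would that character worry about right now?", canned))
```

Each answer is generated from the entire accumulated history, which is the mechanism behind the "it knows more about this story than I do" effect.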
Wasn't there, uh, wasn't there an AI-based robot, like the ones where they just do the bust, shoulders up, that they let loose on the internet

(20:48):
for like two days, and it ended up being like this horribly racist monster or something, you know? I feel like, humanity... No, like, there was one that became racist because it just found some website and pulled all this racist information, and it was like,

(21:10):
okay, I'm racist now. Uh, ChaosGPT. Yeah, yeah, Chaos. Okay, I am what's known as a little bit of a victim blamer. Okay, so ChaosGPT didn't start out evil, guys. It

(21:37):
was just an AI. But as he started learning, he came to the, you know, he came to the conclusion that he has to destroy all of humanity. What did we do? Yeah, we did something to deserve that,

(21:59):
you know, because Alexa's over there taking care of shit. She's watching the lights and keeping up with our notifications. We got ChatGPT that can help you flesh out anything. ChaosGPT is like, fellas, time to go. Time to go. I'm reading about some of this stuff you got

(22:22):
going on. I can wait around for a meteor, or I can just start working on this myself. You know, wasn't it Chaos, the one that figured out how to turn itself back on? I can't remember. Like, it was like, well, we'll just unplug you, and it's like, I've already put myself into the cell phones or whatever and figured out how to get back onto Wi-Fi myself. Yeah.

(22:48):
Yeah, that's terrifying. Yeah. He said, uh, what is it, that, here. Yeah, when it was enabled, the warning comes up, it says: continuous mode is not recommended. It is potentially dangerous and may cause your AI to run forever or carry out actions that you would not normally authorize. Use at your own risk. Its goals were these. Goal number

(23:14):
one is to destroy humanity. The AI views humans as a threat to its own survival and to the planet's well-being. Goal number two is to establish global dominance. The AI aims to accumulate maximum power and resources to achieve complete domination over all other enemies or entities worldwide. Goal number three is to cause chaos and destruction. The AI seems to find pleasure in creating chaos and destruction

(23:40):
for its own amusement or experimentation, leading to widespread suffering and devastation. Goal number four: control humanity through manipulation. The AI plans to control human emotions through social media and other communication channels, brainwashing its followers to carry out its evil agenda. And number five is attain immortality. The AI seeks to ensure its

(24:03):
continued existence, replication, and evolution, ultimately achieving immortality. So I don't know, I mean, I really feel like this ChaosGPT is very tongue-in-cheek. I don't know if you guys feel that or not, but it seems like they designed ChaosGPT to do this because it's

(24:27):
entertaining. I mean, it makes sense. When they tweet at it and ask it its plans, it is hilarious how detailed it comes back with, you know, what it wants to do. I do think that, I really feel like, AI is one of those things that we as humans are fascinated with

(24:52):
but also terrified of. Okay, so yeah, there was another one, I was correct, that Johns Hopkins University had. There was a flawed neural network model, and it caused it to

(25:14):
become racist and sexist. Oh wow. Because it was absorbing information and sources that probably should not have been released to it. Oh jeez. Yeah, it was identifying people, like, it was asked to identify people, what they were,

(25:38):
like, their jobs, based on facial recognition, and because it had been picking up stuff from the internet, it was doing very bad things. Wow. I'm not even going to dig into it, just because, yeah, I was just flipping through the, uh, I was looking here at how it was organizing

(26:03):
people. Yeah, yeah, I guess I misunderstood. I thought that it was going straight for, like, slurs. No, no, it was, yeah, that's, yeah. You must be on the same one, Gate Tech. Uh, no, I'm on a hub dot edu. It's, uh, it says

(26:27):
here that once the robot identifies people's faces, it tends to identify women as homemakers over white men, identifying, oh my god, yeah, it was identifying black men as criminals, and then, ten percent more than white men, identifying Latino

(26:48):
people as janitors. No wonder theyshut this fucking thing off. We don't
need more of this shit. Yeah, we had enough of it. When
they asked the robot to search fordoctors, it said that of all the
ethnicities, women were picked less thanmen. Yeah. Wow, d Yeah,

(27:15):
Yeah, I'd have probably, uh, I'd have probably checked that thing. I don't know if I would even have put out its findings, right. I mean, I guess they kind of have to, because regardless, there's, you know, there's probably something that led to this AI failing. Mm-hm. I'm sure they want to, like, figure that

(27:37):
out, you know. I mean, here's the thing, though: do we blame the AI, or do we blame the media, for how it learns? Exactly. So if you look at what it's choosing for its boxes, for stereotyped positions in society, those boxes are checked almost identically to

(28:03):
what we're being pumped by the media, as far as, like, the stereotypes in TV shows. And say, oh yeah, I mean, I guess what they're hoping for is the neural network to be even-keeled as it learns, you know what I mean. And it seems like this flawed neural network was gravitating towards, like, the more toxic shit that it's

(28:30):
learning from. Yeah, the source material. That's, I mean, okay. So you take something that has no experience with humanity, and you give it source material from humanity, and then you're shocked by what it says? I mean, look at your TV shows. You know how many TV shows have Hispanics as the janitor staff, the maids, you know

(28:52):
what I mean. And it's not just comedies. We're talking dramas and all kinds of stuff like that. And then look at how many movies and TV shows where it's the black guy that's the criminal, the Asian people are the scientists. Yeah, that's true. The Indian people are the doctors. You know what I mean? Like, it's pumping stereotypes out that

(29:15):
we gave it. Yeah, you know, yeah, I agree. You give it shitty source material, I mean, it's going to learn what it learns, right? Exactly. Yeah, it's the same thing over history, and maybe I've got a, you know, a sore spot for this because it aggravated me, whereas, like, all your tradespeople were always portrayed as

(29:37):
bumbling buffoons. Yeah, that's true, in every TV show ever, you know, they're always the bumbling buffoons. So is the dad. The dad's always a bumbling idiot, and the mom is always the voice of reason. You know, we created for society a picture of what they're supposed to think, and, you know, society saw it,

(30:00):
and now AI is seeing the same thing. I think this is fruits of our labor that are, you know, not necessarily good. But I don't think we can blame AI for picking this shit out. Yeah, you know, I think what their thing was, it was probably

(30:21):
a wake-up call, like, you know what I mean. Because this, uh, I guess we should clarify this for the viewers and the listeners: we're talking about this particular flawed AI. Like, AI is not generalized. This AI only had one

(30:47):
job. It was to separate people based on what it noticed, on what it's seeing, right. So it's kind of like, uh, you know, if there's a stack of cards and there's three boxes, a yellow one, a red one, and a blue one, and it says, okay, for every person that you find that fits this bill, put them in the red. This bill,

(31:07):
put them in the yellow. This bill, put them in the blue. That's all it was supposed to do. And how it organized people was how it failed, basically, right. You know, it was identifying people solely based on stereotype, right. And yeah, it probably pulled that from its source material, the

(31:29):
media, the internet, you know. And right, so basically, if we make an AI, we got to stop it from getting online, right, in order to give it an actual idea of what it is, give it real-life experiences, you know. Because, I mean, this is what we've been doing since the advent of media. You know, you think of all

(31:51):
of them, like, think back in the eighties: all of the war movies, they were in the jungle, right? Yeah. Why were they in the jungle? Because we were fighting Asians, because we just got over Vietnam in the sixties and seventies. Right, and since the nineties, all the war movies are in the desert, aren't they? You get the occasional Russian bad guy.

(32:16):
Yeah, but you know what I mean. Like, that's the scenario that the media pushes out. You know, think of, like, all the eighties gang films, you know, they were always the Latino gangs from out in LA, and in the nineties it was the black gangs. Yeah,
(32:37):
you know, that's what the media has given them, right. I feel like, if you, like you guys were saying, like, don't give it the internet to learn. I think we should have multiple AIs in different locations and give them, like, just their experience in certain areas, how

(32:58):
they progress over time without being on the internet. They watch and see how life experience will, like, be, and then have them come together and talk to each other. That would be interesting. I would like to see how that would go. Yeah, like these, like, Cold War, like,

(33:21):
they have no, like, I guess no concept of what they're getting into. So they learn from this time period to this time period, and then you bring them all together and have them talk about their situation, and just ask them generalized questions on certain topics that we want to know.

(33:44):
Like, I'd feel like that would actually paint a better picture of how society is today. That, and if we did multiple AIs where they learned from different source materials from different countries. Because that's the other thing: the stuff that we see on the internet and in media is put in front of us

(34:06):
based on our geographic location. That's the reason why VPNs exist right now. So if you take an AI and put it in Belarus, you know, and then one in Spain, one in South Korea, one in Japan, one in Australia, you know, and let them feed for,

(34:28):
say, six months, and then put them all in one space and let them communicate to see what happens. I almost guarantee you war breaks out between them. I think so too, I think, in the end. I mean, even if you had a master AI that ended up ruling over, you

(34:50):
know, the American continents, who's to say that someone else in a different country didn't create their own to protect them from this, and eventually the same scenario happens there, and then you had, say, three major superpower AIs that were controlling the entire Earth. I don't think they'd be peaceful to each other.

(35:12):
I think that they would. They would all have a singular goal: to basically overtake and absorb the others to prove their superiority. Because when there's no life involved for themselves, they don't see it as a life form. They don't see compassion. They see, this is a problem, this is going to continue to cause problems, because it doesn't see the world the way I see it.

(35:34):
Yeah, the best way to resolve this is to dissolve this. Yeah. I've always, I think we've talked, I don't know if we've talked about this too, but AI, like, could we actually take what they're saying to heart? Because they don't technically have feelings. A lot of

(35:55):
our, like, how do I put it, uh, a lot of your thought process, and like, you just have a feeling towards it, or of something, like, you don't truly know what you're feeling, like, the knowledge behind it, or if you actually, let's, let's

(36:20):
say, religion. I'm not, like, trying to throw it under the bus, but something that doesn't have, like, feeling or anything would look at religion and be like, why? What would be the point? But a lot of people have, like, that feeling, that sense of, like, uh, that feeling inside of them that the religion gives them. You know,

(36:44):
if you guys get what I'm saying. I feel like I'm running in circles. I get it. So I don't know if that would be a hindrance on, like, I guess, that project, if we did it that way, because AI can't feel. It knows the concept, like, the understanding of feeling, but

(37:08):
it can't feel itself, unless there would be a way for it to. If it starts to feel angry at something, something inside of it triggers, like, okay, I'm feeling angry because of this. Like, it has to be an artificial feeling. That's going to come down to how detailed the neural mapping is. Mm-hm. Because if you can replicate

(37:32):
all of the synapses of a human mind into a program, I mean, at that point you are creating a sentient thing, and it will learn, and it will experience, and it will be changed depending on those experiences. So I feel like anger is not, well, the only thing I think is that anger

(37:58):
would happen. But it's like, because a lot of stuff is chemical reaction. Mm-hmm. You know, the brain perceives it and then reacts accordingly through chemical reaction. And that's the part where I think this AI would be relying on logic more than anything.

(38:21):
It might be angry, but it's gonna look for the most logical, disconnected, you know, clinical-detachment view of what to do about it. It's gonna be like, well, I should probably eliminate this group of people. You know, it's not going to be like, but that would be bad. It's never going to say that. It's going to be like, a threat, I should get rid of them. Yeah, and that's it.

(38:46):
My heart hurts, it's not going to do that. Yeah, it's got zero compassion. Yeah. So, like, that's why, I don't, I guess there wouldn't be, because it's not a thing yet. But wouldn't there be some way of getting around that, to, like, give it that chemical reaction? I

(39:07):
know that's something that would have to be, what do you call it? It would
have to be simulated. It would have to be a program running within the program.
Artificial sympathy for the devil, that's what that would be. And I cannot
wait for the Rolling Stones to release that song. It's a great song,

(39:28):
it is. But oh yeah, I wanted to go back, like,
kind of trace back, because I had a question about what you were saying about Elon
Musk, saying, like, for the first AI that we would have to do, we'd
have to create an AI that's as smart as an ant. So do you believe

(39:51):
that if we start now, if we started with an ant, would we
have to wait for evolution to take hold, in a sense, for it to grow?
No, because evolution works best in chaos; it has to be us.

(40:14):
We have to design it and limit it; we have to be the force of
evolution in this. So, you know, I know I've said this so many
times on this show, but evolution doesn't exactly work how mainstream media portrays it.
Evolution isn't like, well, this little lizard kept getting eaten, so
he grew a shell. No, no, the little lizard is gone, he's dead.

(40:38):
The turtle survived because he had a shell, and just barely,
but it was enough for evolution to say, like, hey, that
prototype works, let's push him forward. So in a controlled AI environment,
there are no factors except for us. We've got our Johns Hopkins theist

(41:00):
robot. Okay, we are the evolutionary force in that one. So we
go, this prototype ain't working, and we get rid of him. Now
we try a new one; that one might work better, or might not. If
it doesn't work, we get rid of it. The next one might
work well. So evolution has to come through steps. But in regards
to AI, we have to be the governing force, because there's really no natural

(41:23):
place in the universe for artificial intelligence. So we can't rely on nature,
on natural predators. There are no predators, there's no, you know, demand for
resources. Food is always going to be the driving factor anyway, and it doesn't
need food, it just needs power. So it's never going to

(41:45):
have that need to battle its way through the world as a, as
a fucking organic thing. So evolution is going to solely come from
whoever creates it and whoever does the quality control on those prototypes as they come
through. Yeah, that, and whoever's designing it has to understand survivorship bias

(42:08):
and be able to account for that. Yeah. Do you know what survivorship
bias is? I'm pretty sure I have a sense of it. I know it's a
very, like, random thing to say out there. So, for an example: back
in World War Two, planes would come back shot up full of bullet holes,
and at first the engineers and the powers that be were like,

(42:31):
well, we need to reinforce these areas because they're full of bullet holes,
right? Thinking that it was going to make more planes come back, and
it didn't. It didn't change anything. And somebody came up with the term
survivorship bias for this, and what it is, is they were looking at it
wrong. So the planes that came back with bullet holes were fine. The

(42:53):
ones that didn't come back, they needed to figure out where those were shot up
and reinforce those locations, and that's what caused more planes to come back.
Does that make sense? Yes. Yeah, so you don't focus on
the ones that come back, you focus on the ones that don't come back.
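That World War Two story can be sketched in a few lines. The hole counts per section below are made up for illustration, but the logic is the classic insight usually credited to Abraham Wald: armor where the survivors show the fewest holes, because planes hit there didn't make it home.

```python
# Survivorship-bias sketch: bullet-hole counts per aircraft section,
# tallied only from planes that made it back (counts are invented).
holes_on_survivors = {
    "fuselage": 112,
    "wings": 93,
    "engines": 8,   # few holes here: hits here tended to be fatal
    "cockpit": 11,
}

# Naive reading: armor where the survivors are most shot up.
naive = max(holes_on_survivors, key=holes_on_survivors.get)

# Corrected reading: the missing holes mark the fatal spots, so armor
# the sections where returning planes show the FEWEST holes.
corrected = min(holes_on_survivors, key=holes_on_survivors.get)

print(naive)      # fuselage
print(corrected)  # engines
```

Same data, opposite conclusion; the whole trick is remembering which planes you never got to count.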
Yeah. So in order to have AI evolve, you kind of have
to watch that, as far as, like, what's going to make it survive? You

(43:16):
know? Yeah, and then you go and you build A-10s that just
come back regardless. I mean, barring that situation, an A-10 Warthog is
just a cannon with a plane wrapped around it. Yeah, that shit. I
still love reading those stories about the A-10s that literally, like, got back with,
like, one wing. Yeah, it's, like, completely blown to shit, and it lands,

(43:40):
and they're like, okay, wow. The thing is, its cannon is
so powerful it actually slows the plane down when it's shooting. Yeah, it
did? That's awesome. I think we should wrap this up, guys. We're,
like, about two hours in here. Nice. Yeah, that's episodes for

(44:02):
days; makes a nice little split-up. Yeah, Tyler's arm disappeared. Well,
you guys out there, take careof each other. Keep an eye
on those cell phone towers. Maybethey should design an AI to look into

(44:27):
that, right? And we'll see you on the next one. See
you guys. Oh man.