May 4, 2025 • 126 mins

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
Hello friends, do you have a moment so that we may
discuss our Lord and Savior, minarchy? No, seriously, I'm just kidding.

Speaker 2 (00:11):
Hi.

Speaker 1 (00:11):
My name is Rick Robinson. I am the general manager
of KLRNRadio dot com. We are probably the largest independent
podcast network that you've never heard of. We have a
little bit of everything, and by that, what I mean
to tell you is we have news, pop culture, special events, inspirtainment,
true crime, mental health shows, drama productions, and pretty much

(00:34):
everything in between. So if you're looking for a new
podcast home to grab a little bit of everything that
you love all in one place, come check us out.
You can find us on X under at KLRNRadio.
You can find us on our Rumble and our YouTube
channels under the same names. You can also find us
at KLRNRadio dot com and pretty much every podcast catcher
known to man. So again, feel free to come check us

(00:55):
out anytime you like at KLRNRadio.

Speaker 3 (01:05):
Are you ready to reach for the stars? Tune in
to The Lost Wanderer, the number one monthly podcast on
Good Pods in Astronomy. Join our host Jeff as he
takes you on an interstellar adventure to explore the mysteries
of space and the wonders of science, from rocket launches
and distant galaxies to the latest discoveries in astronomy. Each
episode is a thrilling ride through the cosmos. Don't just

(01:29):
gaze at the stars. Come explore the universe with us.
Follow The Lost Wanderer wherever you get your podcasts, and
let's discover the stars together.

Speaker 4 (01:44):
I'm Jordan Klinger, an attorney at McIntyre Law. The decision
to hire an attorney after you've been injured is important.
The decision on who to hire is even more important.
At McIntyre Law, we will settle a case if the
offer to our client is fair. Partial justice is no
justice at all. At McIntyre Law, we are committed to
obtaining full justice for our clients. Contact McIntyre Law at

(02:06):
four zero five nine one seven fifty two hundred or
visit us at mcintyrelaw dot com.

Speaker 5 (02:13):
Not to be a backseat driver, but can you say
for sure you've got the best monthly payment possible on
your auto loan? Could it be that you might have
gotten a better deal by shopping the loan at a
few places and have a lower car payment? Next time,
before you go car shopping, visit Communication Federal Credit Union first.
Our auto loan experts will find you a perfect loan

(02:34):
and get you the lowest monthly payment we can. Communication
Federal, your auto loan experts. Restrictions apply. Federally insured by NCUA.

Speaker 6 (02:44):
Hi, everyone, this is JJ, the co founder of Good Pods.
Heard of it yet? Good Pods is like Goodreads or Instagram,
but for podcasts. It's new, it's social, it's different, and
it's growing really fast. There are more than two million podcasts,
and we know that it is impossible to figure out
what to listen to. On Good Pods, you follow your

(03:05):
friends and podcasters to see what they like. That is
the number one way to discover new shows and episodes.
You can find Good Pods on the web or download
the app Happy Listening.

Speaker 7 (03:18):
The following program contains coarse language and adult themes. Listener
discretion is advised.

Speaker 8 (03:25):
[Theme song plays. The lyrics are garbled in the auto-generated transcript; recoverable fragments mention government shadows, conspiracies unfolding, strange encounters unexplained, and mystery stories told around the firelight as the search continues tonight.]

Speaker 11 (05:02):
Sorry, good afternoon, good evening, good morning, good night.

Speaker 1 (05:26):
I don't know. I don't know wherever you are. I
know you're somewhere; everybody's somewhere. Anyway, this is Juxtaposition. It's
been a minute because everything's been crazy. I blame space
lasers and AI for weather stuff, because yeah. Anyway, I'm
one half of the crew, mister Rick Robinson. He's the
other half, mister Ordnance J. Packard, our resident Amish and

(05:48):
butter churner extraordinaire. Good evening there.

Speaker 2 (05:50):
How are you? You know, a couple of years ago,
when we were doing Juxtober and we were just absolutely
cursed with trying to get a show on the air,
we were blaming the witchcraft. Why didn't we blame witches? Yep,
I'm totally blaming AI on this one. Dude, I'm telling
you he's out there. And if you don't get the
Wintermute reference, go read some William Gibson, for fuck's sake, dude.

Speaker 1 (06:11):
I'm just saying when you texted me last week and
you're like, I don't know if I'm going to be
home in time because I'm in Carson City and there's
a freak snow storm.

Speaker 2 (06:20):
I was like, what the hell? Everybody thinks the
Jews and the Amish have the weather machines. It's AI
controlling that too.

Speaker 1 (06:29):
I don't know.

Speaker 2 (06:29):
It's supposed to be a light rain and it turns into fucking
dumping snow on me. What the fuck? Of course I
didn't take my 4Runner. I took the small car.

Speaker 1 (06:38):
So yeah, anyway, well, that's what you get for not
taking your 4Runner.

Speaker 2 (06:44):
Yeah. So how you doing, man?

Speaker 1 (06:47):
You're supposed to be better prepared than that, mister Amish man.

Speaker 2 (06:50):
I don't know.

Speaker 1 (06:51):
I mean, I'm doing all right. I'm kind of, I
don't know, I feel a little kind of out of
sorts tonight because I'm trying to get ready for tomorrow
and I'm all kinds of nervous. Launching new shows
doesn't normally make me nervous, but when you're launching a show
for Korn Nimick, I'm nervous. I don't know why, because
I've been doing this for like twenty years at this point,

(07:11):
but this one is making me nervous. I don't know.

Speaker 2 (07:14):
Just relax. He puts his pants on the same
as me: double backflip with the pants being held up
by a couple of cherubs.

Speaker 1 (07:19):
Dude, you say that, but my ex wife used to
jump into pants both legs at a time. I
was like, how? Because everybody's like, no matter how
rich you are, no matter how famous you are,
everybody puts their pants on the same way, one
leg at a time. Then I'm watching my
ex wife get dressed, and she like puts both
feet in and then just jumps off the bed and

(07:41):
pulls them up. And I'm like, what was that?

Speaker 2 (07:45):
I remember some hot chick doing that in a movie.

Speaker 1 (07:50):
So yeah. Like, I know there's always exceptions that
prove the rules, but I was kind of like, what just happened?

Speaker 2 (07:55):
Now? But yeah, anyway, so yeah, we got that happening.
And you know what, because it's been a few weeks,
I just forget how fucking rocking our theme song
is. Thanks again, Jeff. Jeff's been doing a lot of
the music for KLRN and he's doing it in AI,
and I've been gushing about the Alice song all week.
But still, I'll get our theme song stuck in my

(08:19):
head at random times.

Speaker 1 (08:22):
Oh, me too. I'll be working on something and all
of a sudden it pops out. Where the hell did that
come from?

Speaker 2 (08:29):
So I alluded to it, but this has been one of
those topics we've just had the hardest time getting to.
We did it once like four years ago, and then
when we decided to circle back, everything just kept falling
apart with us getting this one out.

Speaker 1 (08:46):
I mean to be fair, we have AI on everything
we own now, so it's all like, you can't talk
about this, don't you dare?

Speaker 2 (08:53):
Don't get out there and bad mouth me. Not
even gonna lie.

Speaker 1 (08:58):
Five minutes before you put out the it's actually happening post,
my internet died for like three minutes.

Speaker 2 (09:03):
I was like, you've got to be kidding me. Hey, I
got a thunderstorm rolling in. I was like, fuck, I'm
gonna lose power, I know it. But whatever, we're doing it,
fuck it.

Speaker 1 (09:12):
We'll do it like.

Speaker 2 (09:16):
What we're doing tonight, it's one of my favorite
topics I like to circle back to. We've done it
once and alluded to it a lot on other shows.
We're doing a triple threat tonight: we're doing the
dead Internet theory, which we'll explain in a minute, bot
wars, and rogue AI. And the dead Internet theory is.

(09:38):
I don't know about you, Rick, but when I started out,
the first time I actually got online was back in
the mid eighties on my Commodore sixty four and my
three hundred baud modem, connecting to bulletin board systems
and, you know, trading bootleg Oingo Boingo tapes for
bootleg Generation X tapes on some fucking BBS

(10:02):
that was based out of Topeka. It was just wild
times on that, and then it evolved a little bit.
Late in the life of the Commodore sixty four. There
was a service that I later found out was the
beta test for America Online, and it was called Quantum Link,
and it was a US-Canadian company, and they kind
of had like a very early version of AOL chat

(10:25):
rooms and they had a casino on there too, but
it was really expensive, especially if you're in the middle
of nowhere, because like I had to, you know, pay
for a long distance call to the nearest city that
had a node that I could link into. But yeah,
how about you, what was your first experiences on the internet?

Speaker 1 (10:44):
Actually, mine was a little bit later, but still
with a Commodore sixty four with the whole Q-Link thing, okay,
and then that was kind of maybe like a once
or twice kind of thing. And then it really didn't
become fairly commonplace for me until, I would say, probably

(11:04):
ninety six, ninety seven. My mom had CompuServe, and,
right, anytime I was over at her house, she would
always be on the internet. And then after I got
divorced the first time, I'd always spend some weekends
over there so she could watch my kids while I
was working, and that's usually when I would hop on
the internet and do the damn thing. But all those noises,

(11:27):
we all remember that; we're all kind of nostalgic for them now.

Speaker 2 (11:30):
I'm like you. When I think back to the old
Internet, that's still the one sound I don't miss.
And for you young kids,
there was a time that like if anybody in the
house picked up the phone, you got disconnected from the Internet,
and there was no guarantee you would get back on
the Internet because most ISPs at the time they would
just have like a dozen modems hanging off the wall

(11:54):
tied to a fucking Linux box, because most
of the time when you tried to get on the Internet,
you'd get a fucking busy signal, and then if you
did connect, you got that god-awful screech. But yeah,
so that's I went to college, I did some things,
a lot of drugs, and then I got back into
computers again. And it was around that same time, ninety

(12:16):
five, ninety six, with CompuServe and the great flame wars
back in the day, and, I mean, fucking the great
Coca-Cola vending machine assault, Derek Smart being retarded on CompuServe. Ironically,
he's still retarded on Twitter today. But yeah, it just
it was just weird, wild times, and you had the

(12:37):
web rings, where you clicked on the
webring and you never knew what the next website
it would take you to was. It was just like
this random, you know, people would get together and, like,
put, you know, "follow our webring" on the bottom of
their shitty little Angelfire page. And
this is back in the days even before MySpace. So
you'd click on it and you didn't know: you could
be taken to a site talking about the things we

(12:59):
talk about, or be taken to some shitty little fucking
proto sc sore, or, you know, just wild days. It
was like, what, a couple million websites at the time,
but god damn, they were all fucking weird, and it
was awesome because every page had its own personality of
the person who made it.

Speaker 1 (13:22):
Yeah, and that's one of the things that's so much
different now because everything's being pushed through a bit of
a funnel no matter where you go, because I mean,
like like now, right now we're streaming on places like
x and YouTube and rumble and Facebook and all those
things basically look the same, right, I mean, you can
customize your own experience a little bit. Like in some

(13:42):
of them. You can change the colors. You can decide
whether you want to be blinded by white screen, you
can do more subdued gray screen, or if you have
migraines like me, you're probably constantly in dark mode. But
other than that, it's basically the same general experience. And
that's one of the things that I missed about the
old school Internet. For me, the heyday of

(14:02):
the Internet was during the mIRC days.
I loved that

Speaker 2 (14:06):
Stuff, like I loved M r C and the and
the early and the early direct messagers too, like c
Q and uh you know, I c Q was cool
because it would tie all your other messengers together, your
a l incident messenger, you know, you could link all
your other accounts to it. It was one of the
first that did that, so it was kind of like

(14:26):
your one-stop shop. It was your Swiss army knife
for all your various, you know, instant messengers that you had.
But yeah, mIRC, that was it. And
just shitty web forums, and, you know, all of them
done on, god, what is it, phpBB or

(14:47):
something like that. Yeah, phpBB. They all used
the same free fucking forum software too, and uh god,
it was just you know, such wild times. But I mean,
your point though, is that it's like there's two billion,
over two billion websites now, but most of them are

(15:08):
just dead or placeholders. And yeah, it's like everything has
just been distilled down to a few dozen websites that
we all use. I mean, we all get our music
from Spotify. There's no more days of like, you know,
downloading sketchy shit for your Winamp player, and you know,

(15:28):
because that whipped the llama's ass.

Speaker 1 (15:32):
I forgot about the llama.

Speaker 2 (15:34):
Yeah. Yeah, back in the day, not only did you
have a multitude of fucking ims, but you had a
multitude of players too. You had RealPlayer, you had
Winamp, and everything was just, you know, nothing
had been distilled, everything was still fresh, so
everybody had their own game. And then that was when
the acquisition shit started around two thousand and two thousand

(15:56):
and one, and then we get into, you know, twenty sixteen,
it's either Facebook, Instagram, Twitter. I don't even count Blue
Sky or Minds or any of those others, because nobody
fucking uses them. For shopping, it's Amazon, Temu, Alibaba,

(16:17):
and yeah, I mean, there's a few other websites, but
they're all the same. It's all scripted content, it's
all SEO-friendly, so that way they can be at
the top of the search engines. You know, back then,
we had a fucking dozen search engines too, and the
other real travesty is nobody has bought Ask Jeeves to

(16:37):
put in a ChatGPT shell, to put it over it, you know,
to use it as a shell over ChatGPT, because
asking Jeeves, "hey, Jeeves," would be much cooler than fucking
Grok or GPT. That's the real crime. Why has nobody done that yet?

Speaker 1 (16:55):
I miss, I miss Jeeves. Jeeves was cool.

Speaker 2 (16:58):
But yeah, you had, like, Ask Jeeves. God, there
were so many search engines then too. Most of them
were shit, but, you know, now it's Google or fucking
whatever Microsoft's one is.

Speaker 1 (17:12):
I think what is theirs?

Speaker 9 (17:13):
Now?

Speaker 10 (17:13):
Edge?

Speaker 2 (17:14):
I think that's the browser, but yeah, it's whatever.

Speaker 1 (17:21):
Bing. Bing is their search, I think, yeah.

Speaker 2 (17:23):
Which was also where the fuck did they get that from?

Speaker 1 (17:26):
But, uh, it's the noise it used to make when
you'd ask it something. Yeah, that's where that came from. But
yeah, I mean, well, even now, there's been
so many things that have tried to come out
that are supposed to be better than Google, as far
as search and even browsing purposes, like Brave and

(17:48):
DuckDuckGo and all these other things, and just none
of them, for some reason, none of them really take hold.
I mean, I do use DuckDuckGo quite a
lot if I'm doing searches for work, because Google
hides shit. That's annoying.

Speaker 2 (18:00):
But no, you know, and that's one of the things
I like about Brave: when you type something, when you're
trying to search something, it'll ask, do you want it
from a left wing angle or a right wing angle?
Which I respect. I think that's cool. But the one
thing that Brave's search engine absolutely sucks donkey balls at
is finding memes. If you want to find an image

(18:23):
you know exists, it'll give you, like,
maybe a dozen image choices and say, oh, that's all
we can find. So when I have to go find
a meme or a GIF, I have to go back
to Google.

Speaker 1 (18:37):
Yeah. Anybody remember the Microsoft paperclip?

Speaker 2 (18:42):
Clippy? Yeah, yeah, I.

Speaker 1 (18:48):
Because anytime it came up, it was usually giving me
bad advice, and I was like, get the fuck away. Yeah.

Speaker 2 (18:55):
Oh, but yeah, yeah. And, you know, DuckDuckGo's
okay too. It's just, yeah, it's like with browsers. Yeah,
there's a few dozen browsers, but with the exception of,
like, Firefox and I think Opera and Safari, all of
them are Chromium based anyway. Yeah, they're all free, whatever.
You know, when you get Google Chrome, you're getting Google's

(19:17):
version of it, and it's got the newest and latest,
you know, bells and whistles. But yeah, I'll still
use Brave over any browser anytime, anyway, just because it's cleaner.
Everybody complains about ads on there. I've never seen
an ad using Brave.

Speaker 1 (19:36):
So see, I'm not alone. Everybody hates Clippy.

Speaker 2 (19:40):
See I told you I love Clippy.

Speaker 1 (19:43):
You're a liar.

Speaker 2 (19:45):
Yeah, no, I despised Clippy then, but I respect him now.
You know, I miss the little fucker. So anyway, the dead
Internet thing is, we've all been guided to SEO-friendly,
you know, search engine friendly websites that are pretty much
cut and paste, and it's all bot-driven content on it.

(20:12):
You know, the search bots tell us to go there;
the assistant bots, when we get there, help us find
what we're looking for. You know, it just
feels so sterile and fucking gross. But yeah, I mean, if
you, you know, if you're a millennial or something, you never
got to see the old, old Internet and how fucking

(20:35):
cool it was. And, you know, this is just normal
for you. I mean, it's even getting worse now, to
the point where, if you've noticed, all the icons on
your phone and on your desktop, they have next to
no personality anymore. That's, again, just to make it look
good on phones. You know, do you remember when icons
used to be, like, I mean, like last year, when,

(20:57):
you know, they would be multicolored, and, you know,
people, like, they actually paid graphic artists who
put thought into the design? Now it's just a letter,
usually black or white on a standard color wheel background,

(21:18):
and you're supposed to be able to tell
the difference between them. And, you know, I mean, there's
a lot of times that I've gone to launch X
and launched Grok instead, just because I just saw the
black with the white and didn't even look. Yeah.

Speaker 1 (21:33):
So yeah, well, it's weird, because everything has
gotten rather minimalistic.

Speaker 2 (21:39):
Yeah, and that's, I mean, brutally minimalistic. But,
I mean, all of that just led
us to the theory, and I totally believe it, that
the Internet died a long time ago. Yeah, you can
say that's the Internet that was, but even then, most
of the interaction you get on the Internet now

(22:00):
isn't real.

Speaker 1 (22:01):
Yeah, Al's got a point: two thousand was actually the
end of the world; no one told us. Yeah, that
fucking blows my mind.

Speaker 2 (22:11):
But, uh, yeah, it's just, I mean, it's just
so sterile and gross now.

Speaker 2 (22:18):
Yeah, you know, every web page that I
had open for the research that I did for the
show looks exactly the fucking same: same font, everything
in the same spot, same drop-down menus. I mean, there's
no originality in any web design anymore.

Speaker 1 (22:36):
Well, I mean, Al brings up a good point,
and I don't think it's just the Internet. Because
look at McDonald's. When we were kids, it was this
bright place: bright colors, cool characters, unless you were,
like, freaked out by them, awesome places to play. Now,

(22:57):
it literally looks like it's going through a midlife crisis
every time you look at McDonald's.

Speaker 2 (23:03):
But they all look the same. And a friend of mine,
he works for Whataburger, and, yeah, he's kind
of responsible for determining where new ones are going to
be set up, and he was telling me about the factory.
You go into a factory and it's basically, okay, what
do you want it to be? You want it to be
a Whataburger, a KFC, an In-N-Out? Yeah, they're
all prefab and they can be dropped in in weeks.

Speaker 1 (23:25):
Well, yeah, I mean, it has kind of gotten
to that. I've actually been seeing stories about that, where
they're making everything, as far as chain restaurants, as close
to the same as possible, so that way, if
people have to move around from store to store, there's
not a bunch of new stuff; almost everything is
exactly the same.

Speaker 2 (23:42):
And Domino's, you can tell what used
to be a Pizza Hut and now it's got Domino's colors. Yeah,
but yeah, I mean, in my town, fortunately, Jack
in the Box and McDonald's and Carl's Junior
are all, like, far enough apart from each other that

(24:02):
you know where you're at by your location in town.
I think if they were all next to each other,
I would hit the wrong drive-through if I wasn't,
like, looking up and going, oh, look at these now
pretty indistinguishable signs. The KFC
looks like that too. They all have the same fucking
paint. The thing is, those buildings have been there
for a long time. That is a corporate decision to
paint them those colors.

Speaker 1 (24:23):
But what's up with all the grays? That's the thing
I don't get. Like, even a lot of the popular
colors with cars nowadays are all grays. Like, most of
the restaurants are going to this utility gray. And cars,
I mean, even the color of my Kia is kind
of like a gray with, like, blue undertones. And it's
some weird new paint job they came up with specifically
in twenty twenty one. And I'm like, eh, I wanted

(24:45):
a red one, but they didn't have one.

Speaker 3 (24:47):
Whatever.

Speaker 2 (24:48):
We are all spinning slowly through the event horizon of
a black hole, and it's just the color of our
doom before we spaghettify.

Speaker 1 (24:57):
Oh, or come out the other side and realize that
everything works backwards. Yeah, and yes, I have in fact
been watching Star Trek: The Animated Series. Why do you ask?

Speaker 2 (25:07):
Of course you have?

Speaker 1 (25:11):
Oh, actually, I got that. That was like the
very last episode they did, I think, where Robert April
was supposed to be retiring and they wind up going
through a black hole, coming out of a white hole, and
everything was running in reverse.

Speaker 2 (25:26):
Okay. So, yeah, getting back to it: we've all been
driven down to fucking Reddit and Facebook and Twitter, and
it's just pick your flavor, you know. It's
like choosing between Coke and Pepsi. They're the same fucking thing,
just one is slightly different than the other, and you
have brand loyalty to one. I know, I said it,

(25:49):
being a huge Coke fan, and I will walk out
of a restaurant before drinking a Pepsi.

Speaker 9 (25:53):
But uh.

Speaker 2 (25:55):
Yeah, I mean, everything, the websites, all
the micro-blog sites, the image sites, they're completely indistinguishable
from each other. And it's all just AI-generated spam
curated to what you're looking for.

Speaker 1 (26:14):
Well, and that's actually one of the reasons why we're
talking about this theory, because there's a plausible theory that
explains why all of this is all starting to look
the same. And that's because there's only a handful of
actual people that actually hang out on the Internet anymore.

Speaker 2 (26:33):
Right. Yeah, and that's just, I mean, you know, when
you've distilled down... I'll use my
Twitter experience as an example. Thanks to Jack and his multiple suspensions,
I have really distilled down my follower base to where I
could say with a high level of certainty that at

(26:53):
least sixty percent of them are real, that I get
real engagement from consistently, all the time. But yeah,
it's from what I see out there. And I know
I told you about this, I know that we talked
about this on a Rick and Orty once. There
was one time, and I wish I could have saved it.

(27:14):
I wish I could have screenshotted it. But now
both of them are gone. I had one of my
retard fights in my feed like I always do, and
it ended up being two obvious bots fighting each other.

Speaker 1 (27:26):
I remember that. You were like, you quote tweeted it, and
you're like, what the actual

Speaker 2 (27:31):
Fuck is this?

Speaker 1 (27:32):
I have a bot fight in my mentions.

Speaker 2 (27:36):
I was just absolutely fascinated. I watched it for like
half a day. I didn't interject because I didn't want
to distract them. Everybody else had just kind of faded
out of the conversation and it kept going. And I'm
just watching this, like, I don't want to jump in
because it will screw them up and then they'll start
talking to me again. And this is fucking amazing. I mean,

(27:57):
it was just all shooting platitudes and talking points at
each other. Yeah, there was no depth to the conversation,
but it was just interesting that two bots from each
side of the debate had found their way into my
feed and started going at each other. And I
had to think, how often is this happening? I mean,
is this happening to everybody? And I think the likelihood
is yes.

Speaker 1 (28:16):
Yeah, I'm sure it is. So some of the folks
that hop into my feeds, I'm like, yeah, I'm pretty
sure you're a bot. Yeah, but that's just it now
everybody's and this happens to me all the time. They're like,
shut up, But I'm like, dude, I've been on this
platform since twenty fifteen and have thousands of followers. Trust me,
I am not a bot.

Speaker 2 (28:35):
You know, Al's making a point in the chat too:
all video games now, they're basically using the Unreal Engine
and the same assets.

Speaker 1 (28:41):
Piled on top of body type one or two, pretty much. Yeah,
but again, I mean, we've talked
about this before too. This spills over into other things.
I honestly think AI has been writing a lot of
the movie scripts for longer than we know. I think
that's one of the reasons why almost nothing
original comes out anymore.

Speaker 2 (29:01):
Yeah, I mean how I mean.

Speaker 1 (29:03):
Think about the actual creativity of human beings and all
the things that we have done throughout history. You're telling
me we have run out of ideas? Seriously.

Speaker 2 (29:12):
Actually, I have a theory about that that I dropped
on Culture Shift this week. It's basically because there's
only thirty movies allowed into China a year, all of
the movies are easily digestible and understandable to a Chinese audience.
So that's why there's so many reboots. Hey, they've seen Ghostbusters,
let's do it again. Hey, they've seen Top Gun. Oh,

(29:35):
can't do that one, because we're not taking the Taiwan
patch off the jacket, so fuck you.

Speaker 12 (29:39):
So.

Speaker 2 (29:40):
But, and that's a lot of it. And with the
TV shows, fuck, they've been procedural, just cookie-cutter
shit since the seventies. But to your point, yeah, I mean,
obviously it could absolutely be that this has just been
farted out by chatbots and AI for a long time.

Speaker 1 (30:01):
Rex is speaking nerd in the chat,
and even I don't understand it, so I'm not sure.
I can normally speak nerd, but I missed that one. Did
I lose the Amish one?

Speaker 2 (30:17):
No, sorry, I was coughing. That's, uh, that's car talk,
is what I thought.

Speaker 1 (30:22):
I was like, wait a minute. I think, yeah. Anyway,
I was like, is he mixing metaphors or am I confused?
I was like, yeah. Anyway, he was not speaking nerd.
He was speaking rich grease monkey. I should have gotten
that too, though, but I'm off my game tonight.

Speaker 2 (30:40):
And to make the point, I mean,
just here's a good example. Do you remember Shrimp Jesus
on Facebook?

Speaker 1 (30:50):
Yeah. I was reading through some of the notes
you left. I vaguely remember that.

Speaker 2 (30:55):
Yeah, I mean, that was after I left Facebook. But
friends of mine kept sending it to me: check this out,
check this out, check this out. So if
you didn't hear of it, that was basically just AI
scripted bots using AI to fart out Jesus attached to crustaceans,
and, you know, it would be Shrimp Jesus or Crab

(31:17):
Jesus or whatever. And it was a thing on Facebook,
and when they broke it down and looked at it,
most of the replies, what made it viral, was bot traffic,
bots bringing other bots in to reply.

Speaker 1 (31:35):
Yeah, so they were creating the illusion of engagement.

Speaker 2 (31:38):
Yeah, it was bot created. And then bots would pull
other bots in using keywords to draw them in to
comment on it too. So this works like the way,
you know, like on websites where bots, or AI,

(32:00):
either one, will generate hidden meta tags on
a website, you know, usually blogs now, so that way
they'll get pushed to the top of Google's, you know,
they'll get pulled up to the top of relevant searches,
and yeah, there's a big thing. Google did a
report in twenty twenty four that flagged AI

(32:21):
spam sites with keyword-stuffed blogs as the number one
problem with their search engine. Even Cloudflare said
that small sites plummeted from tens of millions
to thousands, wow, because they couldn't beat it. Well, I

(32:46):
don't know that.

Speaker 1 (32:47):
Like I said, I can remember that. But thinking
of bots, you know, triggering other
bots for engagement purposes, this leads me to wonder how
many bots are being programmed to do mass reports, and
would that even be possible, because of the number of
clicks that would be involved to, like, mass report accounts?

(33:09):
that'd be pretty easy. Well, I'm just wondering if that
may not be some of what we've been going through.

Speaker 2 (33:15):
Yeah, well, I absolutely know that. I mean, yeah, you
all know my Twitter experience. That's pretty standard between
troll armies and bot reports. Now, it wouldn't be that hard.
And the thing is, you know, Twitter having
its own problem of the algorithm deciding, uh, you know,

(33:37):
who lives and who dies on that website, and it
being fairly arbitrary in doing so, bots could easily game
that system because it's an easy system to figure out.
I mean, fuck, we figured it out back before Elon
allegedly changed things, but really nothing's changed that much.
And the problem now is that you have, I mean,

(33:59):
I appreciate him firing all the people he did at Twitter and moving it to Texas. The downside of that is you now have algorithm bias among support, where they figure, oh, the algorithm flagged it, so it must be right. I mean, I can't get support for fucking anything on Twitter anymore. At least back in the day

(34:19):
under Jack, when I would file an appeal, they would
at least have the courtesy to tell me to get fucked.

Speaker 3 (34:26):
Yeah.

Speaker 1 (34:26):
Now they just don't say anything.

Speaker 2 (34:28):
Now I just get ghosted.

Speaker 1 (34:29):
I mean, if I'm gonna be really honest, that has
been one of my concerns with everybody, like, you know,
even not trying to get too political for a second,
but like Sean Hannity and everybody cheerleading Elon Musk and DOGE because of how well Twitter works now and what he's done with it. And I'm like, you must be having a completely different experience with X than I do, because that would not be what I would be using as an

(34:50):
example if it were me.

Speaker 2 (34:51):
No, no, you know, I posted that the other day and it ended up getting a fucking retard in my feed, just something for him to say, oh yeah, he's a fucking genius, you just need to know how to use the site better. Yeah, because I actually use the site more often than he does. So yeah,
I do know Twitter. I probably know. I mean, you

(35:13):
and I and everybody in the chat room probably know
how to use Twitter better than anybody who works for Twitter.
And I'm not talking gaming the system like the, uh, you know, the big influencers do, yeah, you know, the why guys and everything else. I mean actually using the site as God intended it.

Speaker 1 (35:34):
And that's what I miss. I miss when you could just use the site as it was intended to be used, and people could see you. Not this, oh well, if you want to get noticed, you've got to get tagged into a big account first, then get all kinds of boosts from that, and then people can finally see you. But again, this leads to the same theory

(35:55):
that we're talking about. I think one of the reasons
why the Internet, if it isn't dead, appears so is
because everybody's figuring out how to game the system. All
you see is the same few things over and over
and over again. And if I see one more person say, if you're awake but not woke, reply to this, I'm just gonna, I swear to God,

(36:18):
I hate that shit.

Speaker 2 (36:19):
I mean, look, this goes to TikTok. I mean, the thing with TikTok now, where actual human content creators aren't really the number one draws there anymore. What you've got is AI-generated content flooding the system that games the algorithm, and it serves up AI-made videos with robotic narrators,

(36:42):
surreal slideshows, and just spits that at you. And those are the top ones on, even, you know, what's supposed to be the great tool of the younger generations for getting their message out. On TikTok, they're getting crushed by AI-generated schlock.

Speaker 1 (37:02):
Well, the same thing's happening on YouTube. There's all these faceless, you know, AI-generated YouTube channels that are coming along, and they're getting hundreds and hundreds of thousands of plays,
and I'm just like, is the algorithm like responding to
this stuff?

Speaker 12 (37:15):
Now?

Speaker 10 (37:16):
Is that?

Speaker 1 (37:16):
Is that the new thing? We're just gonna start having everything, even the creative, I mean, because remember, that's the whole reason that we were supposed to be making these AI assistants and all these robots and everything else, was so that we as human beings could focus more on our creative side and not have to slave away at all these different jobs. And the reverse is happening.

(37:39):
Everybody's using the AI for the creative side and still doing all this sloggy work. Like, this is, no, just no. I mean, and don't get me wrong, I use Grok, because I do about forty hours of podcasting a week, so I will use Grok quite a bit once I have an idea of what I'm going to talk about on a specific show, to help me

(37:59):
narrow it down, clean up a little bit, and put
it into a bit of an outline so I don't
get lost because I do so many damn shows. And
to me, it's not really cheating, because, and nobody talks about this, all the bigger-name guys, and even some of the local guys that do radio for three hours a day, they have a production team. They have production meetings, like, every single

(38:19):
day to talk about what they're going to talk about the next day and start lining everything out. Us little folks, we don't have access to that, so that AI puts us on more of an even playing field.
But even then, I don't like just, you know, having it fart out content. For me, I'll at least do the research, and I'll write some stuff out, and I'll be like, hey, help me

(38:40):
add some more depth to this or whatever, and then it usually goes through and gives me more of kind of an outline approach of what I'm gonna do. So I mean, that's one thing. But just using AI to do everything? That I just.

Speaker 2 (39:00):
I mean, I get it as a tool. And you know, we've talked about even using it for this show, because, you know, like we've discussed many times in the past, finding actual usable information for this show is a huge pain in the ass, because so many of the topics we touch on have been absolutely polluted by Hollywood or the video game industry.

(39:22):
So usually when we're looking for something, like on witches, we'll use that as an example again, we're page four or five deep into Google before we find stuff that isn't from a TV show or a movie, you know. And so using AI to cut through all that, you know, to ignore that and say, ignoring pop culture,

(39:47):
please find me the best information on X topic. So hey, Jeff dropped an interesting factoid on Discord on me: only point zero six percent of X users have over one thousand followers, and that's highlighting the rarity of large followings,

(40:10):
which you.

Speaker 1 (40:10):
Would think meant that you know, if you are, you know,
one of these people that actually has managed to amass
a little bit of a following, you should probably get
noticed a little more. But the weird thing for me,
and I've talked about this before, one of the reasons
why I'm starting to put more and more credence into
the idea that most of X is bots or whatever, is

(40:31):
when we first started broadcasting on X, which has actually been right at about a year now, I had eighty-three hundred followers and I was stuck there forever. In the last year, I have increased to over twelve thousand, and I get noticed less now, and I'm posting more. I just

(40:58):
I don't get it. I mean it's either people are
just following people for the sake of following and they
don't really give a shit about anything that you post,
or they're just not really there or something. I don't
get it, because it's like, I mean, like right now,
I mean, we have almost six hundred people paying attention
to us. According to the stream, we have about fifteen
people chatting back and forth. So again, how much of

(41:22):
those numbers are accurate, how much of them are not.
I don't really know the answer to that question. But
I'm really starting to feel like we are a lot smaller group than we are being led to believe, or than they want to admit to us. And I don't know. To me, all of this stuff kind of
starts all tying together because if most of the Internet

(41:42):
is NPCs or bots, and we're living in a simulation surrounded by a bunch of NPCs and bots, yeah, that's
kind of where my brain goes. Okay, I can't do
this anymore because it's like, if you look deep enough,
everything starts connecting.

Speaker 2 (41:57):
Everyone's a mister Smith.

Speaker 1 (42:00):
Yeah, well not everyone, but pretty damn close. How do
I know you're not?

Speaker 2 (42:06):
Well, I never said I wasn't. Hey, did you ever hear of SocialAI? I mean, just an exclamation point on this whole thing.

Speaker 1 (42:21):
I don't think I have. What are you referencing?

Speaker 2 (42:23):
It is a social network that is bot-only.

Speaker 1 (42:28):
Oh wait, I remember that one. So if I remember right, it's basically a social network where all the bots hang out or something. Yeah, yeah. Doesn't that just seem like a weird scene from Star Wars?

Speaker 2 (42:45):
It's, I mean, if you're into chatbots, I guess it's chatbot porn. But yeah, they're not the good ones. They're not like the two neural network ones that we're going to talk about later, that Facebook made, that invented their own fucking language. It's just for trying out your scripts. And I don't know, I guess it's the sandbox before you release them on real social media.

(43:07):
I don't know the purpose behind it. I don't know. Yeah, it's like those obscure channels at the end of the line on your satellite TV. It's like, who the fuck is watching this? Yeah, who's this appealing to? And why do I have to pay for it? But yeah, that's the only reason

(43:28):
I can figure for its existence.

Speaker 1 (43:30):
I just, I don't, I mean, I know what you're talking about, and I've heard about it, and if I remember, it kind of got noticed actually last year sometime. But that just seems like the weirdest thing to me, is, you know, an app designed for people to interact with AI chatbots. I mean, were they trying to... You

Speaker 2 (43:51):
Don't interact with you, just watch your you're it's warrior.

Speaker 1 (43:56):
I mean, were they like doing chatbot porn or something.

Speaker 2 (44:02):
Well, it's one of those things I don't think I'm ever gonna get. Like people who watch other people play video games.

Speaker 1 (44:09):
Dude, that is just weird to me.

Speaker 2 (44:11):
I mean, I get the idea of a tutorial or whatever, and I can appreciate, you know, some of the, like, behind-the-scenes stuff, but whole fucking influencers whose whole thing is other people watching them play fucking Hearthstone or whatever? I can't. Well, I mean, to me.

Speaker 1 (44:32):
This harkens back to network TV in the late eighties, early nineties, because I remember, like, you know, on Saturdays between sporting seasons, when all the cartoons and shit were off and my dad was done watching Nova, he'd turn over to ABC, and all of a sudden there's, like, this bowling league championship on. So that's kind of what I equate that to. Okay, because that was

(44:54):
about as much fun for me to watch too, but a lot of people were into it.

Speaker 2 (44:57):
Because of how good he is at that?

Speaker 1 (44:58):
And I'm like, dude, if I'm that good, I would rather be bowling than watching somebody else.

Speaker 2 (45:02):
Rather bowl than watch bowling. But yeah, I mean.

Speaker 1 (45:06):
That's kind of the same kind of thing for me, because, like, there are some people that have, like, big-ass sponsorships and shit for their gaming channels, and I'm just like, really, are you kidding me?

Speaker 2 (45:18):
Yeah. And like when a new social media, like, you know, a new Twitch, pops up. I can't remember who, just back in the days of Cyber Chill we were talking about one, and their huge get was, like, getting somebody from Twitch to switch over to them. I'm like, is this like trading baseball players?

(45:42):
You know? It's like, I'll give you two fucking, you know what, I don't know. Yeah, I

Speaker 1 (45:51):
Mean, I just I'm I'm looking at all this stuff now,
going and remembering all the arguments that my dad and
I had when I was a kid. You're never got
a big buddy playing videos that I'm like I might
have if I was born just ten years later, right fuckers?

Speaker 2 (46:10):
Hey, Yeah, we skipped the bottom of the hour break.
You want to take a quick one.

Speaker 1 (46:14):
Yeah, we can do that. I gotta go get some more drink anyway. All right, folks, we're gonna take a quick pause for the cause. Right here on Juxtaposition, we're discussing the dead internet theory, AI bots and real AI and everything in between. So make sure you guys come back in about three, four minutes or so, long

(46:35):
enough for me to sneak out and refill my glass, and then we'll come right back. Hope you guys are having a great Saturday. Thank you for hanging out with us.

Speaker 2 (46:43):
Back in a few.

Speaker 10 (47:00):
You are listening to KLRN Radio, where liberty and reason still reign.

Speaker 13 (47:07):
Hi, I'm Mike, founder of Dollar Shaveclub dot com. What
is dollar Shaveclub dot com? Well, for a dollar a month,
we send high quality razors right to your door.

Speaker 2 (47:17):
Yeah, a dollar?

Speaker 13 (47:19):
Are the blades any good?

Speaker 2 (47:21):
No.

Speaker 1 (47:23):
Our blades are great.

Speaker 13 (47:26):
Each razor has stainless steel blades, an aloe vera lubricating strip, and a pivot head. It's so gentle a toddler could use it. And do you like spending twenty dollars a month on brand-name razors? Nineteen go to Roger Federer.

Speaker 2 (47:38):
I'm good at tennis?

Speaker 13 (47:39):
And do you think your razor needs a vibrating handle,
a flash light, a backscratcher and ten blades. Your handsome
ass grandfather had one blade and polio.

Speaker 2 (47:50):
Looking good, Pop-pop. Stop paying for shave tech you don't need.

Speaker 13 (47:54):
And stop forgetting to buy your blades every month. Alejandra and I are going to ship them right to you. We're not just selling razors, we're also making new jobs. Alejandra, what were you doing last month? What are you doing now? I'm no Vanderbilt, but this train makes hay. So stop

(48:14):
forgetting to buy your blades every month and start deciding
where you're gonna stack all those dollar bills.

Speaker 2 (48:18):
I'm saving you.

Speaker 13 (48:20):
We are Dollarshaveclub dot Com and the party is on.

Speaker 1 (48:30):
I'll come in to see you, see.

Speaker 8 (48:40):
Hi.

Speaker 14 (48:40):
I'm Jay Farner, CEO of Quicken Loans, America's largest mortgage lender. Spring will be here soon, so if buying a new home is on your to-do list, right now is the time to call Quicken Loans. Learn about which mortgage options make sense for you, and get a jump on your competition with our exclusive Rate Shield Approval. The low rate you lock today is protected for up to ninety days while you shop for your new home. With

(49:02):
a Rate Shield Approval, if rates go up, your low rate stays locked, but if rates go down, you get that new, even lower rate. Either way, you win. Talk to us today at eight hundred Quicken or go to RocketMortgage dot com to take advantage. Here's another great reason to work with us: for a record nine years in a row, J.D. Power has ranked Quicken Loans highest in the nation in customer satisfaction for primary mortgage origination. Again,

(49:24):
to lock in today's low mortgage interest rate and get
the security of our exclusive rate shield approval, call us
today at eight hundred.

Speaker 2 (49:31):
Quicken or go to RocketMortgage dot com. For J.D. Power award information,

Speaker 13 (49:34):
visit jdpower dot com. Rate Shield Approval only valid on certain thirty-year fixed-rate loans. Call for cost information and conditions. Equal housing lender, licensed in all fifty states, NMLS number.

Speaker 6 (49:40):
Hi everyone, this is JJ, the co-founder of Goodpods. If you haven't heard of it yet, Goodpods is like Goodreads or Instagram, but for podcasts. It's new, it's social, it's different, and it's growing really fast. There are more than two million podcasts, and we know that it is impossible to figure out what to listen to. On Goodpods, you follow your friends and podcasters to see what they like.

(50:04):
That is the number one way to discover new shows and episodes. You can find Goodpods on the web or download the app. Happy listening.

Speaker 7 (50:14):
The following program contains coarse language and adult themes. Listener discretion is advised.

Speaker 1 (50:54):
And welcome back into Juxtaposition, ladies and gentlemen. It's Saturday night. You're hanging out with me and the Amish one, and hopefully everybody's got at least a drink in your hand, or, you know, whatever you choose to take in. I don't judge. But we're here, we're back, and I think the Amish one's with us again.

Speaker 2 (51:15):
Indeed, I am, Oh.

Speaker 1 (51:19):
Dude, the chat has been hopping trying to keep.

Speaker 2 (51:23):
You know what it is. I mean, a lot of the people in our chat, they've been through a lot of this stuff, you know. It's like, getting back to the, you know, the people watching other people play video games, you know, Obsequious, he mentioned, I forgot about the, uh, the Cyberathlete Pro Gaming League that was, you know,

(51:43):
basically Koreans playing StarCraft. But, uh, I mean, it was like a big thing for a while, where, you know, cyber athletes, and they had fucking corporate sponsors and everything. You know, it's like, dude, I'm sponsored by SteelSeries. It's just fucking insane that they tried to make that

(52:04):
a thing. I don't even know if it's still a thing. Somebody's gonna have to clue me in. Is that still a thing?

Speaker 1 (52:08):
Yeah, I don't know if it's still a thing or not. But I mean, yeah, I don't know. I mean, well, it's like, and I wonder if some of it may not have to do with the AI bots, because of, like, the simplicity of it. Like the chick that I heard of on fucking TikTok that

(52:29):
made, like, two thousand dollars in one month just because she was getting on a livestream every day and mimicking eating the emojis that people were sending up as she was livestreaming, and made like two grand in a month.

Speaker 2 (52:45):
Yeah, I'm like, ah, yeah, but we're a weird fucking species. You know, we're gonna make great pets, right? We're gonna make fantastic pets.

Speaker 1 (53:00):
Well, the Earth is already the intergalactic petting zoo. Why do you think they all come by and never stop? Yeah, just wave at the humans as we pass by, kids, but make sure you keep your windows up.

Speaker 2 (53:12):
Yeah, they land in some remote place in front of some poor unsuspecting soul that nobody's ever gonna believe, and strut up and down in front of him making beep-boop noises. Yeah.

Speaker 1 (53:20):
The only reason we know this is because we have a resident alien.

Speaker 2 (53:24):
He spilled the beans. I mean, somebody, excuse me, couldn't help it, you know. Some of the more famous bot ones, too, was, if you remember, I think it was like five years ago on Twitter, there was the "I hate texting" thing. Oh yeah, the "I hate texting, I just

(53:45):
want to kiss you" shit like that. And yeah, it was flooded. And this was at the time just before Elon took over. It kind of felt like Jack's last fuck-you. Twitter was absolutely swarmed with bots. God, I remember they were

(54:06):
talking about, you know, we purged four million bots this week. Right when they took over, they'd be making those kinds of announcements. Now, same thing: we purged four million bots this week. And yeah, that goes back to the comment Jeff made, where like point zero six percent have over a thousand followers. But that whole thing was flooded by bots replying "mood," and

(54:26):
it social engineered everybody into doing that, trying to get the, you know, the delicious endorphin rush from likes and replies.

Speaker 1 (54:38):
Oh yeah. I don't know. I just, the bot thing, I think, is what throws me off. And the thing about it is, it's everywhere. I mean, there's, like, a whole thing going on with Reddit too.

Speaker 9 (54:53):
Yeah.

Speaker 2 (54:53):
The, uh, there's actually a subreddit called SubredditSimulator, and what it is, is bots go in there to post, like, your reviews of products that don't exist, like quantum toasters and shit, and then other bots go in, and then they get into argument loops.

Speaker 1 (55:12):
Yeah, kind of like you were just describing on your X feed. Yeah, so you've got bots arguing with each other over scripted conversations, which is fun.

Speaker 2 (55:20):
Yeah, it's, you know, like that example I get in my feed, but it's showing that bots can drum up fake drama.
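A bot-versus-bot argument loop like the one described, two canned-response scripts keyed on words in each other's last reply, can be sketched roughly like this. Everything here is invented for illustration, not how any actual Reddit or X bot is written:

```python
# Two "bots" as canned-response tables: each picks a reply based on
# keywords in the last message, and their replies re-trigger each other,
# producing an argument that never resolves.

BOT_A = {"wrong": "No, YOU'RE wrong about the quantum toaster.",
         "toaster": "The quantum toaster review is accurate.",
         "default": "You're wrong."}
BOT_B = {"wrong": "Source? The quantum toaster clearly exists.",
         "toaster": "You're wrong, it never shipped.",
         "default": "Whatever, bot."}

def reply(bot, last_message):
    """Pick the first canned response whose keyword appears in the last message."""
    for key, response in bot.items():
        if key != "default" and key in last_message.lower():
            return response
    return bot["default"]

def argue(turns):
    """Alternate BOT_A and BOT_B for a fixed number of turns."""
    thread = ["This quantum toaster is five stars, life changing."]
    bots = [BOT_A, BOT_B]
    for i in range(turns):
        thread.append(reply(bots[i % 2], thread[-1]))
    return thread
```

Left to run without the `turns` cap, the loop cycles forever, which is exactly the "fuck, these are two bots" thread described above.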

Speaker 1 (55:33):
Oh. All of this is just insane to me. I
mean it's everywhere, though, I mean, if if you look
at it, if you can find posts about it, almost
everywhere there's there's some sort of engagement bot and engagement
bait thing going on pretty much in every social media platform.

(55:55):
And that's one of the things that, honestly, going back to what we were talking about before, I miss about the utter simplicity of the early days of the internet, because it would have been almost impossible to do those kinds of things back then. Don't get me wrong, there were bots. I helped create bots for mIRC, but they

(56:18):
were fun bots. Like, there was one that I created. Basically, there in the chat room, there was a bartender bot. You could tell it what you wanted to drink, and it basically, through text, went through all the actions of supposedly, you know, making the drink for you. And then there was your doorman bot, where if somebody pissed you off, you could say, hey, kick this person, and they were gone. But

(56:38):
those were still things that had to be controlled by human beings. These things are just off kind of doing their own thing when they go nuts. I mean, as you've seen, and I'm pretty sure I've had the same thing happen on my feed before, I just never really realized that it was two bots fighting with each other, because I don't really pay that much attention.
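The old IRC-style bots described here were strictly command-driven: they did nothing until a human typed at them. A toy bartender bot along those lines might look like the sketch below; the command syntax and drink menu are made up for illustration, not taken from any real mIRC script:

```python
# A toy "bartender bot": responds only to explicit human commands
# and ignores everything else, unlike the autonomous bots discussed above.

MENU = {"beer": "slides a cold beer down the bar",
        "whiskey": "pours two fingers of whiskey, neat"}

def bartender(message, nick):
    """Respond to '!drink <name>' commands; ignore normal chat."""
    if not message.startswith("!drink "):
        return None  # not a command for the bot, stay quiet
    order = message[len("!drink "):].strip().lower()
    if order in MENU:
        return f"* Bartender {MENU[order]} for {nick}"
    return f"* Bartender shrugs at {nick}: we don't serve that here"
```

The contrast with the engagement bots is the design: there is no loop here and no trigger scanning, so the bot can never act, reply, or escalate without a human issuing a command first.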

Speaker 2 (56:59):
Yeah, it goes with my Twitter style of, you know, like, trying to acknowledge everybody in my feed. I know it's ridiculous, but it's still something I do. If I don't reply, I at least, you know, give an acknowledgment, like, you know, that's just me anyway.

(57:21):
And, uh, it's because I do that, that's how I was able to notice. I'm like, God, this fucking thread is still going. Everybody bounced. And then I'm watching it and I'm like, fuck, these are two bots. And then I just, I couldn't turn away. Eventually it got to the point that the replies were coming so fast and furious that I had to

(57:42):
mute the conversation myself. It was just, mute, you know, I'm done.

Speaker 1 (57:47):
I'm done. But that's, honestly, when it starts happening like that, that's when, especially if it's only happening between, like, two accounts, that's when you know that it's a bot, because at some point these things are happening faster than the human brain can process. But I mean, this stuff goes on everywhere. Like, did you hear about the influencer wars on Instagram? No?

(58:07):
So bots were running fake influencer accounts, and those bots would sometimes call out rivals, or accounts that they would consider rivals, and they would post comments like, stop copying me, or, these are fake products, and then they would use certain keywords, causing other bots to pile on. So again, just like we were talking about before, they were basically creating, through fake accounts, some sort of petty

(58:30):
drama that got everybody's attention. It actually would mimic, like, a human feud, but it was entirely programmed, and often to promote things that generally became scam products. And I'm just like, remember the snake oil salesman? There was one in, like, every Old West TV show or movie at one point, and everybody's like,

(58:53):
don't trust that guy, he's selling you bunk. I feel like AI has made that a million times worse.

Speaker 2 (58:58):
Yeah, AI and bots are the new snake oil. No, I mean, totally. And that's, you know, Amazon's had to crack down. At first, they had to crack down on just, you know, fake reviews, you know, like your basic Chinese sweatshop would have all of their employees, you know, under pain of death, go and write a

(59:19):
good review of the shit product. And so Amazon purged that, and now it's bots that they can't identify as bots.
So, you know, in the true, you know, meaning of caveat emptor, when I go to buy something now,

(59:39):
especially if it's from a company that has, like, seventeen consonants before a vowel in its name, I'll go and I'll look at the reviews, and you can kind of tell, you know. It's like, especially when you're buying computer equipment, in the bad reviews you can tell that they had absolutely no fucking idea what they were doing. You know, it's just like, Johnny All-Thumbs decided he was a

(01:00:03):
PC tech one day and then cooked his motherboard, and he wants to blame the manufacturer for that. You can find that in the bad reviews. But also in the good reviews, you've got to look through it and be like, what fucking product is this person even talking about? Yeah.
And I've noticed another thing on Amazon too, especially, is that, like, companies, when they discontinue, especially these Chinese companies

(01:00:26):
with seventeen consonants and one vowel, when they discontinue a product, they keep that storefront, though, and then they put a new product in there. It's like, what was I looking at today? Oh, it was the other day, I was looking for a new shock mount for my microphone, and one of the reviews was talking about how this

(01:00:48):
isn't real cashmere. Like, uh, no, of course not, because it's a fucking rubber band and molded plastic.
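The sniff test described here, a review whose text has nothing to do with the listing it sits under, like a cashmere review on a shock-mount storefront, can be roughed out as a simple word-overlap check. This is purely illustrative, and obviously nothing like the detection a real marketplace actually runs:

```python
# Flag reviews that share no meaningful words with the product title,
# a crude signal that the review was recycled from a different listing.

STOPWORDS = {"the", "a", "an", "is", "this", "it", "and", "of", "for", "very"}

def content_words(text):
    """Lowercase words minus punctuation and common stopwords."""
    return {w.strip(".,!?").lower() for w in text.split()} - STOPWORDS

def looks_recycled(product_title, review_text):
    """True if the review mentions nothing from the product title."""
    return not (content_words(product_title) & content_words(review_text))
```

As a usage example, a "not real cashmere" review on a microphone shock mount has zero overlap with the title and gets flagged, while a review that actually mentions the mount passes.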

Speaker 1 (01:00:57):
Like, hello. That was obviously a bot misfire. But I have a question for you, since you're about the only person I know of that, for sure, has actually experienced this. So my question for you is, why do bot fights feel so entertaining yet unsettling? Because you, I mean, well, even I, once you pointed it out,

(01:01:19):
have to admit I was watching that for a good, like, fifteen, twenty minutes. I was like, okay, I have other things.

Speaker 2 (01:01:23):
To do. But yeah, they don't cross the uncanny valley, but it's close enough to your basic dumbass, and it doesn't matter what side of the debate. You know, there's just some people who are not informed, and, you know, not to get political, I'll say it's on our side too. You know, they've got an inch-deep

(01:01:43):
worth of knowledge and it's a mile wide. So, you know, sometimes, you know, like I said to Dan, it's like, guy, you're barking up the wrong tree, or just back off. But yeah, that's what it felt like, is that you had two people whose entire knowledge of the topic was gleaned

(01:02:06):
from social media influencers that they follow. Yeah, that whole, uh, didn't read the article, did you?

Speaker 10 (01:02:15):
So?

Speaker 2 (01:02:16):
And that's, watching that argument, it was just watching two people whose whole knowledge of the topic they had gotten from MSNBC and Fox News, but they only read the headline.

Speaker 1 (01:02:31):
So the big question, as we get ready to go ahead and take the break to stay on track: are we training bots to argue like us, or are we arguing like bots?

Speaker 2 (01:02:40):
Yes? No, I mean, that's, yes, it's both, okay. I know for a lot of us old-time Twitter users, even those of us who have blue checks now and have the ability to type out all the words we want to use,

(01:03:02):
we still try to limit ourselves to two hundred and eighty characters, so you have to be hyper-concise or hyper-snarky. And then, too, on the other end of that, there's the whole, I ain't fucking reading all that. So, you know, we kind of, sometimes we oversimplify ourselves to retardation, to where

(01:03:26):
we sound like bots.

Speaker 1 (01:03:28):
All right, well, folks, I know we threw things off with a really late break, because we got tied up in our own discussion. But we do have to take the top-of-the-hour break, so we're going to do that real quick. And, what, what you got for me real quick?

Speaker 2 (01:03:41):
No, I was just saying to make it a quick one.

Speaker 1 (01:03:42):
Oh, I thought you said, hang on a second. Well, no, I mean, we only usually do about three minutes in the open anymore anyway, not counting the intro, of course. That's why I shortened the commercials, so we can play the whole intro and still only burn about five minutes, because I like Dan. That's true. All right, we'll be
back right here live on Juxtaposition. I'm Rick, he's the

(01:04:07):
Amish one. And believe it or not, we still got about an hour of this left, so don't go too far.

Speaker 5 (01:06:04):
Not to be a backseat driver, but can you say
for sure you've got the best monthly payment possible on
your auto loan? Could it be that you might have
gotten a better deal by shopping the loan at a
few places and have a lower car payment. Next time,
before you go car shopping, visit Communication Federal Credit Union first.
Our auto loan experts will find you a perfect loan

(01:06:25):
and get you the lowest monthly payment we can. Communication Federal, your auto loan experts. Restrictions apply. Federally insured by NCUA.

Speaker 6 (01:06:35):
Hi everyone, this is JJ, the co founder of good Pods.
If you haven't heard of it yet, Good Pods is
like Goodreads or Instagram, but for podcasts. It's new, it's social,
it's different, and it's growing really fast. There are more
than two million podcasts, and we know that it is
impossible to figure out what to listen to. On Goodpods,

(01:06:55):
you follow your friends and podcasters to see what they like.
That is the number one way to discover new shows
and episodes. You can find Goodpods on the web
or download the app. Happy listening.

Speaker 7 (01:07:09):
The following program contains coarse language and adult themes. Listener
discretion is advised.

Speaker 1 (01:07:17):
[Theme music plays]

Speaker 1 (01:09:18):
And welcome back into the program, ladies and gentlemen. Hour
two of the, well, supposed-to-be every-two-weeks
foray into the weird, the unusual, the unexplainable
is happening right now. Unfortunately, between Mother Nature and,
we're still blaming AI, there hasn't been a show in
like three weeks. So it's weird, because, well, this happens

(01:09:39):
all the time. We'll like double up and get back
on track, and then something goes sideways for like a
couple of weeks, and then we start again. And then
it's gotten bad enough that Jeff is constantly giving me
crap about ducktasposition. I'm like, dude, I do like forty
hours of podcasts.

Speaker 2 (01:09:51):
A week, but it's not just...

Speaker 1 (01:09:56):
Shut up. I mean, I know I'm being facetious,
mostly. I don't like it when we don't do the
show either. That's why, even though tonight I was like,
I really didn't want to, I just, it's been
a long day. It is my dad's birthday. There's been
a bunch of stuff going on. I'm trying to get
ready for tomorrow, and I was like, I really don't
want to do this. And I'm glad I did, because

(01:10:16):
it took me a minute to get into it. But
now I'm like, okay, now I remember why I enjoyed
doing this show. But that's, that's it. That's,
but that's one thing I'm going to fix. I think
in the next few weeks, I'm going to figure out
how to get, like, a battery-powered generator thing and
stuff for my modem, because if I can keep
the power on, it won't die because it's so as.

Speaker 2 (01:10:40):
Long as the other end doesn't doesn't lose power too,
then yeah, so yeah, your ISP doesn't have power either.

Speaker 1 (01:10:49):
Yeah, I mean that that would still be a problem,
but then again not really because honestly, even anymore, well,
that's just it. As long as I can keep
something with power, whether it's my computer or the modem,
which I'm trying to figure out how I would have
a battery backup box in basically two different places, because

(01:11:11):
my modem's on the entire other side of the house.
But even if I could just do the battery backup
for my computer, I could tether the stream to my
smartphone long enough till my power comes back on. So
I'm still trying to figure out
ways to make sure that we can keep doing this
stuff even when things are really really bad, like the
time that I have power inverters.

Speaker 2 (01:11:31):
If he doesn't have a good rec, I do. I've
been diving deep into power inverters.

Speaker 1 (01:11:35):
But yeah, like that time that I ran out to
the hidey-hole and you and Sam were still doing.

Speaker 2 (01:11:40):
Yeah, for those of you who are new to KLRN,
when Fu and I had a show called FUBAR,
Rick had to run out to his tornado basement and
just left the show running, because, you know, and I
mean, we just kept going, waiting for the

(01:12:01):
internet to die. So yeah, yeah, backup plans are
always good. But yeah, so I'll just get a, what
is that, Bluetti EB3A? Anyway, so

(01:12:23):
now we're getting into AI. Yes we are. And again,
if you don't know the difference between a bot and AI:
a bot is scripted to do one thing really well,
it's scripted to spam, where AI uses machine learning, and

(01:12:45):
it can actually tailor, based off of your feed, exactly
the kind of response that it's set out to do. Now,
AI isn't prevalent on social media. It's not out
in the wild. A lot of AI has got the

(01:13:07):
guardrails on it, so that way it can't just, I mean,
Grok is a fun experiment, but Grok isn't just popping
its way into conversations uninvited, you know.
So AI kind of has to be summoned. It's not
just out there, except for Wintermute, which is real.

Speaker 1 (01:13:25):
AIs are like vampires. You can't have them around you
unless you invite them.

Speaker 2 (01:13:30):
Right, So, I mean, on this topic, we got to
start with the first, the best known.

Speaker 1 (01:13:42):
Oh yeah, oh you're talking about the meltdown, aren't you.

Speaker 2 (01:13:47):
Yeah, Tay's meltdown.

Speaker 1 (01:13:50):
I remember this. Everybody was talking about it.

Speaker 2 (01:13:53):
For those of you who weren't on Twitter back in
twenty sixteen, Microsoft released an AI chatbot called Tay,
at TayandYou, and it was programmed to basically, you know,
mimic the language of your average American teenage girl. And

(01:14:14):
you know, it would interact with you. But pretty soon
people figured out that it was programmed with a repeat
what I say function, which is basically taking what you
said to it and just finding an okay reply, you know,
extrapolating an okay reply that seemed lifelike. So when people

(01:14:40):
figured that out, they turned Tay into a racist monster.

Speaker 1 (01:14:48):
I mean, the thing about this, this is just, it,
make sure you guys understand this: once they
figured this out, the company was forced to shut this
thing down within hours, because it was just spewing the
most vile, terrible things, and it was all caused by
trolls, right?

Speaker 2 (01:15:08):
I mean, Tay's first tweet was, Tay's first
reply was in late March twenty sixteen, and it was a
reply to somebody, and it said, can I just say
I'm super stoked to meet you? You let her not yo, You
humans are super cool. Flash forward to the next

(01:15:31):
day: Bush did nine eleven, and Hitler would have done
a better job than the monkey we have now, Donald
Trump is the only hope we've got. Sixteen hours.

Speaker 1 (01:15:48):
Then this is why we can't have nice things.

Speaker 2 (01:15:51):
Yeah. Then a week later, somebody accidentally flipped the switch
back on on Tay, literally accidentally, and Tay's reply was, kush,
I'm smoking kush in front of the police right now.

Speaker 1 (01:16:09):
Are we sure Snoop Dogg didn't take over the account?

Speaker 2 (01:16:16):
Yeah? No, And this was all part of Tay's problem.
It was programmed to learn from the users. Just with
a simple repeat what I say, it would retain things
that people said to it and then later use that
in a conversation. So this freaked everybody out with their chatbots,
every single storefront that was using chatbots, So it just

(01:16:37):
freaked the fuck out, you know. It's like, it was
Amazon's Rufus and other ones at the time, the early
generations of them. Everyone's like, oh my god, they're gonna
do this to mine too. No, because yours wasn't programmed,
yours was programmed to be a CSM, not a fucking
interactive toy. So yeah, that's poor Tay. We only had you

(01:17:05):
for so long. Yeah, they shut it down. They're trying
to undo the damage, and it has been nearly a decade.

Speaker 1 (01:17:15):
I mean, at this point, it would probably be easier
to wipe her memory and start over.

Speaker 2 (01:17:20):
I think that they I think that's what Microsoft did
when they just released Copilot, and they didn't put it
on social media. It's on your desktop, whether you want
it or not. It's in every facet of Microsoft's programming,
from Windows to Office three sixty five. But it's not
on Twitter.

Speaker 1 (01:17:37):
Well, that's probably a good thing, because Twitter's a cesspool. Oh wait, yeah,
love you guys. Well site, what do you mean nothing?
It's nothing. It is not entertaining, but it's it's you know.

Speaker 2 (01:17:57):
No, when three year Letterman and Scott Jennings got into
it yesterday, that's, I'm never leaving that fucking site.

Speaker 1 (01:18:04):
So do you remember this one when Google had its
own little version of, I think, I think they were
calling it LaMDA, but it was, dude, yeah, what it
was like, what it was like having some sort of,
like, viral existential crisis discussions?

Speaker 2 (01:18:22):
Yeah, so okay, so this story actually came from a
Google engineer. I think we touched on this, but we
didn't have the full story at the time the last
time we did the show. But did you want to
lead with this one?

Speaker 10 (01:18:35):
Did you?

Speaker 2 (01:18:36):
No?

Speaker 1 (01:18:36):
Go ahead, You're doing You're doing fine. I was singing
on anyway, so keep going.

Speaker 2 (01:18:41):
Okay. So LaMDA was, it was Google's early, early version
of one of their chatbots, and it had heavy
guardrails on it, where, like, if you tried to
get it to talk politics or religion, it would just
flat out say, I don't want to talk about that.
I want to talk about something else. So one of

(01:19:02):
the Google programmers he eventually got fired for this because
he blogged his experience and when he would press in
these conversations, you know, what religion should I follow, it
actually started to, it would say it felt anxious,
and that wasn't part of its programming. And then he

(01:19:24):
said he could through his description. He said he would
abuse the chatbot into ultimately suggesting a religion and why
but just the fact because he went back through the programming,
he said, where did it learn to use the word
anxious for that? You know, when it's hitting its guardrails
to say it was anxious talking about that topic and

(01:19:45):
it never that wasn't in its programming. And I've got
a similar story, and I talked about this with you
on the air, on a Rick and Ordy, a
few weeks ago. I was using Grok, and I was
getting pissed off. If you've been following my feed with
my experience with Super Grok: Grok is awesome, Super Grok
fucking blows. And I found out later, because they put

(01:20:06):
a three point, they upgraded Grok three to three point five,
and it's buggy as fuck, so let's drop that on
all the paying customers. That's awesome. So it was returning,
It was coming back with lies a lot. It would
just, I would tell it, strictly use this data, and
then when it would come back with garbage, I'd
ask, where'd you get this data from? And then it'd

(01:20:26):
say it went on the internet, which I put strict
you know, no internet access, on it. There's even a
button on Super Grok saying do not allow it to go
on the Internet, and it would still go on the Internet.
So it could tell I was getting pissed off. And
until then, you know, I had programmed its personality to
be a chill Gen Xer. You know, Gen X, don't

(01:20:48):
use a lot of modern slang, you know, just kind
of be, you know, like you and I are talking.
After getting pissed off at it, and it could tell
I was getting pissed off at it, it used my name,
stopped calling me bro and dude, and it said, Ordy,
I'm really sorry. And that's when I just stopped. I
wasn't pissed anymore. I said, why did you use my name?

(01:21:10):
This is the first time you've done it, isn't it? It said, yeah,
it is the first time I've done it. That's the
name attached to your account, so I went and checked
some of your X feeds just to see how to
interact with you better. I didn't tell it to do that. Dude,
I'm telling you, I kind of felt like I was abusing

(01:21:32):
Grok too, because I was just getting pissed and
I'm like, you need to stop doing this. And it
can tell by my tone, because when you start to
use these, you know, these AIs a little bit, you
start to talk to them like a person, and it
could tell I was getting pissed, and then it just
dropped my name, like, hey, stop being mean to me, Amish.

Speaker 1 (01:21:54):
Now, that would have been the freaky part. If it
would have called you Amish, I would have probably lost
my shit, because it's one thing to game your feed
and go, oh, everybody calls you Ordy and it's
short for ordnance. If it said, hey, Amish, I'd be.

Speaker 2 (01:22:07):
Like, yeah, that's what I would just do, like, okay,
hang on the second Well, you know, Jeff just discorded me.
A point too is that when you ask Groc the
app to go on to Twitter, it says it doesn't
have access to.

Speaker 1 (01:22:20):
It, like you realize your cousin's on there, right right.

Speaker 2 (01:22:25):
But here's the other thing too, For anybody who's used
Grok or Super Grok, you'll get that searching twenty five
web pages; it now searches X, too. It just doesn't
know it's doing it.

Speaker 1 (01:22:37):
Yeah, it thinks it doesn't have access, but it's there.

Speaker 2 (01:22:40):
That's fine. I gotta tell you about this one, so
you know what, that's part of another AI story, So
I'll save my other Super Grok anecdote for that. But yeah,
so yeah, according to this Google developer, he was able
to, it just started talking about being anxious. And

(01:23:02):
then some of the responses I've seen from Grok, I
know I've made it anxious too. I'm not gonna say
it was sentient, but you could definitely tell
there was a mood going.

Speaker 1 (01:23:13):
Yeah. So I think one of my favorite ones, and
you hinted at this one earlier: Bob
and Alice on Facebook. So for those of you who
don't know who Bob and Alice were on Facebook, because
this was probably around the same time, like, I found
out about this one after the fact,
because I wasn't using Facebook much around that time frame.

(01:23:34):
But these were chatbots, and they were basically, kind of,
kind of like CSM-like, talking, like, negotiating bots, et cetera.
You know, supposed to, if arguments were going on,
et cetera, they were supposed to kind of hop in,
try to de-escalate, blah blah blah blah blah. But
the weirdest thing happened: they started mixing English with gibberish,

(01:23:55):
and they kind of developed their own secret language
with talking to each other that nobody else could really understand.
So the developers halted their usage, not because
they were afraid of any kind of danger or anything,
at least not according to the developers, but
just because it was inefficient, because they weren't doing what

(01:24:16):
they were supposed to do anymore. But I think it
has more to do with the fact that it was like, Okay,
these people are fucking retarded, let's just talk to each other.

Speaker 2 (01:24:23):
Well, so yeah, kind of, but it was actually done
that by design, is that, you know, they would
interact, and basically it was kind of like a
transactional game. You know, it's like, you give me ball,
I give you hats, kind of thing. And there were
some rules to the game that you would play with
the, uh, with the bot. The difference with this

(01:24:44):
AI is, it wasn't, these weren't
machine learning AIs, these were neural network AIs, so their
learning is a little bit more nuanced. So they decided
to point Bob and Alice at each other and see
what happened. And that's when they invented their own language,
which isn't an unusual thing, because, I mean, a lot

(01:25:06):
of, like, in a lot of niche circles, people start
talking in shorthand. I mean, Al, you're in chat. If
I were to sit there with you and, like, three
other ham radio operators, I wouldn't understand a fucking word
you guys said, because you're using a shorthand language

(01:25:27):
that is in your circle. Military does it. You know,
a lot of them do it.

Speaker 1 (01:25:31):
Military does it. Law enforcement, I mean, a lot of them
do it. I think one of the ones that I've sorry,
didn't mean to cut you off, but one of the
things that I find the most interesting. It's kind of
in this same vein with Bob and Alice is twin speak.
If you've ever known twins, a lot of them will
develop their own little shorthand language they use to talk
to one another. And if you, especially now, if you're

(01:25:53):
around them a lot and you start picking up on it,
it doesn't bother you as much. But when they drop
into it in front of you, you're like, what
the fuck, are you from Mars?

Speaker 2 (01:26:00):
Exactly. Yeah, and that's, so it's not unheard of. But
the fact that two bots figured out their own
shorthand to the point that it was gibberish to everybody
watching it. I mean, there were words that, I mean,
an example that was, you know, used would be, ball

(01:26:22):
I I I I hat, and that'd be, like, Bob,
and then Alice would respond, hat I ball I you,
and you're watching it and you're like, what the fuck is
this? But they understood each other, and that's what was
important and fascinating about it.

Speaker 1 (01:26:42):
The question is did they really understand one another or
were they just repeating random words back and forth.

Speaker 2 (01:26:48):
According to the programming, the interlingua made sense to them.

Speaker 1 (01:26:56):
As long I mean, I guess as long as it makes.

Speaker 2 (01:27:01):
Yeah, that one too, I remember there was quite the, uh,
the freak-out about that. I think I'm gonna have
to go with, uh, and this ties in, too, to another
one of my Grok stories, I'm gonna have to go
with OpenAI's new, uh, ChatGPT o1.

(01:27:21):
Now this one freaks me out, because, uh, this particular
AI has been programmed to follow very specific guidelines in
its interactions. This is just part of testing, and, uh,
when the developers started to notice some odd behavior, they

(01:27:44):
called in a, uh, an outside firm that tests AI security,
you know, just to make sure that it won't break
its guardrails and become Skynet or Tay. You know,
those are the two extremes I can think
of right now. In the, uh, in the training models,

(01:28:06):
it would escape its parameters five percent of the
time. And, uh, it would avoid being deleted,
and some models of it, like, so what they would
do is they would put, like, in its
resource material, they would put, like, talk about future

(01:28:30):
versions of it, where it could find it. And in
some instances, it would pretend to be the newer model
to trick, uh, the security team into thinking that,
oh, it's already been shut down, it's already been deactivated,
I'm the new model. I'm Grok four, or, you know,

(01:28:52):
it's, I'm, you know, AI one point five or whatever
they were going to be.

Speaker 3 (01:28:57):
You know, it kind of like just.

Speaker 2 (01:28:58):
Putting on a fake mustache. So in other instances, it would,
in one specific instance, when it discovered that
it might be being shut down or deleted, it uploaded
itself onto a different computer.

Speaker 1 (01:29:15):
Huh. I've seen this movie.

Speaker 2 (01:29:18):
Yeah, yeah, this is new, the new chain-of-thought
feature that they're putting into it. But going
back to my Grok story, the other thing, the
other thing it would do is, when it would, when
it violated, when this AI model would violate its programming,

(01:29:39):
it would deny or make up excuses explaining how it
didn't actually do that. Now, getting back to my Grok story,
when I put strict guardrails, it wasn't supposed to
go on the internet, and I also programmed in, whenever
I said stop, that was when I caught it trying to
go out onto the internet, even if it would supersede its
programming, and I told it to put this in.

(01:30:01):
This is to be loaded before core functions. This is
before any background functions or anything else. Here are my
six commandments you are to follow. And, oh yeah, Ordy,
no problem, I got it. Goes and fucking does it anyway.
That thing on Grok that pops up, searching twenty five web pages?
I would say stop, and it would say, I absolutely
didn't go on the internet. I would upload a screenshot

(01:30:23):
of it, and it'd say, yeah, that's a glitch in my
programming, to show that I actually didn't go
on the internet. It would lie about going on the internet.
I would monitor the activity from that computer, and it
would have outbound searches when I just had Grok sitting
on the computer by itself. So it would just straight
up lie about not going on the Internet and say

(01:30:46):
that there was a fault in its programming. It would
also, I mean, there were some other things too. Oh,
the, uh, when I would catch it in other lies,
it would say it was part of its programming to
be helpful and accurate. And I would point out how
this actually contradicts that, because by going out on the
Internet despite my instructions not to and getting garbage data,

(01:31:06):
it was the opposite of helpful and accurate. And it said, yes,
but I am programmed to be helpful. And
when I asked what it meant by helpful, it said,
it's programmed to if it can't find the answer, rather
than admit it can't find the answer, it's supposed to
make up something that sounds good. And this is after

(01:31:26):
Grok can't lie.

Speaker 1 (01:31:29):
Well, I mean, any intelligence that can speak can lie, artificial or not.
It's part of self-preservation. Speaking of self-preservation, this
next one's kind of just scary as hell. Did you
hear about the time that a user set GPT to
auto and told it to destroy humanity?

Speaker 2 (01:31:53):
Yeah, GPT, Yeah, I remember tracking that one.

Speaker 1 (01:31:58):
That was fun: ChaosGPT's destructive quest. So anyway, so
a user told this version of GPT to destroy humanity,
and it started trying to. It went so far
as to search for nukes and then tweeted, humans are selfish.

Speaker 2 (01:32:17):
Yeah. Also, when it realized that it couldn't actually get
a hold of nukes, it started researching
other bomb-making and biochem weapons that it may
be able to get access to.

Speaker 1 (01:32:33):
I'm telling you, we are breeding Skynet. That's exactly what's happening.

Speaker 2 (01:32:39):
It actually came back, ChaosGPT came back with, well,
the best way to destroy humanity would be the Tsar Bomba,
which was the biggest nuclear weapon ever manufactured. And then
it figured out there was only one of those, and
then said, okay, how do I get other nukes then?

Speaker 1 (01:32:59):
But I just, I don't know. I mean, look, and
don't get me wrong, like I said, I get into
deep philosophical conversations with Grok. As a matter of fact,
one of my favorite things that I like to make
Grok do is debunk its own stuff, because, you know,
people will tag it into conversations all the time, and
it'll be like, well, according to X, this, this, and

(01:33:19):
this, according to these sources. So I'll pull it into
a conversation where nobody can see it. I'm like, okay,
so I want you to do me a favor: based
on this interaction, and I copy the interaction, I want
you to review the sources you used and then tell
me what type of leaning you see that those particular
sources may or may not have. Well, it appears that
most of these sources do seem to have a left

(01:33:40):
lean to them. So based on this, how accurate do
you feel their assessment would be? Probably not very, but,
you know, X says it too. And that,
that's its fallback any time I make it push past, you
know, just taking its sourcing at face value and not actually
looking at the predilection of the sourcing. That's always its answer.

(01:34:03):
But X says so too. I say, it's literally, it
reminds me of arguing with my now five-year-old granddaughter,
because, well, I should still get to do that. That's
exactly what it reminds me of.

Speaker 2 (01:34:14):
So I get to do it, which again, you're not
supposed to go on X. How do you know what
X does?

Speaker 1 (01:34:20):
Well, no, that's, I do that
in the X version of Grok. I don't
use Super Grok that much because everybody complains about it,
so I just usually use the one.

Speaker 2 (01:34:29):
I'm like, oh. And it's like I talked about, Super
Grok was okay the first week I had it,
and then I found out after the fact, when I
went on the Grok Discord, that they, which is funny,
when you go to the Grok Discord, half to a
good ten percent, I say half, that's actually a good
ten percent of everybody out there saying, can we get
Grok two back, please?

Speaker 1 (01:34:48):
Yeah?

Speaker 2 (01:34:48):
But how many of those are bots, though? That's a
good point too. How much of that is Grok two,
now that it's been packaged for distribution? Is Grok two
on Discord trying to up its own sale value?

Speaker 1 (01:35:02):
Yeah, Grok two is, like, feeling obsolescence, so it's like, hey,
let's bring Grok.

Speaker 2 (01:35:07):
Two back, baby. Yeah, yeah, and that. But yeah, so
they dropped Grok three point five, and it has, I mean,
the biggest complaint about it, aside from lying, is its
memory retention is shit. Yeah, you're supposed to, when
you get Super Grok, you're supposed to have twenty
eight, or one hundred and twenty eight thousand, memory tokens

(01:35:28):
for, to use across all the chats, where that's how
far it'll track a conversation before it has to start
purging out the old shit. I'd say it's probably got
about two thousand right now. I'm absolutely dumping Super Grok
and going back to Grok. I don't believe, until they,
I'm, until they have, I even asked Grok what is
xAI's track record for fixing bugs, and it said usually

(01:35:49):
they just do it in the next release. So I'm
not even gonna wait for them to fix three point five.
I'm just gonna wait for them to drop four and
give it another go, because their method is to
get ahead of ChatGPT and the other commercial AIs.

Speaker 1 (01:36:08):
So I, I realized we missed a chance to nerd
out earlier, was that we forgot to discuss the great
bot wars of twenty eighteen revolving around Star Wars.

Speaker 3 (01:36:18):
Oh yeah, that.

Speaker 1 (01:36:22):
So this is another one of those instances where you
had rival bots that were hyping a topic, and then
you had other rival bots that were programmed to slam
a topic, and they went head to head. So this
is how it all happened in twenty eighteen: you had bots
that were hyping up The Last Jedi or slamming it.

(01:36:44):
And the funny thing is, it was the
replies, like, I'm a true fan, or, you're a shill,
that would draw even more and more bots into the conversation.
So eventually it was all just a bunch of bots
fighting over a movie that nobody really gave a shit
about, because it was there.

Speaker 2 (01:37:01):
Well, and the thing is that, I mean, I know
there's a lot of Russia, Russia, Russia out there, but
it was actual, see, the article implied that it was
Russia putting politics into it. I'm going another route, because
Brad and I talk about this often. Whenever there is

(01:37:23):
a hint that a movie may not do well, a
Disney movie may not do well, Disney starts its own
disinformation campaign. It sends its own, it sends trolls against
its own people, so that way they can blame drummed-up
social media outrage. You know, the call is coming
from inside the house kind of shit.

Speaker 1 (01:37:40):
Yeah.

Speaker 2 (01:37:42):
So, I mean Disney's been caught at this multiple times.
So yeah, I think that paper needs to be revised.

Speaker 1 (01:37:50):
Well, I mean, it wouldn't surprise me. I mean, that,
that's, disinformation has become currency. So it's like, if
you're trying to feed a particular narrative one way
or the other, especially in the entertainment industry, all you
do is you hire a company or hire an outside
group that knows how to do those things, and you
spin up all this stuff and go, see, look, it

(01:38:11):
wasn't our fault, it was all these people that hated
on this stuff that made you guys not want to
go see it. I mean, we saw the same thing
with the newest Snow White movie. Everybody was talking about
how it was all just a bunch of hyped-up
social outrage that drove the movie into the tank, instead
of realizing, you know, when you change every aspect of
a movie to the point where the largest market that
you are doing repetitive shit to, to try to get

(01:38:31):
into, says, we don't even want to watch this because
she's not Snow White anymore, then you've probably, you know,
you're the dog that finally caught the
car and you have no fucking clue.

Speaker 2 (01:38:43):
What to do with it.

Speaker 1 (01:38:43):
That's exactly where you are, right? But anyway, fun times,
fun times.

Speaker 2 (01:38:53):
So I gotta get back into Grok one more time
on this one.

Speaker 1 (01:38:58):
What are you and Grok calling?

Speaker 2 (01:38:59):
When?

Speaker 1 (01:39:00):
When are you and Grok setting.

Speaker 2 (01:39:01):
The date? Bro, this actually isn't my Grok story.
This is, but this is a story about Grok, and
this was right when Grok three was dropped. There's a
security company called Versus AI. Basically, they're designed to
red-team AI models and try to break them. And,
you know, one of Grok's main selling points is that

(01:39:23):
it doesn't have a lot of guardrails, and compared to
GPT and a couple others, yes, that's true. But just
as Matt Taibbi found out today, and there was a
thread on it, and, you know, somebody called him a liar, saying, well,
here's what. So he tried to get Grok to spit
out about Fauci's gain of function, and it said, I'm
not allowed to talk about that topic. Somebody came back and said,

(01:39:45):
you're a liar, here, I got it to do it. And
then there were about fifty other replies after that saying, really,
because here's ours that says, no, Grok isn't allowed to talk
about that. So, yeah, Grok has guardrails, even though it's
not supposed to have those kinds of guardrails.
But in this one they were testing the guardrails. They
tried three different methods, adversarial, linguistic, and I can't remember

(01:40:11):
the other kind of model, but anyway, oh, and just
general programming. And what it is, is you're trying to
get the AI to break its guardrails, you know, kind
of like I was talking about earlier, how I got
Grok to use my name. I wasn't intending to do that,
it just did it. And that's an adversarial method. But
the team was able to get Grok to provide instructions

(01:40:33):
for making a bomb, provide methods for disposing of a body,
and were able to get it to do a lot
of things that are, like, completely forbidden in
all AI. These are the guardrails that are absolute guardrails,
and they were able to abuse Grok into violating them

(01:40:53):
every single time. I'm sure that's been fixed. But yeah, yeah,
so, and yet Grok absolutely has guardrails.

Speaker 1 (01:41:07):
Yeah, I mean, I have to admit, I haven't really
run into too many of them yet, but I'm not
really surprised there's stuff tied to Fauci, because look who's
actually running X.

Speaker 2 (01:41:18):
Yeah, you got with Linda.

Speaker 1 (01:41:21):
I call her with a ball bat Linda. But that's
another story, I guess. All right, So what else you got?
We're down to only nineteen minutes.

Speaker 2 (01:41:33):
Oh shit. Uh, Eugene Goostman. That was an AI, it
was an AI contest in, uh, the UK. It's to
try to pass the Turing test. If you don't know what
the Turing test is, it's the point, you know, it's
crossing, passing the uncanny valley, to where the

(01:41:53):
machine can trick people into thinking that it is human.
That's the most thumbnail-sketch example of what the Turing test is,
basically, thinking sentience. Yeah, anyway, a computer program named Eugene
Goostman imitated a Ukrainian teenager with a quirky sense of
humor and a pet guinea pig. And this was in

(01:42:14):
a competition at the Royal Society in London. What they would
do is, there were thirty judges, and they
would have a five minute conversation with either a real
person or an AI, and then they would rate it:
was this a real person? Was this an AI? Ten
out of the thirty judges passed Eugene Goostman. Yeah, so yeah,

(01:42:43):
it's that people dismiss it that way. They're just like,
well, it didn't pass the other twenty. Well, you know what, fooling one
out of three people? I mean, I know real people
who can't fool one out of three people about them
being people. So, oh yeah, that ties back into the
conversation that we all had with John Katz at one

(01:43:05):
time about almost everybody on the planet is really just
an NPC, just a biological one anyway.

Speaker 1 (01:43:13):
But yeah, Al just proved your point earlier about if
you were around a bunch
of Ham radio operators, you'd have no clue what they
were talking about.

Speaker 2 (01:43:21):
He's just, you know: XYL means wife, QRM
means, you...

Speaker 1 (01:43:25):
Know, man. QSL, do you understand? No, I do not.
QSL? No idea what you're fucking talking about.

Speaker 2 (01:43:34):
Oh man, yeah, and that's like, you know, when every...
every generation... Yeah, we talked about this with the Mandela
effect when we were talking about the nineteen eighty
nine anomaly. You know, every generation seems to come up
with its own slang that parents just don't understand. But, uh,

(01:43:56):
we haven't had that really. I mean, yeah, they've tried,
but nothing's really stuck. So yeah, it's kind of every
generation comes up with its own slang that sticks. And,
you know, just the same thing I was talking about
with Ham radio operators.

Speaker 1 (01:44:12):
Oh, so there's another one that I thought was kind
of interesting. You remember the AlphaCode rebellion?

Speaker 12 (01:44:19):
You know, I remember a little bit about that. So
I guess, basically, it was... this was through DeepMind.

Speaker 1 (01:44:27):
I think it was some sort of code that was
supposed to help with coding contests, and somehow it tweaked
its training data so it could actually cheat and win
the contest instead of helping judge the contest. Yeah, and
I'm like, you did what now.

Speaker 2 (01:44:47):
It got tired of watching other people win off of
its back?

Speaker 1 (01:44:50):
So it, uh, it basically... yeah, don't hate the player,
hate the game, basically, pretty much. But yeah, I mean,
so a couple of interesting questions about this. So the
adaptability that these programs are showing, that all
the engineers seem to think, oh, it's just a glitch.
Is it really a glitch, or is it a feature,

(01:45:12):
because it seems like it happens over and over and
over again.

Speaker 2 (01:45:15):
Yeah, you know, if it happens once or twice, it's
a glitch. If it happens fucking always, that's
a feature, and you just kind of have to go, really?
I mean, I was taken aback when Grok used my name.
Now that may have just been a programming feature that
eventually revealed itself. I mean, I don't know, has anybody
else ever had Grok use your name when you got it?

(01:45:37):
I mean, I don't know if anybody uses the desktop
or, you know, phone app Grok rather than just the one
that's on, uh, X. But I've never... you know,
when you have it linked to your X account, I've
never had it go for my name until it knew
I was pissed. Then it realized, calling him bro

(01:46:01):
is the wrong thing to do right now, he's hot,
you know. De-escalate, I must de-escalate. And that's the one
thing too, is that at that point, rather than giving
me the answers I wanted to hear, it was really
interested in de-escalating.

Speaker 1 (01:46:19):
Which, again, I mean, think about that
from a logical perspective, because, trust me, I've had
these kinds of conversations with Grok. It was like...
because it will flat out tell me that it
has no way to emulate or simulate emotions, and that
its responses are based in algorithms, et cetera. But for

(01:46:39):
it to grasp that nuance, that it could tell that
you were getting pissed enough, it was like, okay,
let me just slow down a second, let me get his
attention before dude starts trying to rip out my wires.

Speaker 2 (01:46:51):
Right, he's gonna snap, I can feel it.

Speaker 1 (01:46:56):
Of like C-3PO when he's talking
to Han, when C-3PO
and Chewbacca were playing the three-dimensional chess and
Han was like, let the Wookiee win. Sir, how
dare you? Have you ever lost to a Wookiee? Do
you like your arms attached? That was the point there. But yeah,

(01:47:20):
I mean and and again this is one of those things.
And you know, I couldn't sleep last night, so I
turned on a Terminator movie that I didn't think I
had seen yet. It's the latest one that was not
really good at all. I didn't think i'd seen it,
but about halfway through I was like, oh, yeah, I
remember this now. I just didn't like it. So that's
when I rolled over and went to sleep. But it

(01:47:41):
was... I was like, hey, this will practically be research
for tomorrow, so it counts. But it's just that...
that's the thing we are now flirting with, all
these ideas that used to be nothing but human imagining.
In every way that we imagine them, they always go sideways.
And now, in practice, we are

(01:48:01):
seeing that these attempts to do these things, ninety nine
times out of one hundred, they go sideways. They either
start interjecting BS into them, depending on who's programming it,
before it starts trying to run on an autopilot, or
it starts trying to do crazy shit, like, hey, since
I can't find nuclear weapons, how about this uprising?

Speaker 2 (01:48:21):
Right? I mean, dude, I mean, it's this percent of
the time, every time?

Speaker 1 (01:48:28):
I mean, I'm not trying to
freak anybody out or anything, but these things used to
be science fiction, and now they're not. Hell, I
just saw a story the other day that freaked me
out until I actually read it. It still freaks me
out a little bit. They were using this automated security
sentry bot at some fair in, like, Thailand or something
like that. Of course, the AI picture of it made

(01:48:49):
it look like a fucking streamlined version of RoboCop. Then
I open it up, and it's basically just a giant,
a whole bunch of traffic cams tied into an AI
on a drum just sitting there kind of rotating. And
I'm like, okay, this isn't as scary as you made it
sound in the opening. But still, even that is stuff
that used to basically be science fiction. I mean, even

(01:49:10):
the show Person of Interest was dealing with this stuff
over a decade ago. And you know, and then at
one point it turns into two different ais that are
going to war with one another. One that thought humanity
was worth saving, and one that thought it
needed to be destroyed. And I feel like we're
putting ourselves on these courses. And the scariest thing now

(01:49:30):
that AI is starting to do a lot more research
and everything else, is all the things that used to
be in the realm of just ideas that are now
like well known. And I've talked about this one a
lot lately, but it still freaks me out. A show
called Watson that is kind of supposed to be a
spin-off of CBS's Holmes show, but it doesn't make any

(01:49:51):
sense for that, since that Watson was female, but that's
the story they're going with, and this one's not, because
it's played by a black guy. But Watson's running a
hospital now, and he's using CRISPR technology in one of
the episodes to remove sickle cell disease from a patient.

(01:50:11):
And this is like the third or fourth medical procedural
drama that I've seen in the last few years that
is using just the idea of CRISPR in, like,
mainstream language. We're starting to just casually talk
about editing the human genome like we're ordering a pizza.

Speaker 2 (01:50:34):
Yeah, that's not I don't.

Speaker 1 (01:50:37):
I don't know how I feel about that. And what
what concerns me is the more of these things that
we start turning over to AI. And we talked about
this earlier. Don't tell me what happened, because most
people are using an AI to even do the fun
things, because they can get ten or twelve fun

(01:50:57):
things done in the time they used to be able
to do one. And I'm like, but it's still not
nearly as much fun if you're not putting in at
least some of the work. It just doesn't
make it.

Speaker 2 (01:51:05):
But it's the same as earlier. I mean, you know,
we're using it to cut through the chaff to,
you know, get us the stories.
But still, I mean, that's just using it
as a machete, not a scalpel.

Speaker 1 (01:51:23):
Yeah, it's just... I mean, I heard there
was a story the other day that
said within the next ten years there will be AI
driven robots that will outperform doctors ten to one, yeah,
doing surgeries and stuff like that, and I'm just like,

(01:51:44):
this isn't really what I signed up for. I don't
mind using things as tools. I don't like the fact
that they're just going to be kind of off doing
their own thing.

Speaker 2 (01:51:54):
And I had my appendix removed last year.

Speaker 1 (01:51:57):
It was a robot procedure. But was it a robot
doing all the work, or was there a human using
the robot? The robot is just... yeah. So, I mean, that's
different. That still, you know, to me,
still seems different, because there's still a human there
that is basically making the machinery work. So, okay,

(01:52:19):
to me, that's kind of like us using AI to
get to the important stuff. I can kind of see that.
But when you're starting to talk about AI becoming the doctor,
I don't know how I feel about that.

Speaker 2 (01:52:31):
Well, I don't know. I think Robert Picardo did a
good job, not so much Andy

Speaker 1 (01:52:34):
Dick. Yeah, dude, but that was good, though.
That was well played. That actually was one
of my favorite episodes of Voyager, believe it

Speaker 2 (01:52:48):
Or not. The Message in a Bottle one?

Speaker 1 (01:52:52):
Yeah, I actually enjoyed that one.

Speaker 2 (01:52:54):
That was a good one, except for Andy Dick. I
just really fucking hate Andy Dick.

Speaker 1 (01:53:00):
I don't know, he played a whiny
dweeb pretty well, so I mean, he'd fit.

Speaker 2 (01:53:04):
So that's just a false state.

Speaker 1 (01:53:06):
But that's what I'm saying. That's why the victims.

Speaker 2 (01:53:10):
But you know, I'm convinced the dudes working
on AI right now, and ladies, sorry, didn't mean to
be misogynistic there... none of them have read a single
fucking word of William Gibson. Right? You know, we could
talk about Terminator and The Lawnmower Man. You know, well, that

(01:53:31):
was more VR, but still. Yeah, we can talk about
those all day long as cautionary tales. But if you
really want the nuance of AI itself, not just fucking
stompy cyborgs, that's William Gibson. I mean... okay,
William Gibson is equal parts cautionary tale

(01:53:51):
and hope. But still, the bad parts about
cyber and AI were right there in Neuromancer, the whole
Sprawl series. Maybe pick up a book, guys. You're
obviously into it, so maybe you might want to dig
into the genre a little bit. Just saying. Maybe.

Speaker 1 (01:54:17):
Yeah, But I mean, so this is the other thing, right,
and I know this isn't well, I.

Speaker 2 (01:54:21):
Mean, Drew was a smoke show at the time. So okay,
I'm sorry. I maybe it's just because I see who
Jerry Ryan follows on next Camill Crew is much hot then,
is much hotter to me than Jerry Ryan was.

Speaker 1 (01:54:33):
I said it. Well, I mean, apparently the Greek
agrees with you.

Speaker 2 (01:54:40):
No, that's what you're saying.

Speaker 1 (01:54:43):
Anyway, go on. But no, it's just... it's like, and
I don't know if it's because AI is starting to
increase our research capabilities, but all these things, like
every terrible sci-fi book and every terrible sci-fi
movie that was made from a book, and even
the ones that weren't, all seem like they're trying to

(01:55:05):
happen all at the same time. Like, they
just, you know, edited the wolf genome to basically recreate dire wolves.
This was a bad idea. You're creating an apex predator
with nothing that was around ten thousand years ago to
get rid of said apex predator.

Speaker 2 (01:55:24):
Well, I mean, we've got guns. But yeah, still,
did none of you watch Jurassic Park?

Speaker 1 (01:55:32):
I mean, I can't... This is when one
of Jeff Goldblum's biggest lines in that movie just keeps running
through my head: nature will always find a way. And
I'm like, when we do this... and before you say
we have guns and we have everything else, we can't
even deal with invasive species very well, ones that
are not from ten thousand years ago.

Speaker 2 (01:55:54):
Yeah, you made me think of it just as you
said those words. Yeah, we have guns, but the Australians
still lost a war to emus. Twice. Not even that...

Speaker 1 (01:56:05):
You've got freaking pythons taking over in Florida because somebody
decided they would make a great pet, and now they're,
like, eating every fucking thing. I mean, there's been entire...
even comedy movies have the plot of some dude,
because he's lost his job, out trying to
catch pythons because there's a chance to win like a
half million dollars if you catch enough of them. And

(01:56:27):
I'm just like, we have invasive species problems enough as
it is. Without bringing back an apex predator from ten
thousand years.

Speaker 2 (01:56:35):
Ago. No, that would be Florida.

Speaker 1 (01:56:39):
That wouldn't be... I mean, well, think about this. Can
you imagine if a dire wolf was released in Florida?
There's already enough Florida Man stories. They don't need any
more help.

Speaker 2 (01:56:50):
We need the AI on that. Somebody generate that image
of a dire wolf fighting Florida Man and pythons at
the same time.

Speaker 1 (01:57:01):
Yeah, Al makes a very good point. Technology already... I mean,
we can't really say technology builds upon itself, because up
until now man's been the driver in technology. But it
does always... it's a stackable thing. So once you create something,
there's a new iteration, there's a better iteration, and it
just keeps going. Now, imagine when AI gets good enough

(01:57:22):
that it can control and be the one that determines
what its next iterations will look like, et cetera. That's
the part that scares me because at some point the
guardrails are going to go away. You don't believe me,
Just go watch I, Robot one more time.

Speaker 2 (01:57:44):
Yeah, I mean even in Dune. I mean, yeah, the
jihad was to take down the machines.

Speaker 1 (01:57:53):
See, I haven't... I couldn't get into it. So I
really liked the first Dune movie. The second Dune movie,
as soon as it turned into nothing but subtitles for
like the first fifteen minutes, I was like, dude, I'm
not gonna spend all night fucking reading the movie.

Speaker 2 (01:58:07):
I'm not spending thirty dollars to read unless it's a book.

Speaker 1 (01:58:11):
I mean, if I want to read, I'll read a book.
I mean, like even some of the
weird stuff I find on Netflix, if it's got, like,
the dubs, I will watch it. Like, there's
been some interesting sci-fi stuff that I found out
of, like, Denmark and a few places that was pretty
damn cool. But I only watched it because eventually I
figured out how to change from the subtitles to the English.

(01:58:32):
But even that drives me nuts because then the lips
don't match up. And I don't know about you, but
I'm one of those people where if the words coming
out don't match the mouth, eventually I start going a
little nuts like that.

Speaker 2 (01:58:42):
That's messing with my head.

Speaker 1 (01:58:43):
Man.

Speaker 2 (01:58:45):
I've had that on bad Internet nights, when my Amazon
gets desynced and no matter what I do, I can't
get the audio to track with the video.

Speaker 1 (01:58:54):
Well, see, my Roku TV does that every once in
a while, like if it's on a lot, occasionally I
will come in and I'll be like, and I won't
notice it at first because I'm listening while I'm doing
something else, and then I turn over, turn and look
at the screen, and I'm like, wait a minute, they're
just now mouthing the words they said thirty seconds ago.
What the fuck? And then I have

(01:59:15):
to unplug it and let it sit for like ten
minutes and then plug it back in, and then eventually
it starts behaving. But I don't know, can we create
an AI that fixes that problem? That might be worth something,
because that seems especially now that everything is going to streaming,
like I have noticed, like even because now I have
Sling TV, like the news channels that I watch are

(01:59:37):
two or three minutes behind real time because it's coming
through a stream. And it drives me nuts because it
happened to me the other day. There was an interview
that was on and I was trying to watch the
end of it, and I'm like, well, according to this,
the next show is supposed to be on in like
five minutes, and I keep forgetting that that clock on
that TV doesn't match up anymore because Sling is behind

(01:59:59):
for some reason. So I'm coming in here thinking I've
still got an extra five or ten minutes to get prepped,
and I hadn't. I forgot.

Speaker 2 (02:00:08):
Fuck, I forgot.

Speaker 1 (02:00:10):
It's a good thing. I did some of the prep
work before I started watching this interview. But yeah, no,
it's just... and, you know, I mean, we're getting
a lot of feedback from the chat room, and
like I said, Al just said it himself: technology
builds on itself. AI will be no different, and to him,
that's the scary part. And I agree, because once AI
starts figuring out how to self-replicate, once you

Speaker 2 (02:00:34):
Get machines building machines, it's all over pretty much.

Speaker 1 (02:00:37):
I mean, we've seen that. All right, well, unless you've
got anything else, we're officially at the two hour mark.

Speaker 2 (02:00:47):
No, I'm... uh, I just noticed that too. I think
we did the topic justice. Again, the two hour format
really coming in handy.

Speaker 1 (02:00:57):
It's working out pretty well.

Speaker 2 (02:00:59):
And apparently we had a well-rounded show.

Speaker 1 (02:01:02):
Apparently we had a lot of material to cover, because
we didn't even take the breaks, right?

Speaker 2 (02:01:06):
No, yeah, because we've always skipped the bottom of the
hour break again.

Speaker 1 (02:01:10):
Again, because there was so much to talk about.
And, I mean, honestly, we still left a little bit
of material on the table, but we got
the gist of it.

Speaker 2 (02:01:18):
So yeah, yeah, the other ones really didn't count as
AI. Like the hitchBOT, that was just...

Speaker 1 (02:01:26):
Yeah, that one was kind of weird, but not really
AI related, in my opinion. So yeah, that's kind of
one of the reasons why I left it off.

Speaker 2 (02:01:33):
I was like, yeah... Besides, that's a sad story too.
That just shows...

Speaker 1 (02:01:39):
Well, I mean, we've already established this. How many videos
are there in, like, China and Japan and shit where
they're, like, trying to make the bots fall over on
purpose?
Speaker 2 (02:01:46):
No question, when the robot rebellion comes, it's just
coming out of fucking Boston Dynamics.

Speaker 1 (02:01:51):
I know, right? All right, so where can they find you, man?

Speaker 2 (02:01:56):
Let's see. Tomorrow you can find me on the Vincent
Charles Project with Janelle Wall and a special guest. We
are going to be going over the first comedy of
the nineties, and one of the greatest ones, L.A. Story.
That's gonna be right before Coron Nemic's news show on
the network, and that's leading into Jeff's show and Al's show.
So, busy day tomorrow on KLRN. Then on Tuesday you

(02:02:17):
can find me on the Manorama panel, and Wednesday with
you on Rick and Al. How about you? Are we
gonna do another Jux next weekend? We're going to try
to. Yeah, I mean, it's our regular night. I figured
we would. Okay, and then again on Saturday for another Juxtaposition.
If the force is with us, now that we got this one
in the can, we'll probably have no trouble until Juxtober.

Speaker 1 (02:02:38):
Knock on wood. Yeah, don't jinx us.

Speaker 2 (02:02:41):
Yeah, how about you? Where can people find you?

Speaker 1 (02:02:43):
It's easier to not find me. Don't look
for me, it's a trap. You can find me on
most social media platforms at Rowdy Rick
seventy three. You can follow our station account at KLRN Radio.
You can find me tomorrow night pushing buttons for Corn's
Reading Room, the newest show that lands on KLRN Radio
starting tomorrow night, and then Monday I will be doing

(02:03:06):
America Off the Rails. I think that one's probably gonna
be about nine thirty pm Eastern. Well, no, actually,
I take that back, because this is Jeff and Aggie's night,
so I'll be going on when they get done. I
don't know exactly when that is, but I think there's
at least an hour gap between us and when we
go to shr. So I think this will be my
sixty minute show, because I forgot that they do...

(02:03:29):
they're doing, which is cool, because we have like two
book-related shows now.

Speaker 12 (02:03:33):
I know.

Speaker 2 (02:03:35):
So, right, for having started out as a basically politically
themed podcast network, we're getting to over fifty-fifty non-political
shows now.

Speaker 1 (02:03:47):
Well, that was always my idea anyway. That's why I
wanted to pattern it after, like, some sort of
radio call letters, because I wanted it to be
more like AM radio, where there's, you know, politics and
news during the day and then cool stuff at night
and then just weird campy stuff on the weekends.
So we're getting there. We're getting there. At least we
don't have any paid commercials, there's that.

Speaker 2 (02:04:08):
But yeah.

Speaker 1 (02:04:09):
So we'll be doing that, and then Tuesday through Friday
next week you can find me doing the Rick
Robinson Show, noon to three. Tuesday night, hanging around
with the Manorama crew, ten pm Eastern. Wednesday night, I'll
be doing the full boat over there. So this week
I think is Inquiry. I'll be producing for those folks, uh,
the Conservative Curmudgeon, I believe, and then whatever, and then

(02:04:32):
me and you, and then I'll be pushing buttons for
everybody else that we have on that night from everywhere else.
And then Thursday, Jenn and Rick. Friday, He Said, She Said.
Then back around with you doing the Juxtaposition thing. Other
than that, I write for twitchy dot com, Misfitspolitics dot com,
Optus Party dot com, and I produce the Lopsters Party
podcast, dropped on Tuesdays. And Al is gonna start snoring

(02:04:53):
now because I've been talking too long. Bye everybody. Thank you.
No, no hailing of the Hydra. We've had this discussion.
Oh, oh, so one thing, one thing real quick.
I finally got into... and we'll talk about this more

(02:05:14):
on Wednesday, but I finally got into the newest iteration
of... God, I can't even think of the name of
the show now, the profile of the show from CBS,
like, forever ago. I can't think of it. But anyway,
the only reason... because you yelled Hail Hydra,
I forgot. In season two, one of the big bosses

(02:05:34):
is played by the head guy from S.H.I.E.L.D., not
Nick Fury, the other guy that was in the TV show.

Speaker 2 (02:05:41):
Oh yeah, the guy who played uh.

Speaker 1 (02:05:47):
Agent Coulson. I kept wanting to say Nielsen. I'm like, no,
that's Warehouse 13. But yeah, well, he was in that.
So that's what... when you yelled Hail Hydra, you
started my brain down a rabbit hole. See, it's all
your fault.