
October 11, 2025 20 mins
Pod Crashing episode 396 with Dexter Thomas Jr., host of the podcast Kill Switch. Were we sleeping when everything changed? It seems the technologically driven future is already here. On Kill Switch, we explain the right NOW of our supercharged technological lives. We ask ourselves: who is in control? Can we beat the computer, or are we going to need to throw the kill switch? New host Dexter Thomas Jr. answers questions big and small, from who made Shrimp Jesus and why, to how and why you very well could get arrested by a computer. And we'll be bringing the DIY back to tech: How to Now on everything from how to run your own LLM to tips to mitigate your exposure online. EPISODES AVAILABLE HERE: https://www.iheart.com/podcast/105-kill-switch-30880104/
Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Hey, welcome back to the conversation. Let's do some pod crashing.
Episode number three ninety six is with Dexter Thomas Junior.
He's the host of the podcast Kill Switch.

Speaker 2 (00:10):
I am doing very well. How you doing?

Speaker 1 (00:12):
Absolutely fantastic. And I'm gonna start it off, and I
hope I'm not being freaky weird. But first of all,
I don't trust AI. But this thing that happened
over the weekend in Iran, do you think AI technology
had anything to do with it? Woo?

Speaker 2 (00:25):
I mean, do I think AI technology had anything to
do with it? On what level?

Speaker 1 (00:30):
To make it so precise?

Speaker 2 (00:35):
You know what I can say with more confidence, okay,
is that AI is absolutely being used to sway some decisions.
You know, people are able to generate videos that look
like they are actually of the scene people are in,
and just spam those all over the internet. So I'm
very confident saying that, regardless of, you know, what is

(00:57):
actually happening, AI is absolutely going to be changing our
perception of what happens. And you see that all over
the news, all over the news.

Speaker 1 (01:07):
Well, I use it for show prep. And that's the
thing about it. I'm not stealing their words. I just,
when I have a question, instead of going to eighteen
billion pages on Google, I'll just go to ChatGPT,
and all of a sudden, it's all in one area.

Speaker 2 (01:20):
Mm hmm. Yeah, you're playing with fire, my friend. Oh no,
you're playing with fire. Yeah.

Speaker 1 (01:29):
Because with ChatGPT, I don't know if it's clickbait.
I don't know if I trust it. Yeah.

Speaker 2 (01:36):
Well, the really interesting thing about ChatGPT, or any
of these other large language models, is, from the standpoint
of somebody who's using it, they can be confidently wrong.
I mean, just so confidently wrong.

Speaker 1 (01:48):
You know.

Speaker 2 (01:48):
I'll give you an example. I was thinking, you know,
on Kill Switch, you know, I know it sounds like
we're telling everybody don't use AI, don't use a computer,
turn everything off. That's not what we're doing, right?
We'd never advocate for that. But you know, I was
thinking about getting a new camera, and I, you know,
just asked, hey, give me the pros
and cons of each of these models, and

(02:10):
it was telling me this camera has this really important
feature that I wanted, and I was thinking about getting it.
Come to find out, when I go to look it up,
it doesn't have that extremely important feature. It doesn't have
it at all. And I almost spent a couple thousand
dollars on something that doesn't have it. So yeah,
you always have to check it, and that's it. That's

(02:31):
on a good day, that you can get away with
just checking something. And there are people, think
of politicians even, and numerous politicians have said this, oh yeah,
I just use ChatGPT and it gives me the answers.
My friend, it does not give you the answers.
That's not how this stuff works.
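His camera anecdote illustrates the general habit he's recommending: treat a model's claims as unverified until checked against an authoritative source. Here is a minimal, purely illustrative Python sketch of that habit. The camera names, feature names, and spec table are all hypothetical stand-ins for a real manufacturer's spec sheet.

```python
# Hypothetical spec sheet, standing in for the manufacturer's official
# page (the authoritative source). All names here are made up.
OFFICIAL_SPECS = {
    "AlphaCam X1": {"ibis", "weather_sealing", "dual_card_slots"},
    "AlphaCam X2": {"ibis", "weather_sealing"},
}

def verify_claims(model, claimed_features):
    """Split an LLM's claimed features into (confirmed, unconfirmed)."""
    official = OFFICIAL_SPECS.get(model)
    if official is None:
        # No authoritative data at all: nothing can be confirmed.
        return set(), set(claimed_features)
    claimed = set(claimed_features)
    return claimed & official, claimed - official

# The chatbot confidently claims the X2 has dual card slots. It doesn't.
confirmed, unconfirmed = verify_claims("AlphaCam X2", ["ibis", "dual_card_slots"])
print(sorted(confirmed))    # ['ibis']
print(sorted(unconfirmed))  # ['dual_card_slots']
```

Anything landing in the unconfirmed set, like the missing feature that nearly cost him a couple thousand dollars, is exactly the claim you go check by hand.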

Speaker 1 (02:50):
You're giving us a full vision of what we're missing.
That's what I love about Kill Switch, is that you're saying,
by the way, I'm gonna tap you on the shoulder
every time where I think you're gonna need
an education here, and you're just so open and transparent, dude.

Speaker 2 (03:04):
Well, you know, the thing is, there are things
that we don't know. And when I say we, I
mean there are things that, you know, society doesn't know,
that, you know, non-technical people don't know.
But there are things, quite frankly, that even the people
who make ChatGPT don't fully understand, why ChatGPT
does some things, and if you sit them down and
you ask them why it is doing this, they don't know.

(03:28):
That is kind of scary, and we should all be
very open about that. That, hey, on the technology end,
there are some things we don't understand, but there are also
some open questions. You know, would we like for AI
to be helping police arrest people? That's not necessarily

(03:50):
a yes or no question for everybody, but we should
be aware that it's happening, which a lot of people
aren't aware of. It's not science fiction; this
is happening right now. And okay, how should this be used?
Should AI be used in quote unquote solving crime?
The general public is not being allowed into

(04:11):
those conversations, and at Kill Switch we say, nope, we
need to be involved in those. We should know what's happening.
We have the right to know.

Speaker 1 (04:19):
How do you feel, though, when people say you've just
got to live with AI? Sometimes I feel like
we're being dumbed down. It's like, because I love to
do research, I love to make sure that my spelling
is correct, but it's like, why am I having a
problem with spelling? Because I'm being dumbed down?

Speaker 2 (04:36):
You know, it's funny you mention that. An MIT study
just came out, I think last week, that said that
relying on AI can potentially reduce your capacity for critical thinking.
Oh boy. I mean, think about it. It's
like doing push-ups. If you do a couple of push-ups

(04:57):
a day or something like that, and then
for a year you don't lift anything
at all, and then you need
to do a push-up, you'll have a really
hard time doing push-ups. Same thing. Your brain is
a muscle, you know, if I can speak metaphorically. If
you're used to making decisions about what kind of camera
should I buy, should I let my kid eat, you know,

(05:18):
these fruit snacks or eat this cake or eat this apple,
should I say this to my boss or not say
this to my boss, and you leave all that to
the computer, then one day when you're not by a
computer or something else and you need to make a decision,
you're going to have a harder time making that decision.
So these are societal things that we don't necessarily really

(05:41):
think about, where, you know, this isn't,
you know, is a computer going to go full Skynet
and destroy all of us? Okay, you know,
let's leave that for a little bit in the future.
This stuff is having effects on us right now.

Speaker 1 (05:55):
Two major stories came out this weekend with AI technology
firmly attached to them. One, the book that copied the
other author, and all of a sudden people are going,
what is going on here? Because now it's got BookTok
and TikTok in trouble. And number two, the Tesla
taxi that's now, like, all over the dang news. That's AI.

Speaker 2 (06:14):
Mm hmm.

Speaker 1 (06:16):
Yeah.

Speaker 2 (06:16):
And again, this is one of those things
where I think we're being told, oh yes, we're
being told that, listen, it's here, it's always going to
be here, deal with it. And that's not actually the case.
That's not actually the case.

(06:37):
we still have the ability to make decisions about how
much involvement we want the AI to have in our lives.
You know, maybe a robotaxi is a great idea, or
maybe better wages for drivers is a great idea, or
maybe more train lines is a great idea. I don't know,

(06:59):
you tell me. No, I just gave you three options.
Maybe there's ten options. But the idea and maybe those
three options that I gave you all sound great or
all sound bad to you. But the idea here is
that these are all decisions that we can make. And
what's happening is, let me be a little flippant here. There's this,
as a society, we've gotten used to this idea that

(07:20):
there's a small group of nerds in Silicon Valley. And
you know, I'm not being mean here, me being a
nerd myself, you know, growing up a nerd, you know
what I mean. I bought my first computer as soon
as I could, and I was programming, man, you know
all this other stuff. Man, that's me. But we've gotten
used to this idea that there's a small group of
nerds in Silicon Valley who handle all the computer stuff.
You do the computer stuff. I'll just look at what

(07:41):
you do on my phone, or when you release something,
I'll use it. Or that some politicians get to make
choices about what we do with our lives with respect
to technology, and that's not the case at all. We
get to make these decisions, we the people who use
these things, or who are used for our data. We
make these decisions too, but we need to be informed

(08:03):
about what's happening.

Speaker 1 (08:06):
I'm a face to face person. If you can't get
me face to face time, we're not doing it. And
you know, because those things, when I get on the
phone and all of a sudden, I'm talking to an
AI voice, I can't do it.

Speaker 2 (08:16):
Oh yeah. Oh, and you know, we're kind of
being sold that. No, no, you'll get used to it.
You'll get used to it. And that's the thing.
People, humans, are an amazingly adaptive species. We
will get used to almost anything. Have you been in
a Waymo? No? Oh my gosh. So you
know what I'm talking about, right? The same thing, robotaxis. Right,

(08:36):
I've been in one. You would be shocked. So, I
didn't want to get in, you know, full disclosure. So,
you know, I was in San Francisco and some friends
wanted to get in, and you know, I had to
get in with them. We're going to some bar or
something like that. And okay, I get in, and
in about two minutes, it went from this
is really weird to I forgot that there wasn't somebody driving.

(08:57):
Oh my god. Humans are such an adaptive species that something
that seemed like magic five minutes ago will just seem
normal to us. That being said, the things that are
being sold to us, the things that are being pushed
on us by, you know, frankly, again, when I say
pushed on us, this is not a government
that is doing this, this is not an organization of

(09:19):
like-minded people. This is, you know, these are companies that
stand to make some money. Is this something we want?
And if it is something we want, how should it
be used? It doesn't necessarily have to be used in
the way that Meta wants us to use it, or
Google wants us to use it, or the makers of
ChatGPT, OpenAI, want us to use it.
Perhaps there are other ways to use these possibilities, these technologies,

(09:39):
but we should be able to make those decisions.

Speaker 1 (09:41):
Scary stuff here, scary stuff. We'll be back with Dexter
Thomas Junior. Coming up next. The name of the podcast
is kill Switch. We are back with Dexter Thomas Junior.
I got to tell you how freaky this day has
been, in the way that not even thirty minutes ago
I was with George Takei, who's talking about everything
Star Trek, and all of a sudden I'm having

(10:02):
this conversation with you, and I'm going, holy crap, Star
Trek could actually be real.

Speaker 2 (10:06):
You know.

Speaker 1 (10:06):
It's like, you know, it's so bizarre. It's
like on Star Trek it was always about the future,
and you're talking about, that's where we're going, dude.

Speaker 2 (10:16):
Yeah, well, you know, I mean, think about it.
When George Takei was on, I mean, shout out
to him, man, amazing you get to talk
to him, you know, the future sounded like a
great place. Yeah, I mean, think about that. Think about The Jetsons.
I mean, The Jetsons was kind of a parody, but we
got everything except for the flying cars now, but

(10:37):
it sounded great. The future sounded really cool and it
was optimistic. Now, ask almost anybody about the future and
they're scared out of their mind.

Speaker 1 (10:44):
Yep, yep. Yeah.

Speaker 2 (10:45):
And I can tell you why. It's not because of AI specifically,
it's not because of computers or whatever specifically. It's because
we feel like we don't have control, and largely it
has been taken away from us, because we've allowed, I'll
say it again, a small group of nerds in Silicon
Valley to make some decisions. And it doesn't have to

(11:06):
be like that.

Speaker 1 (11:07):
I did some research on uncertainty, and the reason why
many of us are uncertain is because we're
trying to get back to a past that we would
like to rewrite. And so when you say that we're
afraid of the future, of course we are, because we
don't know what's up there. We would rather live
in the past that we already know.

Speaker 2 (11:23):
Mm hmm. Again, I think that makes a lot of sense.
And I think there was also a time,
you know, when we were, you know, think of the nineties,
think of how we were so into futuristic stuff.
You know, there was the Y2K thing,
and we were a little scared about that. But you know,
think about how music videos looked back then. Everything's
silver and shiny. We're going to be in space and

(11:44):
you know, we're gonna be able to talk to each
other using computers. We do that now and we all
kind of resent it, you know. But
I think you're totally correct. Yeah.

Speaker 1 (11:56):
One of the scariest things with AI recently happened a
month ago. I was at this big retirement party,
and they played a song that sounded so amazingly perfect,
in the way that it was very Frank Sinatra. And
I went up to the person and said,
who is that actually singing? Nobody. It's a computer. It
was AI generated. And it scared the bejeebies out of

(12:18):
me, because I was so convinced that that was
a great piece of music.

Speaker 2 (12:22):
Mm hmm, yeah, you know. And again, this is somewhere
where I'm really encouraging us to think about what role
we want technology to have in our lives. For some people,
having fake AI music is totally fine. They're totally cool

(12:43):
with that. For other people, it's almost blasphemy, right?
That freaks me out. That freaks me out. As somebody who
loves electronic music, I love techno, I kind of still
want my techno made by a person. Yeah, you know
what I mean. I kind of still, you know, I
want those decisions to be made by a person. Well, I wonder
why they did this. Why did they use that weird little,
you know, squeaky sound in there, that real buzzy sound?

(13:06):
That's really interesting. There's something human about it. But
I think if we look back at today, ten years
in the future, fifty years in the future, we will
think of this period right now, this period
that we're living through, as a turning point. And what
decisions did we make? And, you know, not to throw

(13:27):
the name of the podcast out here, but are there
some things that we need to throw the kill switch on?
And kill switch, you know, being a technical term, but
it sounds like exactly what it is. Do we need
to shut the system down for just a second, rethink
this, and maybe do something else? Again,
I'm not telling people don't use AI. And as a
matter of fact, I've interviewed plenty of people about AI

(13:48):
and none of them have said AI is universally bad,
don't ever use it. Nobody's saying that. Just maybe we
need to pause, maybe we need to hit, you know,
reset and think about this.

Speaker 1 (13:58):
Would you say that we're living in a Beatles moment
when it comes to AI?

Speaker 2 (14:03):
Ooh, a Beatles moment?

Speaker 1 (14:04):
In what sense? Meaning our lives are going to change, just
like they changed when the Beatles came out. I mean,
everything changed.

Speaker 2 (14:12):
Yeah. Oh, I mean, you know, this is one of
those things where I think Beatles, plus Michael Jackson, plus
James Brown, plus, I mean, you know, Beethoven,
all of it, right? And not necessarily because
it's going to advance our ability to solve certain diseases,

(14:35):
which is possible, very possible, actually. Not just because it
will take jobs, which is also possible, but
also because it's kind of starting to warp our perception
of reality, of any news event. I mean, I'll just put
it like this. I'm in Los Angeles, as
you know. There have been a lot of protests here. Yep.

(14:57):
And I've been out there with the camera
as a journalist documenting it, and I have seen, and
I've posted videos, hey, here's what I saw, here's what
I saw, and look at this. But I've also seen
people posting videos that look very realistic. They're also much
more dramatic than my videos, you know, showing police and

(15:18):
protesters beating each other up, fires in the background, really
dramatic stuff that didn't happen, AI generated. But it's more exciting.
And there are a lot of people who believe that
or want to believe it, and those images are the
ones that travel further. Is this okay? I don't know.
You tell me.

Speaker 1 (15:40):
Don't you think that you're lifting the game of journalism
in what you're doing today? Because, I mean, there's so
much Edward R. Murrow in it. I would love to see
what his reaction would be if he were here today,
to find out what people like yourself are doing.
You're adding to the soul of a story.

Speaker 2 (15:56):
Well, you know, I mean, again, the real big thing here,
and the thing that frustrates me about tech reporting and
technology reporting in general, is, again, we've allowed ourselves to
feel like we're not allowed to participate. And I'm just
trying to bring the audience, you know, bring people back

(16:18):
into it. You understand what I mean? I mean, I'll
put it like this. My grandma just turned ninety yesterday.
I called her on her birthday, and if I make an
episode that she doesn't understand, she will call me and
yell at me. And my grandma understands everything, you know.
Let me give my grandma some credit, like she's

(16:38):
very sharp, super smart. But I fully expect anybody to
be able to understand what we make because I think
that's a duty that we have. I think, you know,
it's my duty as a host of the show, but
I think also it's a duty as people. And I
just want to make sure that we are all involved,
we all understand. So just in the way that you

(16:58):
read about the economy, and if you read an article
about the economy and you think, ah, there's something wrong here,
I don't understand it, part of that is, you know,
on the reporting. You should
be able to understand it. You should be able to
understand the article about politics. We shouldn't allow people to
tell us, you don't understand technology, leave it alone. No,
we all have the right to understand. And that

(17:21):
same approach that's taken in every other angle of journalism,
every other place. Technology is no exception. We all have
the right to understand how this affects our lives.

Speaker 1 (17:31):
So when you're out there getting the story, how do
you put it in line? Because, I mean, the one
thing is that a brilliant idea can come to you today,
and then you're going, oh my god, I still have
seven or eight episodes before this one. You can
relate with me, can't you? That, yes, I mean, I could
bust the door open right now, but it has to wait. Mm hmm.

Speaker 2 (17:53):
You know, I mean, at least for Kill Switch, right,
we actually focus somewhat on being timely, but
not as much as some other places. My hope is
for people to be able to come back to an
episode in a year, even, and it still feels relevant.

Speaker 1 (18:16):
Yeah, it does.

Speaker 2 (18:18):
For example, I mean, the most recent one. Again, speaking
of the protests, a lot of people were not aware
that there were Predator drones flying over Los Angeles. And
Predator drones, for your audience who's heard of
these before, those are the
same things that are used for military strikes in the

(18:38):
Middle East, the same technology. Now, granted, we have
no reason to believe that these things were actually armed
or carrying missiles, very probably unarmed, but the same technology
was flying over and watching the protests. But the government agencies,
you know, Border Patrol, Homeland Security, were not telling anybody

(19:01):
that they were flying these things. Some independent researchers had
to figure this out by looking at flight data:
they were flying this strange hexagonal pattern over two areas of
Los Angeles where the protests were. Now, maybe you think
it's a great idea for these things to be monitored,
or maybe you feel a little bit weird that military
hardware is being used to watch again, everybody. This includes protesters,

(19:25):
This includes business owners, This includes somebody who says, you
know what, I don't believe the news. I want to
go and just watch for myself. I met lots of
those people. Everybody was being watched. Are you comfortable with that?
For some people this is great. For other people, they
feel a little odd. We should know, though. Yeah, this
is going to be a
story in six months, in a year. And so I

(19:48):
try to make something where Okay, what can I make
that is still going to feel relevant in a little while.
That adds to the conversation.
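The detective work he describes ran on public flight-tracking data: aircraft broadcast their positions, anyone can log them, and a plane that keeps circling one neighborhood stands out from one that's just passing through. As a purely illustrative sketch (not the researchers' actual method), here is one simple heuristic in Python: flag a track as loitering if its heading keeps turning while the aircraft ends up near where it started. The thresholds and the flat-earth distance math are rough assumptions for demonstration.

```python
import math

def is_loitering(track, max_drift_km=5.0, min_turn_deg=720.0):
    """Heuristic loiter check on a sequence of (lat, lon) fixes.

    A transiting aircraft holds roughly one heading; a surveillance
    orbit accumulates heading change while staying near its start.
    Thresholds are illustrative guesses, not standard values.
    """
    if len(track) < 3:
        return False
    total_turn = 0.0
    prev_heading = None
    for (lat1, lon1), (lat2, lon2) in zip(track, track[1:]):
        # Rough segment heading; fine for short hops at city scale.
        heading = math.degrees(math.atan2(lon2 - lon1, lat2 - lat1))
        if prev_heading is not None:
            delta = (heading - prev_heading + 180) % 360 - 180
            total_turn += abs(delta)
        prev_heading = heading
    # Start-to-end displacement (1 degree of latitude is about 111 km).
    dlat = track[-1][0] - track[0][0]
    dlon = (track[-1][1] - track[0][1]) * math.cos(math.radians(track[0][0]))
    drift_km = 111.0 * math.hypot(dlat, dlon)
    return total_turn >= min_turn_deg and drift_km <= max_drift_km

# Synthetic data: a multi-loop orbit over one spot vs. a straight transit.
orbit = [(34.05 + 0.02 * math.cos(i * math.pi / 6),
          -118.25 + 0.02 * math.sin(i * math.pi / 6)) for i in range(30)]
transit = [(34.0 + 0.01 * i, -118.25) for i in range(30)]
print(is_loitering(orbit))    # True
print(is_loitering(transit))  # False
```

Real trackers work from timestamped ADS-B or radar returns and use proper geodesic math, but the idea is the same: the pattern itself, not the aircraft type, is what gave the flights away.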

Speaker 1 (19:55):
You've got to come back to this show anytime in
the future. I love where your heart is, dude. Oh
my gosh.

Speaker 2 (20:00):
I would love to. I can talk about this stuff all day,
because, again, it sounds like technology, but
really, at the heart of it, it's about people,
and it's about our decisions and our right to make
these decisions.

Speaker 1 (20:11):
I'd love to. Well, you've been brilliant today.

Speaker 2 (20:12):
Okay, hey, you as well. Thank you so much.