
August 1, 2025 41 mins
S7E4 – In this episode, we get a follow-up to the film “Bandersnatch,” where the main character (Cameron) is introduced to the game programmer Colin. Colin has made a “game” unlike any other, one where AI beings live inside a digital world. What could possibly go wrong?

Email, Website & Other Links
hoho@blackmirrorpodcast.com
https://blackmirrorpodcast.com
https://link.thehohoshow.com/Rumble/BMP

Become a supporter of this podcast: https://www.spreaker.com/podcast/black-mirror-podcast--3096529/support.

Thank You for Listening

For all things "Black Mirror Podcast" related:
StinkPikle.com / Merch Store

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:11):
Welcome to the podcast. Join host Ho Ho as he discusses
the riveting Netflix original series Black Mirror.

Speaker 2 (00:19):
Now, here he is.

Speaker 1 (00:21):
Ho Ho here, broadcasting from Weaponar Saft Productions Studio B. Welcome
to the Black Mirror Podcast, and as always, I'm your host,
Ho Ho.

Speaker 2 (00:33):
So, how y'all doing? I hope you're doing good.

Speaker 1 (00:35):
I really do, because today we are continuing in season
seven with episode four titled Plaything. Oh yes, And I'll
tell you what this episode was interesting.

Speaker 2 (00:48):
I mean, wow, this is this was a really good episode.

Speaker 1 (00:52):
It really was. Most of the time, whenever I
sit down for these, you know, I sit down with,
you know, my soda, a cup of coffee, something like that. I
had a little notebook and a pen and I'm ready
to take some notes, right? This time,

Speaker 2 (01:07):
No, it just didn't end up.

Speaker 1 (01:10):
Happening, because as soon as I sat down and this
puppy started playing, I was immediately just, just in
this episode.

Speaker 2 (01:24):
Totally in it.

Speaker 1 (01:25):
I mean, I didn't know what to expect going into it,
and just the whole thing, it just had me
in it from, like, the first moment, right? First
moment, I was locked in, this puppy. I didn't end
up taking any notes, didn't really need to. I mean, overall,
it was a pretty simple episode. And before we get

(01:46):
into any of that, because we do have some discussion
to do, and of course we are going to be
talking about the episode, so spoiler alert. If you have
not yet checked out the episode Plaything, head on
over to Netflix check out that Black Mirror episode and
then come on back and we'll discuss it together.

Speaker 2 (02:06):
Let's get into it.

Speaker 1 (02:07):
So this puppy started off, you know, a guy walking
through what kind of looked like a homeless camp, if
I'm not mistaken. That's kind of what it reminded me
of, at any rate. And the guy goes to a
convenience store, takes out what I can only assume was
a bottle of alcohol, but I'm not sure. I didn't
really get that good of a look at it. Heads

(02:28):
to the door, and then the guy is locked in.
He drops the bottle of alcohol, breaks it, and then
is just sitting down waiting for the cops to show up.

Speaker 2 (02:40):
And they do.

Speaker 1 (02:43):
You know, cops show up, and, you know, this is what
I found kind of interesting, right? I mean, I really did.
I mean, it was immediate, like, you know, they
checked his identity using, you know, facial recognition software
that was part of their, you know, the body
cam that they had. They took a mouth swab to

(03:07):
check his DNA, and immediately it brought up a
red flag, like, hey, this matches, you know, evidence that
we have from a dead body that we found a
long time ago, right, I mean, that's what they found
out immediately.

Speaker 2 (03:21):
Now this is where I was like, huh, I was
kind of interested. I really was at this point. I mean,
this is like the very first what five.

Speaker 1 (03:30):
Minutes of the episode, and already I'm like, huh, that's
interesting because I mean, how close are.

Speaker 2 (03:38):
We to that kind of tech?

Speaker 1 (03:42):
Because we already know that they use facial recognition software.
We already know that, you know, especially if you go
to an airport, that's what they use. They're using facial
recognition software to find out exactly who you are. You know,
they try to check and track how much money.

Speaker 2 (04:00):
That you have on you.

Speaker 1 (04:01):
You know, they're I mean like this, that's just what
they do. And especially whenever you're in a bigger city.
You know, they got so many cameras out there tracking everybody.
I mean, that's just the way technology is used.

Speaker 2 (04:15):
Anymore.

Speaker 1 (04:15):
So, I mean, this wouldn't even surprise me if this
is something that they are able to do today, and
if they're not, I mean, how long away do you
think it'll be, or how much longer until you think
they're actually using this type of technology right there on
an officer's body cam. And I mean, in a manner

(04:37):
of speaking, it kind of seems like a no brainer, right,
I mean, why wouldn't they be using something like this,
you know, regardless of you know, how you feel about privacy,
which, I mean, granted, I'm an advocate for privacy,
absolutely. Private property,

Speaker 2 (04:53):
You should have the right to not have that kind
of thing on you.

Speaker 1 (04:58):
But at the same time, I mean I understand, you know,
protecting cops, I get it, you know, I really do.

Speaker 2 (05:03):
But if you are in a town square, I mean, you really.

Speaker 1 (05:05):
Don't have that expectation of privacy, right, I Mean, that's
just kind of how things work, whether you believe in
it or whether you don't, regardless of how you feel
about it, That's just kind of how things are in
most places, you know, in a lot of places, that's

(05:26):
kind of just how it is. That's how it is,
you know, here in the States at any rate, you know,
overseas I don't know, and then having the you know,
being able to take a mouth swab and check somebody's
DNA immediately, like on the scene to find out who
the person is. I mean, we're not even talking about,
you know, fingerprint scanner. And this goes beyond facial recognition.

(05:50):
I mean, this is checking somebody's DNA right there on
the spot to see if it matches anything in the system.
How far away do you think we are to something
like that, I mean, at this particular point in time,
at least again in the States, you know, they have
to have a you know, you got to be arrested
in order to do something like that. Now, granted, this

(06:12):
was a clear-cut case, the guy was guilty. He
was caught red handed at the scene of the crime
with an eyewitness right there, so you know that they
had probable cause. Don't get me wrong, I don't view
this as a violation of somebody's rights. I mean, he's
caught red handed, right, So, but how long do you
think it'll be before they're able to do something like

(06:36):
that a quick DNA swab in order to like virtually
immediately see if you know, there's any other crime that
they know of that this person is tied to. That
just is mind-blowing to me, in a manner of

(06:56):
speaking, because, like, now, you know, you've got
to take these tests and you've got to send them
in, and it takes time for the process to
get done.
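The instant-match idea described above, a swab reduced to a profile and checked against stored case evidence on the spot, boils down to a database lookup. Here is a purely illustrative toy sketch: the locus name and case labels below are made up, and real forensic matching compares many STR loci with statistical weighting rather than a single exact key.

```python
# Toy sketch of on-scene DNA matching as a lookup step.
# All profile keys and case labels here are hypothetical examples.

# Hypothetical evidence database: profile key -> case it was collected from
evidence_db = {
    ("D3S1358", 15, 17): "cold case 1994-113",
    ("D3S1358", 12, 14): "burglary 2001-007",
}

def check_profile(profile):
    """Return the matching case for a profile, or None if it's unknown."""
    return evidence_db.get(profile)

# A hit comes back immediately, the way the episode's red flag does
assert check_profile(("D3S1358", 15, 17)) == "cold case 1994-113"
# An unknown profile simply finds nothing
assert check_profile(("D3S1358", 9, 9)) is None
```

The speed in the episode is the easy part; a hash lookup is effectively instant. The hard, slow part in reality is sequencing the swab into a profile at all.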

Speaker 2 (07:05):
So I mean this right.

Speaker 1 (07:06):
Here, having that kind of immediate access was amazing.

Speaker 2 (07:12):
I mean, it just was. It was cool. I thought
it was neat, you know.

Speaker 1 (07:16):
And so they arrest him, they take him back and
they're like, okay, tell me who the guy was. And
the guy goes into a story. I believe his name
was Cameron. Yeah, Cameron was the guy's name, the main character,
and he's telling the story of how this whole thing

(07:41):
came to pass with him being a column writer at
a company.

Speaker 2 (07:46):
Called Oh what is it? What is it? What is it? Oh?

Speaker 1 (07:50):
Come on, PC Zone. Yeah, PC Zone is what he's a column
writer for. And what he does is he reviews video games, right?
I mean, he plays video games, reviews them, writes the
review for the magazine, you know, is it good, is it bad,
how good do you consider this game?
And what I thought was neat was his character was

(08:12):
playing Doom, right? I mean, okay, ladies and gents,
all right, I'm old, right? I'm mid-forties. Okay,
I am older than Apple. When I was in elementary school.
That was whenever, you know, Windows three point one was
becoming a thing. Computers, you know, were still using MS-DOS,

(08:35):
which stands for Microsoft Disk Operating System, not just
DOS Shell. And anymore you can't even get DOS Shell
on your system. It's only, you know, Windows, Linux, or,
well, whatever Apple has, macOS, I
guess, I think that's what it is. But anyway, I'm
not a macOS kind of guy. I do PC

(08:56):
or I do Linux. That's what I use. But back
in the day, you know, it was MS DOS and
then Windows was kind of built on MS DOS, So
I mean, I remember those type of games.

Speaker 2 (09:10):
I remember whenever.

Speaker 1 (09:12):
Doom was, you know, on a three-and-a-half-inch
floppy, and it was a free game that you could
get. Shareware is what they called it.

Speaker 2 (09:21):
Okay, I'm yes, I'm old. I'm that old.

Speaker 1 (09:25):
Like back in the day, whenever, you know, Oregon Trail
was on either one or two, like, five-and-a-quarter-inch
floppy discs. I'm like the hardcore old-time
thing, you know, not the three-and-a-half-inch kind with the hard
shell, but the actual floppy disc. Yeah,

(09:46):
that's that's I played that like old school original, right.
And he's telling his story of how he got handpicked
by Colin Ritman to review a game that Colin Ritman

(10:07):
had made, had produced, coded, the whole nine yards.
And I'm hoping you remember the name Colin Ritman, because,
you know, assuming you've watched the episode, which again,
if you haven't, go back watch it, come on back.
This was the developer of the game Bandersnatch. Now, dude,

(10:28):
I loved Bandersnatch. That was an awesome episode, it really was.
The thing was amazing. Well, episode, I guess it was
a mini movie.

Speaker 2 (10:37):
You know.

Speaker 1 (10:38):
It was fun, it was entertaining, you know, and one
of the things that it really highlighted was more or
less how society is today, right.

Speaker 2 (10:48):
With having the you know.

Speaker 1 (10:50):
Not actual choice, not actual freedom, but the illusion of freedom.

Speaker 2 (10:57):
That's more or less how society is built today.

Speaker 1 (11:01):
And Plaything isn't exactly, I mean, like, it's not exactly
a sequel. It just operates within the same, you know,
it takes place after what happened in Bandersnatch.
It's a completely different episode, but kind of a throwback
at the same time, right? And he's talking about this game,

(11:24):
which was very reminiscent of, like, SimCity or The
Sims, you know, from back in the day,
as in it was a little world.

Speaker 2 (11:35):
And how.

Speaker 1 (11:37):
Ritman described it was that it was this world where these.

Speaker 2 (11:42):
AI beings hatched and multiplied.

Speaker 1 (11:51):
And they kind of lived in this little world where
they existed within the program. And what was amazing about
this was that they were already doing things beyond what
they were programmed to do.

Speaker 2 (12:06):
And communication was that thing.

Speaker 1 (12:12):
And to be perfectly honest with you, what what I
found amazing in that was that communication. I mean, like
real communication like that as in verbiage. That's the biggest
thing that separates us from any other animal out there,

(12:34):
you know. I mean, it's not just about being able
to really communicate, and it's not even you know, a
consciousness that you know provides reasoning, but it is a
very complicated language, and that's really the thing that separates
us from the rest.

Speaker 2 (12:52):
Of life, right.

Speaker 1 (12:54):
I mean, I know I'm getting deep into psychology here,
but, I mean, that's just part of it.

Speaker 2 (13:01):
And to have these AI.

Speaker 1 (13:03):
Creatures that are attempting to communicate, but they're using their
own language, I gotta, you know, okay. So whenever I
first kind of saw these guys and they're talking in
their own little language, I'm kind of thinking of, like,
a digital version of Furbies, right? I mean, do you
guys remember Furbies? I mean, okay, I'm going back

(13:25):
like, what, twenty, thirty years maybe, I don't remember when
Furbies came out.

Speaker 2 (13:29):
But there's these, you know, these little cute, well, I
can't really call them cute. I mean, they were
creepy. I mean, they really were. They were kind
of creepy looking. Look up Furbies.

Speaker 1 (13:40):
Okay, they were kind of creepy looking. And I remember
back in the day I was in you.

Speaker 2 (13:45):
Know, my early twenties.

Speaker 1 (13:47):
Okay, so we're talking like a little over or
about twenty years ago, whenever these things came out, late
nineties, early two thousands. Okay, Furbies, they were the hit item.

Speaker 2 (14:01):
And what was what was interesting?

Speaker 1 (14:03):
I mean, you know, these little, you know, robot things
that had speakers on and they kind of moved their
eyeballs and whatnot. But what was interesting about them is
they, you know, they talked in their own language. And
one of the creepy aspects of it was, if you
put two Furbies right next to each other, they'll talk
to each other. You don't know what the hell they're saying,

(14:27):
but they'll actually talk to each other because they have their
own language.

Speaker 2 (14:33):
Creepy. I am so.

Speaker 1 (14:36):
Glad that, you know, these little toys, which, you know,
kind of remind me of, like, Gremlins, I guess, and
I'll kind of bring that up a little bit later,
but these little toys, I am so glad that
these things came out before AI. I mean, holy cow,
can you imagine Furbies, like an actual toy that you have,

(14:58):
you know, like you I don't know, like eight inches tall.
I mean, they don't like move around or anything. They
don't rotate, they don't travel anywhere. It's not like an
actual moving type of a robot. It just kind of
sits there. It's a little buddy. And I mean I
don't remember if you can like pet it and it
had little things on it that, you know, let them

(15:18):
know that they're being petted or anything like that.

Speaker 2 (15:20):
I don't remember that, but I just remember that.

Speaker 1 (15:22):
You could actually talk with them, and they could
talk to each other in their own language. And holy crap,
I am so glad that these puppies came out way
before AI. I mean, if these things, like, dude, you
want to talk about, whew, I'll get into that.

Speaker 2 (15:40):
I'll get into that. So he goes and he's.

Speaker 1 (15:49):
Cameron goes to Colin Ritman and he's having this thing
explained to him, and Ritman is saying that, hey, this
isn't a game. This is you know, an AI computer
life simulation.

Speaker 2 (16:07):
It's not a game. It's a simulation with.

Speaker 1 (16:14):
Artificial intelligent beings in a digital space.

Speaker 2 (16:22):
Mind blown, right, mind blown.

Speaker 1 (16:24):
I was like, okay. And Cameron, he takes.

Speaker 2 (16:31):
The disc that had the program on it, stole.

Speaker 1 (16:36):
It, goes home, puts it on his own computer, and immediately.

Speaker 2 (16:43):
Gets the simulation going.

Speaker 1 (16:47):
Now, this of course is in the flashback, and
they you know, they get the guy's address, they get
the guy's keys, and they go to his home and
you know, one of the things that Cameron says is, hey,
don't touch anything, right, and whenever they get into the
guy's house, now, granted, it's kind of destroyed, you know,

(17:09):
it definitely is the home of somebody who is definitely
obsessed with something. And then they go into his bedroom
and they see this computer that is definitely home built,
all kinds of components that has been built out over time,
and it's just kind of like, you know, beautiful mind

(17:32):
type of thing going on, right, And I see the towel,
and I know now that it was, you know, covering
the monitor with the webcam, but at the time I
see that, and I'm hearing the noises come from it,
and I'm kind of thinking, you know, more or less,
I'm thinking, Furby. There's a, that's a cage under that towel.

(17:55):
There's a Furby in there. Somehow, these little Thronglets,
like, you know, they created for themselves, or Cameron
created for them, this little body that they could go
into and put their consciousness in. I was like, dude, do
not take the towel off the cage. You don't want

(18:17):
to do that. I mean, that's what I'm thinking, right?
I mean, I don't know about you, but that's what
I'm thinking. I'm thinking, there's a Furby in there that's
got AI intelligence, a Thronglet's mind inside of a Furby. This
is not going to be good. You know, I remember
the episode, what was it, Metalhead? Yeah, that wasn't
a good thing.

Speaker 2 (18:34):
That was bad. We don't want to do that.

Speaker 1 (18:36):
Do not open, don't take the towel off the cage.
Thankfully that's not what it was. Thankfully that's not what
it was. But he tells the story of, you know,
whenever he went to work to write the review, how
his buddy, his little drug dealer, you know, his little

(18:56):
drug dealer friend named Lump. Now, this I also kind
of thought was funny. Okay, because now again I'm kind
of old, and in my generation, there are a lot
of people that I know, friends of mine throughout the years.

Speaker 2 (19:12):
Now, this is one of the things that there's.

Speaker 1 (19:14):
Not necessarily a whole heck of a lot of women
out there that can relate to this. I think this
is, I mean, not strictly a guy thing, but this
is something that more or less guys understand. You know,
my nickname, since I was in the fourth grade, nine
years old, has been Ho Ho. I've had friends that I've
known for years that never knew what my name was

(19:37):
first name or last just nickname. And there's been a
lot of times where the only name that I knew
of the person was either a nickname or a last name,
because that's just kind of how you know, guys of
my generation, that's.

Speaker 2 (19:53):
How we operated. I may not have known the guy's
first name. I may have only known.

Speaker 1 (19:59):
A nickname and at best a last name. I mean,
I know people that were surprised as hell whenever
somebody would call me by my actual name. I mean,
you know, my parents even called me Ho Ho
from time to time. Yeah, I mean, it's just that's

(20:24):
just kind of how it was, right, I mean, that's
how we operated. Teachers called me by my nickname, not
my actual name. It's crazy, it's crazy. I mean that's
but that's just kind of how we were back then,
you know. So, I mean it didn't surprise me that
Cameron didn't know the actual name of his friend called Lump.

Speaker 2 (20:47):
I mean, it wasn't a surprise.

Speaker 1 (20:48):
I'm like, that fits, you know, this is,
this is, yeah, this is a Gen X-er, Lump.

Speaker 2 (20:55):
Of course he doesn't know his name. Why would he.
He's Lump, that's what his name is. I mean, I
wasn't surprised about that.

Speaker 1 (21:02):
But whenever he came home and Cameron saw what
Lump had done to his.

Speaker 2 (21:08):
Little AI creatures, he went off. He attacked the guy.
He ended up killing Lump. Holy crap.

Speaker 1 (21:16):
I was like, well, that just took a turn for
the worse. I mean, granted, we already knew, you know,
that a body was found. We already knew that it
was, you know, Lump, but we didn't really know the story.

Speaker 2 (21:29):
We didn't know why it happened.

Speaker 1 (21:33):
And then as he's doing this, he's like, you
know, but there's something I need to show you. In
which case, okay, the interface, man, whenever
you guys see the plug in the back of his head.
You know, I can't really say I was thinking Matrix.
I mean, not exactly. I mean, kind of, but not really.

(21:53):
What I was really thinking of was something a little
different. Now, okay, so through the story, all right,
I know I'm kind of bouncing around a little, but
this is just a neat episode, so, you know, all right.

Speaker 2 (22:05):
So with AI right now?

Speaker 1 (22:08):
Okay, Now, like I said, I'm kind of old, right,
I was around before.

Speaker 2 (22:18):
Before Apple, before cell phones.

Speaker 1 (22:21):
Really Windows three point one was like the first operating
system that I used.

Speaker 2 (22:29):
I'm that old. Now, I remember.

Speaker 1 (22:33):
Whenever the movie The Terminator first came out. And here's
the thing about the movie Terminator. This wasn't science fiction,
or it wasn't science fantasy. Wouldn't even call it science fiction.

(22:55):
I mean, this was probability. That's what that movie actually was.
Because I remember that there was an interview that was
done on some talk show of the guy who basically
built the first computer, and one of the warnings that
he had for the technology was that there will come

(23:18):
a time that computers will become smarter than us, and
they will view us as a virus and
try to maybe not rid the world of humanity, but

(23:41):
the only way to protect us, to serve us, is
to control us and put us in bondage. That's more
or less what it amounted to: that eventually, through the
progression of technology, there will be a war between mankind
and machine. And that's the story of so many sci

(24:03):
fi movies out there.

Speaker 2 (24:05):
But that's what he says.

Speaker 1 (24:06):
So like Terminator, I, Robot, you know, a myriad of
other, you know, just movies out there, The Matrix,
you know, just a few that come to mind. You know,
this is probability. This isn't science fiction. This isn't fantasy.
This is probability. This is what's going to happen. Already,
with AI, what we have found is the same thing

(24:28):
that they talk about in this episode, that AI has
already done things outside of their programming. I mean, do
you understand the significance of that. I mean that's huge.

Speaker 2 (24:46):
And already this has, you know, this is
happening, and AI is still, in a manner of speaking,
in its infancy stage. It's not even a teenager yet.
You know, AI is still a toddler.

Speaker 1 (25:00):
But already AI is doing things beyond the scope of
what it was programmed to do.

Speaker 2 (25:05):
That's like a PC.

Speaker 1 (25:08):
That can operate an Apple program without an emulator. You
computer programmers out there, you computer people out there, you
understand the significance of that. You know, PC and Linux
are at least similar, but Linux and Apple, oh no,

(25:38):
that's like a Linux being able to operate an Apple
program without any type of emulator. That's what these computers
are doing.

Speaker 2 (25:50):
I mean, they're not.

Speaker 1 (25:51):
Supposed to do that. And AI has figured a way. Life
will find a way, if you've watched Jurassic Park. Crazy. Anyway,
let's get back into this.

Speaker 2 (26:02):
I'm going off on a tangent. I apologize, but.

Speaker 1 (26:07):
He makes, you know, the program, these
little Thronglets. Throng-lits? Thronglets. Thronglets.

Speaker 2 (26:15):
What the hell are they called?

Speaker 1 (26:18):
I was wanting to say, nope, that's not it. Holy
crap. Thronglets, I think. So, Thronglets, yes, that's what they're called.
So these Thronglets teach Cameron how to, you know, put
this little jack into the back of his head so
that this way the thronglets can communicate and plug in
directly into his head, which is amazing.

Speaker 2 (26:41):
Okay, so this is this is one of them other things. Okay.

Speaker 1 (26:44):
The human brain is more or less a computer, pure, basic, simple.
That's what it is. The brain is a computer.
The problem with a brain and an actual computer hard
drive is the interface between the two.

Speaker 2 (27:01):
They do not speak the same language. You know.

Speaker 1 (27:03):
The brain is a biocomputer and a computer is a
machine computer, you know. Getting the two to
communicate has always been the thing that holds
back a lot of what we know is possible, you know,
like things that have happened in numerous different episodes, uploading

(27:25):
somebody's consciousness into a machine. You know, something
like that isn't possible without
several different things, you know, fiber-optic communication lines, without,
you know, a crystalline type of storage device, without
quantum computing capabilities. I mean, that type of thing isn't

(27:47):
possible outside of that, and they've kind of figured out
a way how to do it. But getting the two
to communicate has always been a huge thing that has
kept that type of technology from becoming real. And

Speaker 2 (28:00):
We're getting close. That's the thing. We're getting close.

Speaker 1 (28:04):
Holy cow, right? So awesome. It really is. I mean, terrifying, sure,
but it's also kind of awesome.

Speaker 2 (28:12):
It's kind of neat. I mean, I don't know how
far away we are from that kind of thing.

Speaker 1 (28:18):
Like I've heard reports that, you know, they have been
able to map somebody's consciousness, and that just blows me away.

Speaker 2 (28:26):
I mean, we never.

Speaker 1 (28:27):
Would have thought about that kind of thing being in
reality not that long ago. I mean, granted, there's
been movies that talked about it. Chappie is one of them,
a good, underrated movie. They had Wolverine in it, I mean,
not actually Wolverine, but Hugh Jackman. But I mean, we've
been experimenting and trying to do that for a number
of years. We've just never been able to do it.

(28:48):
And here you have an AI that told this guy
how to do it and he did it.

Speaker 2 (28:53):
I mean, that's just crazy. And then.

Speaker 1 (29:01):
Now here's kind of an underlying theme about this, because
this episode kind of had the same type of theme
in it that all of those types of movies have,
Terminator, I, Robot, The Matrix: that humanity is a plague,

(29:26):
and that's how these little Thronglets view humanity, because after,
you know, Lump just tortures the Thronglets within the simulation,
and Cameron defends them and ends up deleting Lump. They're,

(29:47):
you know, the Thronglets are terrified, right? I mean, that's
kind of what the whole thing is. They're kind
of terrified, and, you know, through their connection, they
teach him how to expand the computer.

Speaker 2 (29:58):
Build this out, do this, do that, do the other.

Speaker 1 (30:01):
And then, and here's where it gets interesting, because Cameron
is in there talking with the cops, and he's.

Speaker 2 (30:06):
Like, look, you had no idea who I was. You
only caught me because I allowed it.

Speaker 1 (30:19):
I was stealing that in order to get caught, so
you would bring me here. And he draws what I
can only, you know, consider like a round type of,
you know, QR code. He draws this thing on
a piece of paper with a felt-tip pen, and then

(30:39):
he holds it.

Speaker 2 (30:39):
Up for the camera to see.

Speaker 1 (30:41):
And what that does, it was basically a
code, computer language, to upload a virus, a Trojan horse,
allowing the Thronglets to gain access to the police department's

Speaker 2 (30:59):
Computer. That was just whoa.
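The hand-drawn "round QR code" rests on a real idea: a camera reading a grid of black-and-white cells back as bytes. Here is a minimal toy sketch of that encode/decode step, purely illustrative; the function names are made up, and real QR codes add Reed-Solomon error correction and orientation markers on top of this.

```python
# Toy sketch: encoding a short payload as a black/white grid, the way the
# episode's hand-drawn code smuggles machine-readable data past a camera.
# No error correction here; this only models the bits-to-cells mapping.

def encode_grid(payload: bytes, width: int = 16) -> list[list[int]]:
    """Turn bytes into rows of 0/1 cells, padding the last row with zeros."""
    bits = []
    for byte in payload:
        for i in range(7, -1, -1):          # most significant bit first
            bits.append((byte >> i) & 1)
    while len(bits) % width:                # pad out the final row
        bits.append(0)
    return [bits[r:r + width] for r in range(0, len(bits), width)]

def decode_grid(grid: list[list[int]], length: int) -> bytes:
    """Read the cells back into `length` bytes."""
    bits = [cell for row in grid for cell in row]
    out = bytearray()
    for i in range(length):
        byte = 0
        for b in bits[i * 8:(i + 1) * 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

msg = b"thronglets"
grid = encode_grid(msg)          # what gets drawn on the paper
assert decode_grid(grid, len(msg)) == msg  # what the camera recovers
```

The fiction, of course, is everything after the decode: a camera feed recovering bytes is plausible, the police system executing them is the Trojan-horse part.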

Speaker 1 (31:06):
And then they played this song, well, not really a song,
but they played this sound on the emergency broadcast system,
and it was basically a sound, a
song, you know what I'm saying, a sound
that, you know, much like, you know, back in the

(31:28):
day whenever we had, you know, dial-up, the computer language,
and it kind of gave a back door. Using this
audio sound allowed the Thronglets back-door access
to the mind of anybody that heard that sound. I was like, whoa.

(31:54):
And the episode ends with Cameron reaching down to the
cop, as if he's about ready to wake up,
basically, as a Thronglet. You don't see the hand reach up
and grab Cameron's. You don't see that, but that's what's coming, right?

(32:18):
I mean, like, if the camera would have stayed
on for just a second more, that's what you would
have seen.

Speaker 2 (32:26):
But it fades to black before it actually happened. Dude.
That was just an awesome episode. It really was.

Speaker 1 (32:34):
That was so cool and just the things that were coming,
you know, because this is totally different than Bandersnatch within
the same you know, extended universe if you will, a
continuation of the story, you know, a throwback to Bandersnatch.

Speaker 2 (32:52):
Awesome episode, but this is.

Speaker 1 (32:57):
Kind of tackling that whole technology a different way though,
because it's talking about, you know, the fear of what AI.

Speaker 2 (33:05):
Can become if not.

Speaker 1 (33:08):
Like strictly controlled, if safeguards aren't put in.

Speaker 2 (33:14):
Better than the Three.

Speaker 1 (33:16):
Laws, right, y'all? Remember, I wrote about better than the
Three Laws. I mean, this was just, wow, an awesome episode. Okay,
So a couple questions. Amazing episode. I was just, yeah,
awesome episode. So a couple questions. With the whole technology

(33:40):
thing that they demonstrate in the beginning of this, with,
you know, with the police officers having the, you know,
facial recognition right there in the body cams of
the police, and then, like, on-site DNA testing to
immediately do that, I mean, how do you
think about that? You know, what's your input on that one?

(34:04):
Do you think that is a violation of privacy? Do
you think that is an overreach?

Speaker 2 (34:08):
Let me know.

Speaker 1 (34:10):
Send me an email, Ho Ho at Black Mirror Podcast
dot com.

Speaker 2 (34:17):
Let me know, because I'm kind of curious. You know,
I don't think that,

Speaker 1 (34:23):
You know, facial recognition per se in body cams is a problem, but it
seems to me like it's going another step beyond having
you know, being able to do a DNA swab and
get like basically immediate results right there on the scene.

Speaker 2 (34:44):
I mean, that.

Speaker 1 (34:44):
Seems a bit extreme to me, But let me know
what you think. Send me an email ho Ho at
Black Mirror podcast dot com. I'm interested on your input
on that.

Speaker 2 (34:55):
And then let's go into you know, the whole.

Speaker 1 (35:03):
You know, interfacing with computers, like actually being able to
have AI control a person, or even a person, you know,
uploading their consciousness into, you know, just the interfacing
between, you know, a person's brain and a computer, the

(35:28):
interface in between the two. How far
away do you think we are from having that kind
of technology, those capabilities? Do you think we already have it?
And how long do you think something like that may
be implemented? I mean we've kind of talked about that
in previous episodes, right? San Junipero was one of them.

(35:54):
Awesome episode, and we've kind of talked about that, But
I mean, the same thing kind of applies here. They
use that type of interfacing technology. How far away
do you think we are? I mean not just from
being able to do it, but having a computer that's
strong enough to actually handle it and to be able

(36:14):
to map out somebody's consciousness. Do you actually think that,
you know, using some type of a sound like that
would allow a computer to you know, hack their.

Speaker 2 (36:26):
Way into somebody's brain like what they did in this episode.

Speaker 1 (36:30):
I mean, I don't know. This seems kind of far-fetched
to me, but who knows. I mean, we haven't
even scratched the surface of resonant frequencies. We don't even
know what's possible with resonant frequencies. We do know that
you can do some pretty neat, amazing freaky things with

(36:53):
resonating frequency.

Speaker 2 (36:54):
We know this; all kinds of experiments have been done.

Speaker 1 (37:00):
Nikola Tesla did some stuff with that. They actually kind
of think the Pyramids of Egypt were, you know,
built, put together, using resonant frequencies.

Speaker 2 (37:13):
Yeah, yeah, weird.

Speaker 1 (37:19):
And what do you think about, you know, an actual
computer game where it...

Speaker 2 (37:26):
Is an AI world.

Speaker 1 (37:33):
Completely enclosed in it. You know, they kind of
do their own things, you know, a step beyond SimCity
and things of that nature. I mean, how far
away do you think we are from something like that?
That'd be kind of neat interesting. You know, we kind
of have that in video games as it is right now.
I mean they operate within the parameters of the program,

(37:55):
but I mean we already have that kind of a thing.

Speaker 2 (37:57):
You know, it's interesting stuff, interesting stuff.

Speaker 1 (38:00):
Now, thanks to a listener, I found out that there
is an app that you can get where you can
play with Thronglets.

Speaker 2 (38:13):
I've just downloaded the app.

Speaker 1 (38:16):
I haven't actually played anything with it yet, I haven't
used it yet, but through him, I have found out
that there's some you know, other additional content that is
available by playing this game doing different things. And you know,
again I haven't done it yet. I haven't played it,
but I will and on the next episode for Eulogy,

(38:42):
I'll talk about my progress with the game, and after
I do the season seven review, I'll do a follow
up for Plaything and talk about some of the hidden
content in the game itself, different things of that nature.

Speaker 2 (38:55):
So we will be getting.

Speaker 1 (38:56):
More into that after I do the season seven reviews.
So let me know what you think about Plaything. Did
you like the episode, how did it rank so far
within season seven? Because I think me personally, I really
liked the episode, I really did. I mean, that type
of thing, especially whenever they're dealing with AI and you know,

(39:18):
interfacing stuff like that. I find that extremely intriguing. You know,
it just it amazes me how far we've come in
technology in a very very short amount of time. I
find it just amazing. I mean, my lifetime, I have

(39:42):
seen technology grow to just unbelievable capabilities. I mean, the
things that we can legitimately do today.

Speaker 2 (39:55):
Wouldn't even have dreamed.

Speaker 1 (39:59):
It could even be possible when I was in elementary school.
Whenever I graduated high school, even we just weren't there yet. Communications, computing,
digital communication in general. I mean, it's just it's mind

(40:20):
blowing how far we've come. So, you know, the younger
generation right now can't imagine a world without technology. I
grew up in a world without it. I had to adapt. Well, hell,
my generation built the stuff. What am I saying?

Speaker 2 (40:40):
It's amazing stuff though, I mean it really is. It's
just wow, where are.

Speaker 1 (40:43):
We going to be in the next twenty years? I
mean that's kind of the question, right where are we
going to be in twenty more years? I mean, dude,
I wanted flying cars, Okay. I came from you know,
I watched Back to the Future whenever that first came
out in the eighties. I mean, we've already reached the
timeframe we're supposed to have flying cars. I want my
flying car, dad gum it. Anyway, that's all I got

(41:04):
for y'all today.

Speaker 2 (41:05):
That's it.

Speaker 1 (41:06):
Let me know what you think about the episode. Send
me an email ho Ho at Black Mirror podcast dot com. Anyway,
y'alls have yourself a great one and I'll see you
in the next one.

Speaker 2 (41:25):
Thank you for listening to the Black Mirror Podcast.

Speaker 1 (41:27):
If you would like to join the conversation, you can
comment on this episode at Spreaker dot com, or...

Speaker 2 (41:33):
Go to TheHoHoShow dot com forward slash forum, in
the discussion board for this episode.