Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to Grub and Grace, where we believe the best conversation happens around the table.
(00:05):
I am your host, Mark Flower, and I will be sitting down with authors, pastors, and others who are passionate about topics such as
faith, religion, pop culture, politics, as well as personal testimonies, all while allowing an open and safe space for good faith dialogue.
We don't always have to agree, but we're always willing to listen without demonizing or shutting the other down.
(00:30):
We believe that we can learn from one another through active listening as we navigate faith and life.
Our goal is to enter into every conversation with the intention to learn while leaving our pride and our ego at the door.
So, pull up a chair and stay a while.
(00:56):
Welcome to Grub and Grace. I'm Mark Flower. I'm here with my guest, Daniel Thompson.
He's a mad scientist: a writer, director, actor, and editor, all in film. He just does fantastic work.
Welcome to the show, Daniel.
Happy to be here. Thank you for having me.
Of course. So, today we're going to be talking about movies, pop culture, and the core of it, AI,
(01:20):
and how it all kind of integrates and comes together and creates this beautiful thing or, in some perspectives, a monstrous thing.
Daniel, why don't you tell me a little bit about yourself and just kind of what my viewers can learn about you.
Well, first off, I'm just a filmmaker that is totally obsessed with movies.
(01:44):
So, to give you a little backstory, a small backstory.
Back in 2020, literally when the pandemic started, you know, we were all stuck in our little bubbles.
I was so bored that I was like, you know, it would be cool if I put myself in a scene from Star Wars,
(02:06):
and it was the scene where Han Solo is talking with Greedo, and I just shot it.
I was trying to match the lighting and stuff, and at the time, Photoshop had some new tool
where you could remove any object from the image.
It was a rudimentary version of what it is now, which is heavily AI.
(02:29):
But yeah, I tried that out, had way too much fun making it, and, you know, people were like, you've got to do some more.
And then fast forward to now, I'm like a Jedi master at putting myself in movies, but adding a comedic spin to them.
And here we are.
Here we are. I've got to say, I love your stuff. So whoever's watching this or listening, you have to go check him out.
(02:54):
So his profile on Instagram is NerdyDan1.
It's N-E-R-D-Y-D-A-N-1, the number one.
And he's got some fantastic content, like he was talking about with the Greedo scene, where he put himself in place of Han Solo.
And he's got some recent stuff where he's put himself in talking to, like, Wolverine.
(03:15):
He's put himself in as Wolverine talking to the other X-Men and everything.
And it's just so funny the way he puts everything together and finds these little moments where you can kind of slip these little things in there.
Thank you, man. Thank you. I appreciate that.
I think it's a wish fulfillment thing.
Like when you're watching a movie, you're like, what would I do if I was in that scene?
(03:36):
You know, and yeah, that's kind of the root of it.
I think everybody does that, especially when you're watching actors like Keanu Reeves, someone who's just a very blank slate.
You know, I don't want to say bland, but in a way, their characters are kind of bland.
(03:57):
And it makes it easier to put yourself into their shoes and kind of imagine yourself as that character.
Kind of like, like I said, Keanu Reeves as John Wick.
I oftentimes do that, and I hear from other people that John Wick is a very easy one to put yourself in the shoes of, or, like, Neo or something like that.
There's plenty of other actors and actresses that are like that, too.
But you're 100 percent on point.
(04:18):
I've watched John Wick so many times I've been like, yeah, I'm thinking I'm back.
You know, it's just one of those things you can't help but do.
Yeah, it's an addictive process, watching things, versus, like, listening to music, where you hear the lyrics but you're not necessarily trying to be that person.
(04:41):
Does that make any sense?
Totally makes sense. Yeah.
So speaking of like the Matrix and Neo and everything like that, we're going to be talking about AI today.
And this is my attempt at a smooth transition into our topic.
Great. You're doing great. Love it.
So I'll kind of jump in. So, cinema and pop culture: where does AI come into all of it?
(05:03):
So I know that you use a lot of AI tools within what you do.
Obviously, you use green screen, but from some of the content you've made, I've actually noticed that you use some voice replacement.
So you actually take out the audio from the actors and put in your own audio to match whatever you're doing with the character.
(05:24):
So where do you think AI intersects with everything?
Well, you know, honestly, I think a lot of people give AI a bad rap, like it just does everything for you. But that's the worst case scenario.
Right. Like if you have a great idea and then you can meet in the middle with AI, it's like the possibilities are endless.
(05:50):
Like, you know, when I started doing this, the only real AI thing I could do was put myself in the scene. But fast forward to now, like you said, I can have an actor say something they didn't say before and create a whole new movie scene.
And people watch it and they're like, I don't remember this in the movie. Maybe I just need to rewatch it. You know, that was my thought when I was watching your stuff, too.
(06:18):
And it's an interesting thing, because if you think two or three years from now, it might get way worse.
Like, we might actually have a version of your favorite movie with a completely new actor.
Like, this is the worst case scenario, right? Like if Hollywood just said, let's save money and get a bunch of TikTok stars and put them in The Matrix in these roles.
(06:47):
But we don't have to actually film anything except for them on a green screen, replace it.
And then you have a remake or reboot or whatever. And it's like, that's the worst case scenario. Right.
Like people would be like, that's not creative. You didn't do anything new except put new people in it using AI. Right.
(07:09):
But the cooler end of that would be, you know, using AI to create something mind-bending. In terms of, like, OK, we're making The Matrix five or six or whatever.
And you have, I don't want to say ChatGPT, but something like that, right? And you're trying to make a meta script or something.
(07:36):
I really think it could be used in a way that is beneficial to creatives.
I'm talking a lot, and it's starting to sound confusing.
But I just think, you know, meeting in the middle with AI is the best solution.
(07:58):
Like, don't use it for everything, but use it as a tool, for sure. And there's so many tools out there. Deepfakes is one, where, you know, it basically pulls in a bunch of different video clips.
It reimagines things and tries to line things up so that you can create those new emotions and match the facial features and everything like that.
(08:19):
And then, like, you know, the voice replacement that we were talking about.
And then there's even, you know, text-to-video, so you can create whole visual scenes. I just sent you one on Instagram earlier today. It was an entire short film about early cavemen.
And it looks realistic. It looks like real people, like real shots that were actually filmed.
(08:43):
Some of them you can easily tell, like when it's zooming out to the rocks and the mountains, where everything starts to kind of fold in on itself.
Yeah, but it's crazy how you can create these scenes just with text prompts. And even video games, too. There's that other one I sent you: an entire 3D world that you can immerse yourself in, just by writing a text prompt, and it'll create it itself.
(09:05):
It's crazy where it's gotten and where it's going.
And even on top of that, it's like, even if you did a text-to-video prompt, you could have that as a basis, right? And now there's some new tech out there where, even if you just have a still shot of that image, you could act out on your webcam what you wanted
(09:29):
the person in that frame to do. Run it again, and then it would have your fake, video-created actor move.
Yeah. So it's a different form of motion capture. Yeah. So if I put my hand up and move my finger down, it would capture that. It would basically make a skeleton framework of your hand, and then it would be able to move those fingers down like that.
(09:54):
Really cool. Yeah. And I know there's probably some video animator cringing listening to this right now.
Like, oh, no. But at the same time, it's evolution. It is.
I mean, you probably know a lot more about this than I do. I've immersed myself as much as I can, and I've done a little bit of video work.
(10:16):
But this is what you do. And so I feel like you have a better understanding of how this kind of thing works. So like you said, someone's probably watching this and cringing.
But I feel like, you know, there's probably a lot of people listening that this is going to be completely new to, and they'll hear some of this stuff.
So I just love this whole conversation. I love AI and where it's going. But I know there's a lot of criticism, too. There's a lot of fears about AI.
(10:47):
And especially with, like, deepfakes, creating videos of people saying things they didn't actually say or doing something they didn't actually do.
Yeah. What are your thoughts on that? OK, well, I have so many. Now, this is a great one for me because, you know, there's a great line in the Spider-Man mythos that says, with great power comes great responsibility.
(11:15):
So if you know how to do all the stuff that I can do, you have an ethical and moral responsibility to really think about what you're going to do before you do it. A great example for me was a video I did last December
that was talking about the P. Diddy situation, because that was when it was starting to come out. The rumors, the lawsuit, I think, had just come out. And I was like, I want to do something with this, but I don't want to put any words in his mouth.
(11:47):
Right. I want to just use what exists already. And there's a movie called Get Him to the Greek where he basically plays himself. And I found a specific scene, and I just wrote some lines for me to say, just like, oh, yeah, you know, 50 Cent offered to buy your company.
And I'm just saying you might want to take that, because, you know, the feds might come and arrest you. And, you know, I was just using a lot of his responses in that scene.
(12:21):
But once it went viral, I think 50 Cent saw it and he reposted it. Really? And then it went crazy. And then, you know, I was getting people sending me death threats and stuff like that, and other people going, like, dude, this is crazy.
Like, I don't even know if this is real or not. And then, you know, it traveled through the grapevine to me eventually that Diddy himself had seen it. And he had a whole legal team and stuff like that.
(12:54):
He was trying to figure out, like, can we get this guy? Is there any way we can do something with him? But for me, it was like, I'm protected under, like, a parody law or something.
It's like creative attribution or something like that.
Yeah, so it was like, you know, and I didn't put any words in his mouth. Thank God I didn't put any words in his mouth. But I can see somebody that didn't have as much restraint as I did taking it way too far.
(13:26):
And, you know, if you take it way too far, there's people that don't like their voices to be used. Because AI is a data-based thing, right? The more you give it, the more it can learn. So imagine if you're an actor and you've been working for, like, 30 years, and I have 30 years' worth of material to learn how you would sound sad, how you would say something happy. You know, all your nuances can just be assembled
(13:55):
with AI, and I think that's very scary.
To some people. Cyberbullying, I mean, you could do so much dark stuff with this.
But, you know, I think a better example of the brighter side of things is: there were people who saw that movie Spider-Man: No Way Home, and you know the scene where all the Spider-Men come through that portal, and for the first time you see
(14:26):
them come in, Andrew Garfield and everybody, and you're like, oh, this is crazy. There were a bunch of people who felt like, man, I was really wishing Miles Morales would have shown up. And that's, you know, the biracial Spider-Man.
In the animated movies. Yeah, he's great, right? And so I decided, wouldn't it be cool if I just got a Miles Morales suit and came out of that portal, and just created my own little scene.
(14:56):
And, like, the Miles Morales and Spider-Man community, they were like, oh my gosh, this is so great. My son is so happy right now watching this video. He wants to see more. And they were like, you've got to make some more of this, and I'm like, I can only make things that pre-exist in a movie, so I can't really make a full-length thing of it, right?
(15:19):
But I think that shows the potential of how you can really make your dreams come true with AI. And speaking of the racial thing, a lot of the characters, especially superheroes,
it's very audience-driven, fan-driven. Like, if there's a lot of hype, they're going to make it. You look at Deadpool. I mean, they released that test footage, the internet blew up about it, and then it got made.
(15:48):
And then you look at Miles Morales, and the hype has always been high. It's super high. Yet he hasn't made his debut in live action. I mean, they've made the Spider-Verse movies, two of them so far, which were amazing, by the way, but.
Like, when is this going to happen? All we have is, you know, Black Panther, and was it Monica Rambeau from Marvel? Yeah, Ms. Marvel. So it's like, where do you see that divide, where the hype is high, the want to have characters like that in the universe is up there, just like how Deadpool was, and they got made?
(16:34):
Like, how does it kind of impact you and people around you when you know that that's kind of going on?
No, that's a great question. I mean, honestly, I think, you know, we've heard it for years where people are like, we want to see more representation across the board. We want to see somebody that looks like us on the big screen. And, you know, a lot of people feel that the Miles Morales character is so fresh and so new. Really, I think he wasn't even created until, like, 2000.
(17:15):
Or 2011, somewhere around that time. So he's very fresh. And I think, you know, everybody is very precious about the interpretation of that. Some people think, if they put him in live action, he needs to be a kid, right? Needs to be young, like in the animated movies.
But then some people would say, well, he needs to be a little older, just because we don't want to see that same story told again, right? So it's a pickle to do things with characters that have so much weight attached to them, because you want to do it right.
(17:53):
And I think, you know, when we do eventually see Miles Morales, it'll be in a way we weren't expecting. Like, I think we'll see him sooner than we think. Like, in the next three or four years, I think we'll see him pop up, not in a Spider-Man project, but in like a Marvel thing.
Because, you know, it's the multiverse right now. Not to get geeky right now with you.
(18:15):
Well, I guess technically we did see him in live action. What's his name? Childish Gambino.
That's right. I forget the actor's name, but we did see him, I guess, technically.
Yeah, he played his uncle, not Miles Morales. He actually showed up in the second Spider-Verse. That's right. As the Prowler.
(18:36):
Yeah. So he wasn't Miles Morales in that scene. Right. He's the Prowler. And he played Miles Morales's uncle both times. So it's really cool to see.
Because, you know, I think Donald Glover was an inspiration for the creation of Miles Morales, and he had a little hype before they cast Andrew Garfield.
(18:58):
They were like, oh, maybe we'll cast Donald Glover as Peter Parker. Obviously, the world would have gone into complete chaos if that had happened.
But I think, you know, I think the byproduct of people being curious to see that version of Peter Parker that looked like Donald Glover was Miles Morales. So, like, everything has like an interesting canon to it.
(19:25):
So I kind of want to use this as a pushing-off point to this next thing. I want to talk about biases within AI, bringing it back to AI full circle. The bias of AI. Facial recognition is kind of the main thing I think about.
There's been a lot of controversy when facial recognition came out with like, you know, the iPhone or other phones where you get facial recognition to get in and other things that use facial recognition.
(19:53):
And it was very biased towards white people. Basically, the whole thing was that they only tested the facial recognition software on white people, or mostly.
And so it basically excluded anyone that wasn't white. So if you get someone from, like, Mexico, someone from Africa, someone from Asia, you know, Korea or China,
(20:20):
the facial recognition doesn't recognize the face. Even if you've put your face in multiple times, it still has trouble recognizing it on a regular basis. I know they've gotten better because of all the backlash.
But do you see any other like biases within AI, you know, with the other tools that we kind of talked about or anything else within AI?
(20:43):
Oh, yeah. I mean, like, voice generation is interesting, because this is kind of a weird conversation to have with normal people who don't look at the AI stuff in that way.
But I'm like, you know, there was one time I was talking to one of my friends about something and they were like, do you think this this AI generation sounds black?
(21:08):
And I was like, I don't know. What does black sound like? You know, we could really get into it. Like, if you're thinking of a 70s blaxploitation flick and you're like, I'm Shaft, you know, or what Robert Downey Jr. did in Tropic Thunder.
Oh, God. Does that sound black? I mean, it's debatable. It definitely sounded of an ilk of black, right? But it's like, what does a modern-day black person sound like? And I think you can't answer that question, right? Because, like, where are you from?
(21:39):
Like, that's the thing with AI. You have to be so specific to create a voice that sounds like whatever ethnicity you want, because at the same time, you have to put in where they came from, where they grew up. Like, was one parent from Boston and the other parent from, I don't know, North Carolina? What does that accent sound like?
(22:06):
And I think AI at this point doesn't have the necessary complexity to process all that information. So you get a very weird-sounding AI that's supposed to be, like, a black guy. It's like, hey, I'm Chad. Cool. Hey, nice to meet you. And I'm like, that's cool that that exists, but is there any other preference? Are there any other models of voices that you have?
(22:34):
Yeah, it's like, cool. We have like 50 white people voices to choose from, then we have like two black people voices to choose from. It's like, okay, there's a little misrepresentation here.
Exactly. And I want to preface: the voice I did previously, it's not a white person voice. I wasn't saying that. I was just saying, that voice they put on the black guy. They have AIs where you can see the visuals along with the voice, right?
(23:03):
And so when you see the visuals with the voice, it doesn't match up. It doesn't feel like it's alive. And I think that's an interesting thing there, too: human beings know human beings when they see them, right?
So we're going to get to a point eventually where we're visually looking at something over Zoom, and we might be talking to an AI-generated person. And it might be, you know, pretty good.
(23:35):
Yeah, and it makes me think of the movie Her, with Samantha, the AI assistant for Joaquin Phoenix's character. And, you know, you're talking to somebody and you almost start to bond and create this relationship with this AI, which isn't real.
And I came across, I didn't write it down or anything, there was this survey, I forget what organization did it, but they surveyed a little more than 1,200 people.
(24:07):
And they asked all these questions about AI, and I think it only came out that, like, 30 to 40% of people knew that AI was integrated into certain things in their life. Like, they had no idea that AI was integrated into social
media, or other tools that they use, even search engines. I'll put where I found this in the notes or something, but it kind of showed the disconnect in people's understanding of AI and how integrated it is with
(24:42):
society.
And there's a phobia, a real phobia, of something that they don't think is conscious or has a benefit to them, but, as we see, it's affecting every facet of their lives. I mean, I'm trying to imagine what school would be
(25:04):
like for me, you know, in this time of generating documents, paperwork. You know, you're like, oh, I've got to write an essay, and then you get the essay, and then you just read it yourself and make your adjustments to make it
sound more like you.
You know, how do we tackle that on the educational level?
(25:29):
You know,
On the inverse of that, it's like, oh, if we all had our own personal AI assistants that knew us deeply,
then we could have AI teachers that know how to teach us best. Right? But at the same time, it's like, do you want to be taught by something that
doesn't have a soul, quote unquote? You know, I don't know.
(25:53):
In the whole survey thing, they actually asked, would you trust AI to take care of a loved one? So I think specifically, like, a parent in a nursing home or something like that, or someone that just needs a little bit more assistance
and help.
I don't think it would be safe to trust AI to take care of someone. I think it was overwhelmingly declined, which is good. I'm glad that the majority of the population, or at least the majority
(26:24):
of the people that were in the survey, would agree: no, I don't think it's at a level where we could actually put someone's life into the hands of AI.
I mean, okay, I've got a fun example of this, sort of. I recently got a Tesla. I must say, I'm not getting political, I don't lean politically in that direction, but I bought the full self-
(26:52):
driving package with it, and I wanted to test: could it get me from North Carolina to New York City? Could it drive me all the way there without any, you know, interventions by me?
And surprisingly, it got me there, no intervention. I didn't have my hand on the wheel. I mean, I was hovering a couple of times to see if it was going to do anything weird, but it behaved exactly like I would have behaved, if not better.
(27:24):
And that was kind of eye-opening, because I was like, okay, here I am not trusting this thing to make choices for me, but it's got millions of inputs of data from previous drives, from other people driving, and it's constantly learning.
So it's becoming, or starting to become, the ultimate driver. Like, it might drive better than I would. In that instance, I'm thinking about the car I had before, a gas-powered car that didn't drive itself, and I'm like, man, it's actually crazy to think that we might be in a future where cars are safer
(28:10):
by driving you versus the human element.
Mm hmm. It'll be like Minority Report or something like that.
Yeah, you could sit back and have a whole conversation, not have to worry about anything. Have a spot of tea or something like that.
It's crazy, because it's like, okay, well, how much control? This also goes into, like, Terminator or The Matrix. How much control are we as human beings okay with giving up? That's kind of the scary conundrum. It's like, yeah, I feel safer with it doing stuff, but I also like the free will of having my hand on the wheel, of being able to speed a little bit, you know, if I want to down an open road. But
(28:53):
is that, like, responsible? I don't know.
That's the question.
No, there's a theologian that I've been following. I got to meet him in October.
And he did an episode about AI where he was talking to someone in the field.
And what he said was, I think he was talking more about AI in political spheres, as far as media and whatnot, but it could apply here, too. The development of AI, the way he described it, is kind of like an above-ground swimming pool: you get a bunch of people in there, they start rotating around, and you create this whirlpool.
(29:39):
And it's difficult to stop that whirlpool. All you can really do is go along with the speed of it or make it faster. And the only way to really stop it is, you know, people having to throw themselves in the way of it, getting swept under the waves, and linking arms in hopes that you can at least slow it down.
(30:01):
But with AI, I feel like it's that whirlpool. It's going to keep going, people are going to keep feeding it, and that's where it starts reaching that scary territory. It feels safe right now, but are we reaching that Terminator-style point where it's going to go beyond the point of no return?
What was I watching recently? I think it was the Terminator animated thing that just came out on Netflix this year. Did you see that one?
(30:30):
I've seen bits and pieces of it. Everybody tells me it's really good. I just haven't had a chance to sit down and view it. But it's, like, Terminator Zero or something like that. So basically, it's that point when the AI goes online and decides to nuke all of humanity, but it takes place in Japan, and they talk about what's happened in America. All I've ever seen with Terminator is everything happening in America.
(30:53):
But this is a unique perspective because this is like an alternative AI that was created in Japan that was in contradiction to the one that was created in America.
And then the scientist is having these conversations with the AI, debating whether to put the AI online, and he's expressing all these things, like, you know, I'm concerned because I don't want you to decide that humanity is bad and nuke us.
(31:30):
It's this whole philosophical debate that he's having throughout the whole series, and it was just really interesting.
That's pretty rad. That's actually cool, because it is kind of a nuclear arms race right now with AI technology. I mean, you look at the tech in China. Like, there's some smartphone that's banned here.
(31:52):
I think it turns into, like, a tablet or something crazy. Really? Like unfolding a piece of paper. Yeah, you can fold it like a trifold thing, and it's super thin. And that brand is banned in the US because they think it's spyware or something, but I think really it's because it would kill a lot of its American competitors, you know, like the Apple devices.
(32:19):
Like, I remember when the iPhone came out. That was, like, the peak of human technology, a computer in your hand, right? But now people's tastes have kind of evolved to where it's like, oh, well,
I want a camera inside the phone with this resolution, but I also want the screen size. And if you were to show me that smartphone from China that turns into a tablet, or basically a computer, I would be like, well, I don't really think I need the iPhone anymore.
(32:55):
You know, it reminds me of this comedy video I saw on YouTube years ago. They came out with several models of the iPhone, and they're like, this is the iPhone model 15 or whatever, and it's, like, this five-foot-long phone, and it fits down the entire leg of the person's
pants, and they're trying to walk inside all stiff-legged. But it's kind of like that, where the demand just keeps getting bigger and bigger, and it's like, do we need that?
(33:20):
Okay, here's where it's scary. I've got to ask you this, because this is going to be the real test of how creeped out you are by AI. Have you seen the Neuralink stuff that Elon Musk is doing? Neuralink.
Yeah, I've heard something about people making fun of that with like that's the chip of the Antichrist and everything like that.
(33:41):
So, okay, it first started off with them showing, it's a cranial implant. So, yeah, you get it implanted in your brain.
But they showed a sheep that had some type of brain disorder like where like it could only move to the right, like it could only move sideways it could move like straightforward.
(34:02):
So like a Zoolander type thing. Yeah, it can't turn left. But yeah, they put it in its brain and it released a certain amount of voltage in its brain so that it could walk normally.
And then they were saying, well, we could do this for people that have, like, Alzheimer's, you know, because that's all that that is. The brain atrophies or something like that.
(34:26):
Exactly. It's like, if you're just putting this thing in someone's head to give out the right amount of voltage so that they don't lose their memory,
so that the gray matter doesn't just die. You know, then you get into data, right? Like, now there's this guy.
(34:47):
He can't use his body at all, right? And he got the implant, and now he's moving cursors around. He's controlling his computer, playing video games with his mind.
That's crazy.
So I'm like, what happens when you put AI into that device that's already in his head? He's not dead. I think he's had it for like three years or something like that.
(35:11):
Nothing weird has happened with him. I'm like, you know, there might be a whole class thing that happens, right? Like, the people who get the cranial implants,
they're going to be technically better human beings, because they can speak any language, they have any knowledge that they want to access.
But at the same time, like I said earlier, how much control would you have to give up, potentially, to have that in your brain? Like, would you trust the company to be like, we're not reading what you're thinking?
(35:45):
I don't know if I have enough trust. I mean, it really depends on if the device has the ability to be connected outside of your body to, like, a network, or if it's 100% offline, off the grid.
I don't know if I could trust the company even if they told me it was off the grid, like, no, it's 100% local, it's just basically talking to your body.
(36:08):
I don't know if I could trust the company, especially nowadays, because you're seeing how much modern-day AI is being used to hurt people and how much the media is not reporting on this kind of stuff.
You see what's happening in, like, Palestine, using drones now. You know, what I hear is that there's still pilots, but they're being flown by AI as well. And they have guns on them, they're able to do all that.
(36:36):
Even the missile systems they use, it's AI-driven to select targets in the areas that they want to bomb. Even like you mentioned China, with some of their technology, they just came out with a self-driving AI, like a police bot. I
saw that it's like a ball, like a wheel, a single wheel with a little casing around it. And it can drive up to like 35 miles an hour, can follow suspects if they're fleeing on foot, can shoot out nets and apprehend the perpetrator until
(37:11):
you know, police are able to get to the person. But it's crazy, you know. Maybe the police thing might be used wrongly, but I think with the Palestine thing, a lot of people are worried, you know, with like Amazon and their little
drop-ship drones, that technology is actually being used in Palestine for that kind of thing. So it's kind of scary when you hear about that kind of stuff. It really makes me doubt whether those cranial implants are actually going to be used for good, or if they could potentially be hacked or, you know, driven by the company that put them in.
(37:48):
So yeah, it's kind of scary territory. I kind of see somebody not paying their subscription fee, and then they just turn it off on them, or just paralyze them or something like that. Yeah, it's like, oh my gosh, I used to know all these complex math problems, but now I just want to watch TV.
I'm going to watch TV and eat Cheetos. Yeah, yeah, you know. Or, you know, there was a show called Year Million, and it was giving all of these hypothetical things that were likely to happen in the future, like when we reach that technological point. And it was like, people would stop speaking, and people would just be
(38:31):
communicating with their minds to each other, and people that couldn't afford that, they would be looked at as like the new poor class, where it's like, oh, you're talking with your mouth, that's just wasting so much energy, you know.
It makes me think of your alien videos that you've been doing. Oh my gosh, yeah, I love those. I love that video just because it's so simple. Something about an alien that just doesn't talk with his mouth is kind of funny. And I think human beings, we view aliens,
(39:07):
it depends on what your theology is. Some people view aliens as this demonic thing. Other people view it as, oh, well, maybe they're just like us, but they might not have our problems, like maybe our problems are just human problems.
But, you know, I'm kind of cynical, where I'm like, you know, if there's life, there's going to be conflict. So any type of life form on a different planet would probably have similar issues, maybe.
(39:36):
Yeah, and like you said, there probably would be conflict, because when you look at fear, if you have a conflict of interest, you fear that you're going to lose something from that conflict of interest, so that's what drives behavior, to keep your position.
A lot of people, I think it was said that the love of money is kind of the root of all evil, and I think that's where it kind of stems from. It's not money that people fear to lose, it's the fear of just losing anything in your life.
(40:06):
And it's kind of like, people that have, you know, not very much, they don't really have that fear-driven life, whereas people with abundance in their life, they have that fear. And, you know, that could be where the AI comes in, where maybe that's what's kind of driving some of that fear of AI. Because it's so easy to, like we were talking about, just the fear of AI, you know, where AI is going, where it could be used to kill people.
(40:35):
It could be used to paralyze people or make them dumber, you know, whatever the case.
But, you know, where it stands, AI is a very helpful tool,
like we were talking about at first. So it's trying to find that middle ground of, you know, talking about hypotheticals, but then seeing them as hypotheticals, and not getting lost in the weeds of, you know, revolving your whole life around this fear of what it could be.
(40:59):
Yeah, and who polices it? That's, like you kind of said, who polices this, right? Like, ethically, I feel like we all as human beings have a say in what direction it goes, right?
But, you know, when you look at people that are not, like, middle class or whatever, their view on AI is infinitely more scary. They don't even want to learn how to use it, because they're just like, no, I'm happy doing things the manual way. And then, you know, me, I'm like, well, you need to know it just so you can know what you're dealing with, you know? Like, it is a competitive market right now. It's competitive in the job market.
(41:41):
With, like, do you know how to use any AI stuff? Like, if you're a creative person or if you're a numbers person, it affects so many lives right now. It's not super scary, but I feel like there are people within companies that are trying to figure out how they can use AI and still keep their jobs.
(42:10):
Right. But I feel like the ultimate end to this AI stuff is for us to just be creative.
Like, honestly, this is my wild idea, so take this with a grain of salt. I think eventually we'll get to a point where basically everybody will have access to everything. There probably won't be classes, like levels or whatever. Like, right now billionaires are trying to figure out how to control this AI stuff, but at a certain point, once
(42:46):
it gets to its peak, it's just going to be like, well, we know everything, we don't need you. We don't need you to make decisions anymore.
So, you know, it has all the data. It's probably listening to you right now, if you have a Google Home or whatever. Like, these things can know us intimately, right? So it's like, imagine a technological Renaissance, if you will, where you just have unlimited free time,
(43:18):
what do human beings do with free time?
I don't know, because we might need to stay busy, right? Like, we do need to stay busy, doing things, having goals. But it's like, if you are a construction worker and you like construction, you know, what does that mean when there's a robot that can do your job 24/7?
(43:41):
You know, like, what are we as human beings without those purposes anymore?
Of course, I don't know.
It's a tough question to answer. Yeah, sorry to get dark, that was kind of dark, but I don't know. That's totally fine. I mean, I could even go way darker with all this stuff. I feel like I've got notes on here where I'm like, should I even go this deep? And I'm like, I don't want to take up your time
(44:08):
either.
I'm an open book. I'm the guy that loves talking philosophy about this type of stuff, because it's endless, right? And if you have a jillion questions and you, you know, don't want to get into it, I understand.
I mean, I was going to bring something up about how, like you were talking about, you know, the AI models and the software are kind of, you know, created by these companies, and I was going to say that, you know, it's driven by the shareholders
(44:41):
and not driven by the needs of the people. And it's kind of a dangerous territory to think of where that could go, because, I mean, especially in a very capitalist society, the wills of the people aren't necessarily at the forefront of those leaders' minds
(45:02):
that are creating stuff like this. And so, what does the future of that look like, and where does that kind of, you know, bring us? What's the outcome going to look like? You know, is it going to be used against people?
We can find plenty of examples already of how that's happened.
Or is it going to benefit society? Is it going to become like a Star Wars or a Star Trek?
(45:26):
You know, you've got dystopian or utopian. Yeah. And, you know, I even wrote down one of these questions: are we going to see a Terminator-style future? Are we going to see more of a, you know, Her-type future, where it's more beneficial? Or is it going to be
the Matrix, where we believe that we're in a good spot, but we're actually being controlled? So I think it's all three. It's all three. No, totally. Because even in the Matrix's lore, right, it's like they had a technological boom in the real world.
(45:59):
And, you know, things were going great, but human beings kind of rebelled against the artificial life forms, and those life forms, that artificial intelligence, wanted to have their own existence.
And then we got into a fight with the thing we created, right? And the thing we created didn't destroy us. We tried to destroy them, but in doing so destroyed our environment.
(46:25):
Right. Like, our whole ecosystem got messed up. The sun disappeared. Like, we just nuked the planet or something trying to get rid of them.
But the AI, knowing human nature, decided to take all the humans that they could and put them into a simulation that wasn't perfect, because human beings can't take perfection, clearly.
(46:49):
So they're in this simulation. But at the same time, you know, you get to the point in the Matrix where there are people revolting to know the truth, because they're like, well, even with the ugliness of our choices, you still want to know the truth versus being lied to.
Right. That's like the ultimate human consciousness question: the truth or the lie.
(47:14):
I don't know. I'm sorry, I went off on a weird tangent. But yeah, it's all three. All three. I love it. Well, why don't we kind of wrap this up here? Like I said, I don't want to take up your time too long.
But why don't we finish up with this? I'm kind of interested: what's your favorite representation of AI in a film? And then I also put, do you enjoy seeing, like, a benevolent... benevolent?
(47:39):
How do you pronounce that word, benevolent? I don't know, but I love it. I think you got it right. Yeah, like an R2-D2, or Samantha in Her, or do you prefer to see the menacing death machine, like, you know, Terminator, or even, what's that?
Ex Machina, where she kind of revolts against her master. I like the ones kind of in the middle, the ones that kind of look evil but maybe aren't. Like in the second Avengers movie, you've got Ultron, right?
(48:10):
And he's just been fed all this data of all the world's events, and he's like, yeah, you guys are going to kill each other if I don't do something about it, so I just need to kind of wipe the slate clean a little bit.
Not kill everybody, but kill enough people to where we can start over and do something better. But to a human person with a conscience, you would be like, that's wrong, you can't just kill people. But to an artificial intelligence, it's just simple numbers. You know, it's not cruel.
(48:41):
But there's no actual ethics behind it. Like you said, it's, this makes sense mathematically. Yeah. And that's hard to argue with, that's hard to fight against, right? Like, I don't know. But that's my answer. I like those types of artificial intelligences.
Okay, that seems closest to what we have now. Yeah. I don't know what I would choose. I think I like a mix of both, like the helpful AI versus the death machines. I like them both for different reasons. Like you were talking about, there's a lot of truth there, even though it's lacking some of that empathetic or ethical side.
(49:19):
Like, it's like, no, they have a point with what they're saying. And it brings up these conversations, like, you know, should we bring the ethics into it, or should we kind of go by these numbers? And yeah, maybe that's why you love it, because it brings in those conversations. It makes you think a little deeper. Yeah. Yeah.
(49:41):
I'm kind of on track with you on that. Yeah, no, you're 100% spot on, man. This was fun. I really enjoyed this conversation. Dude, seriously, though, this has been such a pleasure. Like, I never know what to expect going into these conversations. I've reached out to so many people, and I've gotten turned down by a lot of people, like, no, I just don't want to be on anything like that. But then, like you and a couple other people, it's like, no, let's talk, you know, we can find a really good conversation. So,
(50:11):
this has just been such a good one. I'm happy to have delivered for you. Okay, I think, you know, I mean, I've done a few podcasts before, and I think this was a fun one, just because this one was very philosophical and also techie, you know? Like, we didn't really get together. Yeah, yeah.
I grew up in a very religious household, a very Christian household. So we're always kind of thinking about morality and, you know, living in the world but also keeping your values. Like, how do you find that balance and stuff like that? So, very interesting.
(50:49):
So, Dan, where can people find you? Because I know you have a social media presence. I don't know if you have a website or anything like that.
Yeah, so I don't have a website right now, but that's in the works. But you can find me on Instagram at NerdyDan1, that's N-E-R-D-Y-D-A-N-1, and then you can also find me on TikTok with the same username, and then on YouTube as Nerdy Dan. So, you know, if you want to see me do some cool movie stuff and continue to see where it takes me, please give me a follow.
(51:25):
Please check him out. He's got some really cool stuff. Well, I'm going to log us out here, but it's really been a pleasure. Thank you so much for coming on here. Thank you. No, the pleasure was all mine, man. Thank you.
(51:55):
dot com, as well as our socials on Facebook, Instagram, and X. If you like the content that we create and would like to support us, make sure to subscribe or follow wherever you can and leave us a review on Apple Podcasts. And as always, until next time: stay curious, keep an open mind, and celebrate the traditions that bring us all together.