Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
Welcome to Off The Wire, the podcast that helps you curb your cybersecurity risk and tackle technology challenges.
I'm Tanner and I'm joined by Anthony.
We're two IT executives with 35 years of experience.
Let's start the show.
Welcome back, everyone, to Off The Wire.
Today we're gonna do something a little bit different.
(00:22):
We're actually just gonna discuss a single topic, no big outline, just kind of go back and forth and hear each other's opinions on something that I think everyone is talking about, but maybe we haven't talked about as much on the show. Before we jump into that, if you're not subscribed already, make sure you subscribe. It's the best way for you to find our content.
(00:43):
If you subscribe, you'll get notifications as soon as we post new content, and it keeps you up to date with everything we're doing. And if you find value in the show, we'd love for you to share it: share it on your personal social media, or with a colleague, or maybe just a close friend who you think would also get value from our show.
(01:03):
So Anthony, we are doing something different tonight.
We decided to just bring up a topic and go to work talking about it.
(01:11):
I think originally this episode was kind of your thought, so do you want to introduce what we're gonna be talking about and get us started?
(01:18):
Yeah, so we're gonna discuss AI and what we think the impact on the job market is gonna be, really in all aspects. I do get the feeling that Tanner and I probably won't see a hundred percent eye to eye.
(01:30):
I'm sure there's gonna be some common ground that we share. This is something that's been on my mind quite a bit. I've got a daughter that's 18 years old, and some kids a few years older and a few years younger, and I really think about the impact of AI and how it's gonna affect our children. I'm not so much worried about myself.
(01:49):
I'm not saying I'm not a little bit worried, you know, maybe 15 or 20 years from now, but it's really about our children and which direction that goes.
(01:56):
And, you know, talking to my daughter and giving her advice on what jobs to take, you might be surprised at some of the things I've told her.
(02:04):
Yeah, so dive into that a little bit, Anthony. I have a young daughter as well, so I guess she's a little ways away from deciding her career path, but we know AI is gonna have an impact. We already see the impact.
(02:16):
I think it's in what we do specifically. I'm not saying every industry; I think some industries are already more impacted than others. But in what we do, I think we're seeing an impact. I wouldn't say it's drastic, and I don't know when drastic comes. That could be a year from now, or twenty, or who knows.
(02:34):
So what do you tell her? Are you giving her advice only on roles that you think won't be impacted? How do you give her advice?
(02:46):
Well, I've been pretty fortunate, because she's looking at the healthcare industry. She's looking at becoming a nurse, and maybe even doing something like nurse practitioner after she gets her hours in.
(02:55):
But I told her up front, so anyone that wants to be a nurse, you can't just jump into school and get accepted into a nursing program. There's a wait time everywhere you go.
(03:03):
And I was telling her up front, don't be upset when you start going to UNCW and just can't immediately get into the nursing program. We went through orientation and they made it blatantly obvious that you can't do that. There are 200 to 300 applicants every semester, and they only accept 60 anyway.
(03:20):
She said, well, maybe I should just go for computer science or IT, and surprisingly I suggested, I don't know that I would do that. I don't know that I would do that right now.
(03:31):
Any job where there's physical interaction, you know, I feel like is safe.
(03:37):
I think there are ways they can leverage AI to help those positions, but there are a lot of jobs that it will impact more than others.
(03:45):
And surprisingly enough, I went to the doctor just a couple days ago. He's a PA, and he's got a son that just graduated, and somehow we got to talking about AI.
(03:55):
I was like, well, at least she's going for the healthcare industry. She's safe. And he's like, yeah, I think a nurse might be safe, but a job like mine, he was concerned about PAs down the road. Anyway, we got to talking about regulation and stuff like that.
(04:08):
I think there are a lot of people out there that have a little concern. I've heard it from lawyers, from doctors, from technical people.
(04:15):
I do think about cybersecurity. I see a lot of people in cybersecurity that are looking for jobs, which is kind of scary, because just six months to a year ago we were seeing that there's a national shortage of cybersecurity people, which I didn't really agree with when I saw that article.
(04:30):
Because I knew a lot of people were looking for jobs in cyber. I think it's somewhere in between: they are using AI to streamline how many people they need in, like, a SOC, and in conjunction they're outsourcing overseas.
(04:44):
So it's really hard to say how much of an impact AI is having on cybersecurity right now, with the outsourcing happening at the same time. But I do think there's an impact right now; I just don't know how much.
(04:54):
Yeah, I mean, there's definitely an impact. I think it's hard to say.
(04:59):
At least from my perspective, as we're learning more about AI at work, there's almost a binary response: people are either completely blown away or they're frustrated that it's not working.
(05:18):
I feel like I get a lot of, oh my gosh, I can't believe how much time that saved me. And then in the same breath almost, or maybe even the next prompt, I'm like, oh my gosh, it doesn't understand me. It doesn't understand what I want, or it does, but it's too wordy. And even the other day I was doing something that really did save me some time.
(05:36):
I would say it probably saved me an hour, but it's not the hundred-x that I think people sometimes claim. I had to have it rework some questions I was coming up with, and kind of an outline.
(05:51):
It did save me about an hour, but I still spent, I would say, 40 minutes going back and forth with it on prompts, changing certain things, reworking questions.
(06:01):
I have found also, and this is just my personal opinion, that there's still, at least in LLMs, large language models, an innate tell.
(06:10):
Even when I get a response, I can almost spot it. And I'm not calling anybody out, so if you're listening, I'm not calling you out if you do this; you know, we use AI to help us with our social media posts and stuff too.
(06:21):
But you can very easily tell when someone has a social media post completely written by AI. I mean, I can spot it. It's almost like there's a blinking light. I can see it exactly, just from using LLMs enough that it sticks out.
(06:34):
And what I have found, and maybe this is kind of another topic: I enjoy using AI to help me, but I don't enjoy seeing AI content.
(06:44):
Yeah.
(06:45):
Like, I think it helps me, but if it helps somebody else with their content, I completely blow through that stuff. When I see a post completely generated by AI, I blow through it. I'm just like, eh, whatever, go to the next one.
(06:59):
To me, I want the original thought. Maybe we're just in an early form of AI, but it feels like the original thought is us, and it's basically playing a game of copycat with us.
(07:14):
Yeah.
(07:15):
But also, I think a little bit of it is, I've seen courses out there for prompt engineering, literally just learning to write prompts.
(07:23):
And we saw a thing in the news, it's been a while back, but I believe it was the Shopify CEO with a memo to all his employees basically instructing that they need to be learning AI.
(07:34):
And I don't know that he called out prompt engineering, but I think that's what he was looking for: you need to be able to leverage AI. And you're right.
(07:42):
The telltale sign on social media is you'll see the little icons, you know, the links that ChatGPT does. And if I see those, I'm automatically just not reading it.
(07:53):
Or hyphens.
(07:53):
The long hyphen.
(07:55):
Yeah. A lot of long hyphens. Yeah.
(07:57):
There are some telltale signs. It's just that I don't want to see something completely AI generated.
(08:02):
The other day, totally unrelated to what we do as work, I was reading an article on a hobby, and at the bottom it actually just said, AI generated.
(08:12):
And I know this sounds crazy, but I actually just discounted everything that I read. I literally did. I was like, yeah, I'm not gonna give that the credence.
(08:20):
So there is something there. You know, I don't know if we're just on the precipice of that being better, or if we're just gonna get to a point where we can't recognize it.
(08:29):
There is something warmer, I feel like, in how humans write, and I don't know how to describe it, but at least now, with the current generation of tech we have, you can kind of tell. If they use it as a help, you basically can't tell.
(08:44):
If they use it to completely generate it, it's very obvious, and it feels very hollow and doesn't sound warm to me.
(08:49):
I'll agree with you a little bit. I think for the most part you're right. A lot of it, though, I think is the lack of knowledge in, you know, generating information from AI.
(08:59):
We actually trained one. It was kind of a side project; we didn't use it in any form or fashion officially, but I trained one to basically imitate someone I knew. I did a lot of training on it and basically gave it the location and the position of the person.
(09:16):
And you wouldn't confuse the two, don't get me wrong, but if you didn't know that person and you were to talk to that chat client, you would think it's a real person, more so than not.
(09:25):
I found a thing on Reddit, and I was using it for a different purpose than the people on Reddit were. They had prompt training to make the AI sound so human that they couldn't get caught cheating in school, I believe was the use case.
(09:41):
I just wanted something that didn't sound like an AI for text that I pumped out. So I utilized that, and, you know, I would use it for some things.
(09:50):
And when you would read it, I mean, it would even dumb itself down, using lower-level words and stuff.
(09:55):
And I've found, for me at least, if I need a little more warmth in it, I'll actually add "and be very human-like" to the prompt.
(10:05):
And that's whenever you do see it. You're right, it's a very catch-22 thing, because we want to use AI to enhance our abilities, right? There's a lady that works with me that is a very good writer, and I joke around, I say, hey, we'll just call her Anne.
(10:21):
I say, AI makes me sound like Anne.
(10:24):
Because she's just such a good writer, right? Yeah.
(10:26):
But then, you know, at the same time, the people that know you, know you. You're not gonna pull a quick one over on them.
(10:37):
Oh, Anthony all of a sudden got a lot smarter, or, you know, he doesn't make any grammar mistakes anymore. He used to make them, like, every other email. Right.
(10:43):
But for people that know you, how do you feel about that? I think there are just a lot of different responses to people that use AI to help them in their job.
(10:54):
I think there are some people that almost discount the work, because they feel like you're taking a shortcut, kind of like cheating, right? You're like, oh, this wasn't really you.
(11:05):
But I mean, the impact is gonna be there. I don't know.
(11:08):
Someone told me, and I don't know where I heard this, whether it was a previous mentor or what, but I heard someone say something I've really kept in my head: things are never as good as they seem, and they're never as bad as they seem.
(11:18):
So say someone predicts AI is gonna take over, let's say, 20% of roles. I think I've heard, and you correct me if I'm wrong, 20% of jobs within 10 years. That feels like the number I've heard, or something around there. It could be different depending on the person, but it seems like that's, I'm gonna call it, the reasonable take.
(11:35):
Yeah.
(11:36):
It could be 30. You know, there are people saying we've got five years as a society, and there are people that say this is 30 years out. Somewhere in the middle is where we'll land. I'm sure it won't be 30 years out, and I'm sure it won't be next year.
(11:47):
I have found that's typically the way things kind of flesh out. They definitely happen.
(11:51):
I'm not naive enough to think there's not gonna be impact from AI. There a hundred percent will be.
(11:58):
But will it be as big as some people claim? Probably not. Will it be somewhere in the middle, somewhere that really nobody knows? Probably.
(12:06):
But at the same time, at least for us now, and I can't speak for everyone in every role and every career, but for us, we literally are not able to get done all that we need to get done. So at least for now, we see it as a blessing.
(12:19):
We're thinking to ourselves, there's a hundred percent of what we wanna do, and maybe we're getting, I'm of course gonna quote a number, 70% of what we want to do done. We need that other 30% done too.
(12:31):
So this could be a game changer in that regard.
(12:33):
One thing that recently happened too, and I didn't know this until our executive assistant was telling me about it, is the One Big Beautiful Bill. You know, there's a lot of news about that.
(12:46):
But one of the things in there was a provision on AI, basically where they can't roll out any laws or regulations to curtail AI.
(12:55):
You know, I know there's already existing stuff in place, but that's something I think about, and people are saying that's a pretty big deal.
(13:02):
And it makes me think about the Hollywood writers that were striking a couple years ago. Do you remember that, Tanner?
(13:08):
They had AI in their demands. I can't remember the specifics, but there was something in there where they didn't want the use of AI to be writing stuff. Yeah.
(13:18):
And that was a few years ago. And think about it: that was pretty smart of them to be thinking about that at the time.
(13:25):
How much of an impact do you think regulation is gonna have?
(13:29):
I would say I think there's gonna be some regulation.
(13:32):
I don't know to what extent, because, and maybe this is just a pessimistic view of society, typically if there's money to be made in something, people are gonna jump at it.
(13:42):
I've seen a lot of critique of Sam Altman right now, because early on in his OpenAI career he talked about how he felt like they were gonna be nonprofit, how they were gonna kind of be the AI for the good of the world.
(13:56):
And now that there's substantial money to be made, it seems like he may be changing course somewhat.
(14:03):
So it's tough for me to lean on the morality of society to regulate it to the point it might need to be.
(14:11):
At the same time, I don't know.
(14:12):
I mean, throughout history, I try to be an optimist and a pessimist at the same time.
(14:17):
I try to look through that lens with any important decision in my life: what's the worst that could happen, and what's the best that could happen? Throughout history, there have been many, many times someone thought the entire human race would be wiped out.
(14:32):
There was an industrial revolution. You know, literally, at one point in society, every single person, or nearly every single person, farmed.
(14:41):
And now we don't do that anymore.
(14:43):
Yeah.
(14:44):
A good example I heard the other day: prior to 2005 or so, mobile app developer wasn't even a job that existed. And that's 20 years ago.
(14:54):
We're only talking about 20 years ago. There are jobs now that didn't exist 20 years ago.
(14:59):
Are there other jobs out there? Are there things that we wanna accomplish as a human race, or as human beings, that we haven't been able to accomplish for whatever reason, because we don't have maybe the time or ability to? And someone's gonna say, well, you know, AI could think of the ideas for us.
(15:19):
I don't know. I'm curious to see kind of where we go with AI, but I still think the genius of AI is actually the combination and kind of collection of human knowledge, human thought, and human discussion.
(15:35):
I read an article the other day that talked about how they were having trouble training LLMs; they were running out of original data, like original Reddit threads, because there are only so many made per day, Twitter, so on and so forth.
(15:53):
They needed, like, a lot more to train on. So they actually started creating synthetic data sets to train the models on, and the performance was way worse.
(16:03):
They were not getting the effect of having those human conversations to lean on.
(16:10):
So I don't know. I think I'm in the middle.
(16:13):
Right now, to me, AI can help enhance and take care of a lot of those easy, but time consuming, things, is maybe the way I'll put it.
(16:24):
And then obviously some advanced stuff too. But why do we do analysis? Why do we think about some cybersecurity initiative that we're doing? It's because there's something out there we're trying to respond to. Are we at a point where we kind of say, hey, AI, I'm trying to run a cybersecurity program, what do I need to focus on? Is AI gonna have the conversations with the vendors? Maybe when we get to agent-based AI, that's a situation.
(16:48):
Yeah.
(16:48):
Is AI gonna implement the software? Maybe, I don't know.
(16:51):
I mean, in my opinion, and I'm sure I'll be wrong on the exact year, I feel like we're still a ways away from that kind of level.
(17:00):
But as far as being able to collect your thoughts, organize yourself, and even do analysis of those other things, we're close. I don't know that we're there yet.
(17:11):
We've been doing some testing with trying to do natural language queries.
(17:15):
And I'm gonna be honest, we've been really disappointed with where AI is right now.
(17:20):
You can train AI on data, and maybe that's part of the problem: AI is trained on such a small subset of human data. Maybe that's a better way to say it.
(17:31):
Because it's trained on conversations we have online.
(17:34):
And yes, some of that covers the big things, right? Like, how do I plumb a house, or how do I fix a toilet? Maybe those conversations have been had online.
(17:45):
But to me, at least in our field, a lot of those conversations have never been had. What data set? Thinking from our perspective as co-ops, what data set has anyone put out on the internet talking about what kind of resiliency we wanna put in our grid, where our outages are in relation to it, and why we need to do additional work on certain circuits to increase reliability?
(18:17):
That conversation's never been had online.
(18:19):
That's an AI data set that does not exist. ChatGPT doesn't know about that. It couldn't tell me anything about most of our systems.
(18:27):
And even to train it on our systems is almost impossible.
(18:31):
Unless you wanna spend an egregious amount of money doing a private AI model and then actually training it on the data. But then it doesn't understand the data, and you're having to describe the data. I would just say we have been disappointed with how fast we could move in that regard.
(18:46):
And we think this is probably one of the most powerful things we want to use AI for: being able to have people do analysis with just simple questions.
(18:56):
Reliability would be a great example. Like our engineers sitting there saying, we need to increase reliability in this particular district; what are the top 10 ways we can do it, and what's the best dollar-to-value ratio we can get? We can't do that right now.
(19:09):
I would love to do that. I think that's awesome. And then we could take more time actually doing those things.
229
00:19:16,144.6666667 --> 00:19:19,544.6666667
Less time, thinking about that, but I don't know.
230
00:19:19,544.6666667 --> 00:19:20,294.6666667
We're not there yet.
231
00:19:20,294.6666667 --> 00:19:23,504.6666667
I'm, that could be a year from now for all I know, or two years.
232
00:19:24,494.6666667 --> 00:19:27,794.6666667
I'm impressed every day and then I'm frustrated in some ways every day.
233
00:19:28,44.6666667 --> 00:19:30,594.6666667
We've been trying to do some training.
234
00:19:30,774.6666667 --> 00:19:35,874.6666667
On our own data with our local AI model, which I know you guys have too.
235
00:19:36,564.6666667 --> 00:19:48,54.6666667
And not gonna lie, it had a lot of hallucinations, but I've seen enough times where it did work, and I was just completely baffled by how efficient it was.
236
00:19:48,594.6666667 --> 00:19:55,704.6666667
One was just the very first time I messed with data, which was with ChatGPT, and it was synthetic.
237
00:19:55,704.6666667 --> 00:19:56,664.6666667
It was data that.
238
00:19:57,34.6666667 --> 00:20:05,424.6666667
I downloaded like it was a table and I copied it and put it into a spreadsheet and it was people traveling and, you know, it was a huge list.
239
00:20:05,424.6666667 --> 00:20:21,354.6666667
And looking at it, you know, like, I was like, okay, this is just a bunch of travel dates for people, you know, and it talked about the flight and where they were going to and from and looking at it, you know, I couldn't make a whole lot out of it, but I pumped it in ChatGPT and had it run analytics against it.
240
00:20:22,164.6666667 --> 00:20:31,314.6666667
What things do you see? Do you see anything wrong with this? And basically, just what kind of trends and analysis do you see from this? I can't recall what the trends were, but it was pretty impressive.
241
00:20:31,314.6666667 --> 00:20:34,414.6666667
It was talking about the people that were doing the most and stuff like that.
242
00:20:34,954.6666667 --> 00:20:40,844.6666667
But when I asked about discrepancies, out of this huge list, it found someone that was double booked on a flight.
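The duplicate check described above is straightforward to sketch in Python. The column names and sample rows here are invented for illustration; a real travel list would have its own schema.

```python
import csv
import io
from collections import Counter

# Hypothetical travel data, similar in shape to the list described above.
# The traveler/flight/date columns are made-up placeholders.
SAMPLE = """traveler,flight,date
Alice,UA101,2024-03-01
Bob,DL202,2024-03-01
Alice,UA101,2024-03-01
Carol,AA303,2024-03-02
"""

def find_double_bookings(csv_text):
    """Return (traveler, flight, date) combinations that appear more than once."""
    rows = csv.DictReader(io.StringIO(csv_text))
    counts = Counter((r["traveler"], r["flight"], r["date"]) for r in rows)
    return [key for key, n in counts.items() if n > 1]

print(find_double_bookings(SAMPLE))  # → [('Alice', 'UA101', '2024-03-01')]
```

This is the kind of scan an AI does in one pass over the whole list, which is why it spots the one double booking a human skimming thousands of rows would likely miss.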
243
00:20:41,609.6666667 --> 00:20:49,109.6666667
Another example, we did a, uh, I'm trying to remember it, but it was basically our DNS logs from our Active Directory.
244
00:20:49,529.6666667 --> 00:20:55,769.6666667
And we had an internal IP that was hitting a Russian website.
245
00:20:56,189.6666667 --> 00:21:01,229.6666667
And we're trying to figure out, you know, one, who is it? And we had a couple people look through that log.
246
00:21:01,229.6666667 --> 00:21:02,189.6666667
Now, this log was huge.
247
00:21:02,189.6666667 --> 00:21:03,619.6666667
It was like a 1.7
248
00:21:03,619.6666667 --> 00:21:05,429.6666667
gig document, Yeah.
249
00:21:05,534.6666667 --> 00:21:25,84.6666667
And they looked for a half hour and couldn't find the offending IP. I pumped that thing into the AI and it found it within, like, it was under a minute. For, you know, cybersecurity, if you're looking through log files, tell me, if it works and it's accurate, there's no way a human could compete with AI.
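For comparison, the same lookup can also be scripted. This is a hedged sketch only: DNS debug log formats vary, so the regex, the sample lines, and the `badsite.ru` domain are illustrative placeholders, not the actual log described above. Streaming line by line keeps memory flat even on a 1.7 gig file.

```python
import re

# Matches a dotted-quad IPv4 address anywhere in a log line.
IP_RE = re.compile(r"\b(\d{1,3}(?:\.\d{1,3}){3})\b")

def find_clients(log_lines, suspect_domain):
    """Yield (line_number, client_ip) for lines mentioning the suspect domain."""
    for lineno, line in enumerate(log_lines, start=1):
        if suspect_domain in line:
            m = IP_RE.search(line)
            if m:
                yield lineno, m.group(1)

# With a real file you would stream it rather than load it:
# with open("dns.log", errors="ignore") as f:
#     hits = list(find_clients(f, "badsite.ru"))
sample = [
    "10:01 10.0.0.5 query A example.com",
    "10:02 10.0.0.99 query A badsite.ru",
]
print(list(find_clients(sample, "badsite.ru")))  # → [(2, '10.0.0.99')]
```

The point either way is the same one made in the episode: whether it's an AI or a script, the answer still has to be validated against the raw log before anyone acts on it.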
250
00:21:25,464.6666667 --> 00:21:37,144.6666667
Can I challenge you on something? Yeah. What if it didn't find something? So now, when that happened, I a hundred percent validated it in the text file, like I did not trust it blindly.
251
00:21:37,174.6666667 --> 00:21:43,674.6666667
'Cause when I saw it, I was like, I'm not trusting this blindly, and that's something that I think people can get lackadaisical on.
252
00:21:43,724.6666667 --> 00:21:43,944.6666667
Yep.
253
00:21:44,94.6666667 --> 00:21:45,504.6666667
And not double check their stuff.
254
00:21:45,534.6666667 --> 00:21:50,534.6666667
People doing stuff like, for social media posts, like you've gotta be checking everything. Yep.
255
00:21:50,694.6666667 --> 00:21:51,454.6666667
you're pumping out.
256
00:21:52,364.6666667 --> 00:21:53,924.6666667
Yeah, that's a big thing for us.
257
00:21:53,974.6666667 --> 00:21:56,134.6666667
as we start to deploy more AI.
258
00:21:56,759.6666667 --> 00:21:57,779.6666667
the responsibility of work.
259
00:21:57,809.6666667 --> 00:21:58,99.6666667
Yeah.
260
00:21:58,204.6666667 --> 00:21:58,784.6666667
thing, Yeah.
261
00:21:59,284.6666667 --> 00:22:14,234.6666667
I see a lot of benefit now, at least in what we do, and even in the small-medium business world. I think there are times in a job where you kind of have a very, like, regular cadence to what you're doing, right? And then there are periodic times where you have to, like, scale out what you're doing.
262
00:22:14,789.6666667 --> 00:22:20,849.6666667
For a particular task and then kind of scale back down to a certain level, like log review, that would be a good example.
263
00:22:20,949.6666667 --> 00:22:29,769.6666667
I think if you're doing a lot of, policy creation, maybe proofreading, like there are certain times where it just feels like you could have seven people help you out and doing whatever you're doing.
264
00:22:30,269.6666667 --> 00:22:34,859.6666667
And then there's certain times where you're like, if someone is helping me out right now, I'd have to be explaining everything I'm doing to them.
265
00:22:35,169.6666667 --> 00:22:37,389.6666667
and I'm not saying we're not gonna, we are gonna improve.
266
00:22:37,389.6666667 --> 00:22:46,899.6666667
I mean, new AI is gonna be coming out, but for right now, the times where you have to scale yourself, where you're like, you wish you had 10 of yourself, AI is great at that.
267
00:22:47,229.6666667 --> 00:22:48,669.6666667
It is like superb.
268
00:22:48,669.6666667 --> 00:22:51,959.6666667
Any of that kind of stuff where it's you know, I wish I had, I need to write this thing.
269
00:22:51,959.6666667 --> 00:22:55,139.6666667
I wish I had three writers that worked for me and I wish I had someone to proofread it.
270
00:22:55,139.6666667 --> 00:22:59,579.6666667
And then I wish I could just say yes or no on what they find on what product they put out.
271
00:23:00,104.6666667 --> 00:23:01,304.6666667
AI is awesome at that.
272
00:23:01,319.6666667 --> 00:23:05,864.6666667
I mean, it can sometimes be a 5x, 10x time savings.
273
00:23:06,364.6666667 --> 00:23:15,604.6666667
What I've found is we've all had a colleague come up to us and be like, Hey, can I help you out in something? You're like, you kind of know that, like if they helped you, it would only help you like 5% more.
274
00:23:16,264.6666667 --> 00:23:19,784.6666667
And for those times, I have not found AI to be beneficial yet.
275
00:23:19,784.6666667 --> 00:23:20,954.6666667
I'm not saying it won't get there.
276
00:23:21,114.6666667 --> 00:23:25,169.6666667
but there are times whenever, and, maybe that's specific to job role.
277
00:23:25,169.6666667 --> 00:23:26,839.6666667
I mean, there, there could be a lot of things there.
278
00:23:26,839.6666667 --> 00:23:44,219.6666667
But what I hope, at least what I hope for us in our industry, is that we can take those times where we really need to scale ourselves, and we can have more time for the work where we really know what we're doing and don't necessarily need 10 of ourselves.
279
00:23:44,409.6666667 --> 00:23:45,819.6666667
we can have more of that time.
280
00:23:45,869.6666667 --> 00:23:47,339.6666667
More strategic focus.
281
00:23:47,839.6666667 --> 00:23:50,299.6666667
because there's just a lot of things you have to explain.
282
00:23:50,299.6666667 --> 00:24:02,909.6666667
I was thinking the other day, we were working through an issue with payments, and I've worked at the co-op 17 years, and I don't know that I could accurately explain all the different possible scenarios I've come across.
283
00:24:03,29.6666667 --> 00:24:08,219.6666667
I remember them, if they come up again, they like kind of pop up in my mind, but like, I don't think I could write it out in a book.
284
00:24:09,149.6666667 --> 00:24:09,689.6666667
Like right now.
285
00:24:09,689.6666667 --> 00:24:10,409.6666667
I couldn't write it out.
286
00:24:10,409.6666667 --> 00:24:22,89.6666667
I probably couldn't even write you two pages of stuff, but if it hits me, if I've experienced it in the past, like it's all of a sudden there's like an unlocked memory page, right? That unlocks like, oh yeah, I remember that, and then we did this, and this is what happens.
287
00:24:22,899.6666667 --> 00:24:25,509.6666667
right now I don't know how we tap into that.
288
00:24:26,9.6666667 --> 00:24:31,254.6666667
I think it would be kind of interesting to do that because I think it would really help with onboarding, you get a new person and.
289
00:24:32,34.6666667 --> 00:24:33,624.6666667
have so much experience walk out the door.
290
00:24:33,624.6666667 --> 00:24:36,474.6666667
We talk about this a lot in our world because people stay a long time.
291
00:24:37,344.6666667 --> 00:24:50,474.6666667
and it just feels like you kind of never get that, or even if you do get it, someone's gotta be there 10 years before they kind of get that, you know, that experience level that really starts to impact and help their job.
292
00:24:50,474.6666667 --> 00:24:52,154.6666667
I mean, do you agree with that? No.
293
00:24:52,154.6666667 --> 00:24:52,484.6666667
Or.
294
00:24:53,204.6666667 --> 00:25:08,704.6666667
No, we definitely have some of that. You know, we've got people that have been on the system for 30-plus years. We've got one person I know of where he's built a lot of the infrastructure for our electrical grid, or been part of it, and he's just got that core knowledge. The guy's name's Dale.
295
00:25:08,704.6666667 --> 00:25:11,134.6666667
Like I can ask Dale something and he can spout it off.
296
00:25:11,134.6666667 --> 00:25:14,464.6666667
He's almost like an AI guy 'cause he remembers everything.
297
00:25:15,424.6666667 --> 00:25:20,19.6666667
Um, but you know, if I had to go look through the diagrams to find where something's at, good luck with that.
298
00:25:20,519.6666667 --> 00:25:26,899.6666667
Yeah, and there's just so much right now. I bet, of the documentation we need to have,
299
00:25:27,19.6666667 --> 00:25:28,669.6666667
We have 10% of what we need.
300
00:25:29,169.6666667 --> 00:25:33,84.6666667
There's 90% of the stuff that we do on like an everyday basis that is not written down anywhere.
301
00:25:33,384.6666667 --> 00:25:33,684.6666667
Yeah.
302
00:25:34,29.6666667 --> 00:25:35,859.6666667
It is like built into people's minds.
303
00:25:35,859.6666667 --> 00:25:38,679.6666667
It's passed down from generation to generation.
304
00:25:38,679.6666667 --> 00:25:42,429.6666667
It's experience learned over the last like 10 years.
305
00:25:42,819.6666667 --> 00:25:44,229.6666667
I'm not saying that's necessarily a good thing.
306
00:25:44,229.6666667 --> 00:25:45,909.6666667
I think we need to document better than we do.
307
00:25:46,629.6666667 --> 00:25:50,219.6666667
But how would it, like right now, let's just say, would we have someone like.
308
00:25:50,778.4166667 --> 00:25:51,169.6666667
The problem.
309
00:25:51,169.6666667 --> 00:25:59,24.6666667
Like I said, the problem is I don't think you could take someone like you, you've dealt with so many things at Fort County, but I couldn't just take you into a room and be like, tell me everything you dealt with at Fort County.
310
00:25:59,524.6666667 --> 00:26:01,114.6666667
Like it can't just be pulled like that.
311
00:26:01,114.6666667 --> 00:26:02,254.6666667
We're not a database.
312
00:26:03,154.6666667 --> 00:26:05,194.6666667
we kind of write some down periodically.
313
00:26:05,194.6666667 --> 00:26:07,294.6666667
We can't just do like a dump, brain dump.
314
00:26:07,354.6666667 --> 00:26:08,404.6666667
Alright, here's everything I know.
315
00:26:09,364.6666667 --> 00:26:12,904.6666667
It's unfortunate, I guess, sometimes, that it would be awfully handy to remember something.
316
00:26:12,964.6666667 --> 00:26:25,364.6666667
You know, half the time we're like, what was that thing again? But how do you get that lost documentation, let's say, into a model? And that's where I see, I think we're getting there, but there's still some hurdles to overcome.
317
00:26:25,514.6666667 --> 00:26:31,994.6666667
I think it, at least like what we're seeing, generally across the board, I think there's.
318
00:26:32,339.6666667 --> 00:26:35,659.6666667
There's some amazing stuff happening, but as it gets more specific, I think it gets tougher.
319
00:26:36,449.6666667 --> 00:26:38,399.6666667
maybe kind of like a bell curve, if that makes sense.
320
00:26:38,459.6666667 --> 00:26:38,789.6666667
Yeah.
321
00:26:38,909.6666667 --> 00:26:42,989.6666667
As we got kind of the early stuff, it was really hard; as we're kind of getting into the meat of it,
322
00:26:43,419.6666667 --> 00:26:55,209.6666667
it's getting easier and we're taking in, like, so much. But as we kind of really drive toward that, like, blow-you-away value, that is where, at least, we are and what we've experienced so far.
323
00:26:55,509.6666667 --> 00:26:57,159.6666667
I'm not saying that won't change because.
324
00:26:58,29.6666667 --> 00:27:01,719.6666667
Things are changing, but just not, we don't see that there right now.
325
00:27:01,929.6666667 --> 00:27:03,309.6666667
Maybe it's tomorrow, I don't know.
326
00:27:04,794.6666667 --> 00:27:06,459.6666667
Yeah, I mean, we were talking earlier.
328
00:27:06,879.6666667 --> 00:27:10,599.6666667
I do think as far as AI is concerned, we're in its infancy.
329
00:27:11,139.6666667 --> 00:27:26,939.6666667
There was a podcast that I was listening to, I listen to way too many podcasts, but one I was listening to, it was basically an entrepreneur and CEO, and he had, like, thousands of blog posts and thousands of hours of YouTube videos.
330
00:27:27,539.6666667 --> 00:27:28,949.6666667
And so he had it like.
331
00:27:29,574.6666667 --> 00:27:37,144.6666667
basically transcribe all of his YouTube videos, and then it would ingest the data from his blogs.
332
00:27:37,144.6666667 --> 00:27:43,194.6666667
And he actually uses it kind of like a virtual version of himself to handle some of the smaller things.
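The core of that "virtual self" idea is retrieval: given a question, pull back the most relevant chunk of what the person has already said, then let the model answer from it. Real systems use vector embeddings for the matching step; this toy sketch uses plain word overlap just to show the shape, and the chunks below are invented placeholders, not real transcript content.

```python
# Toy retrieval over transcript chunks using simple keyword overlap.
def tokenize(text):
    """Lowercase and split into a set of words."""
    return set(text.lower().split())

def best_chunk(question, chunks):
    """Return the chunk sharing the most words with the question."""
    q = tokenize(question)
    return max(chunks, key=lambda c: len(q & tokenize(c)))

# Stand-ins for chunks of transcribed videos or blog posts.
chunks = [
    "pricing strategy for early stage products",
    "how we hire engineers and onboard them",
    "my morning routine and productivity habits",
]
print(best_chunk("how do you hire and onboard engineers", chunks))
```

In a production setup, the matched chunk would then be handed to the model as context, which is how the AI can answer "in his voice" about things it was never trained on.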
333
00:27:43,199.6666667 --> 00:27:46,554.6666667
And as far as the training, I gave a couple good examples.
334
00:27:46,554.6666667 --> 00:27:49,824.6666667
I can give you some terrible examples in which AI.
335
00:27:50,184.6666667 --> 00:27:53,754.6666667
Said that it was telling me the truth, and I said, you're hallucinating, stop hallucinating.
336
00:27:54,64.6666667 --> 00:27:58,204.6666667
and it would say, okay, I will reference your documentation and will not do that again.
337
00:27:58,594.6666667 --> 00:28:00,684.6666667
And it would still come up with a hallucination after that.
338
00:28:00,934.6666667 --> 00:28:04,744.6666667
I've seen the bad stuff and I don't trust anything that's coming out without validating it.
339
00:28:05,314.6666667 --> 00:28:09,294.6666667
But, just being in the early years, I don't think it's gonna be that much longer before we get there.
340
00:28:09,294.6666667 --> 00:28:10,884.6666667
Now, do I think it's gonna be a year? No.
341
00:28:11,844.6666667 --> 00:28:14,574.6666667
But I think in five years, that's gonna be a big thing.
342
00:28:14,579.6666667 --> 00:28:23,69.6666667
So I think with big companies, you know, a lot of them are gonna start having either local AI or cloud instances and be able to, to train off their data.
343
00:28:23,69.6666667 --> 00:28:29,609.6666667
And I was just thinking, you know, down the road, like the smaller companies were, you know, in some ways we are advanced, but in some ways we're not.
344
00:28:30,119.6666667 --> 00:28:40,679.6666667
But you know, I think for like small medium organizations, if we had, you know, if it was fiscally feasible, you know, to have a local AI and to train it and all of our stuff like that would be awesome.
345
00:28:40,679.6666667 --> 00:28:43,649.6666667
Like if, so Dale, I'll, I'll throw my Dale example now.
346
00:28:43,979.6666667 --> 00:28:44,459.6666667
Granted.
347
00:28:45,359.6666667 --> 00:28:55,109.6666667
Probably some percentage is in his Outlook account, but if it could ingest all of his Outlook emails and take that correspondence and have it, now, I'm not saying we would wanna do that for a lot.
348
00:28:55,109.6666667 --> 00:29:08,339.6666667
There's a lot of reasons why you wouldn't wanna do that, but, you know, there's some stuff where we've had some major construction and stuff, and that data is in Outlook, you know, and he references it kinda like a filing cabinet.
349
00:29:08,399.6666667 --> 00:29:12,989.6666667
And you know, there's a whole lot of people where, you know, you can't record them with AI and stuff like that, but.
350
00:29:13,664.6666667 --> 00:29:31,454.6666667
Like if you had all these meetings, say with our FIS migration that we did, our financial information system migration, if we had every single call transcribed by AI and could ask it questions, like, validate stuff, that would be a huge help to kind of reference things.
351
00:29:31,484.6666667 --> 00:29:33,224.6666667
And then just all the changes we had.
352
00:29:33,494.6666667 --> 00:29:41,924.6666667
We did have one little, uh, hiccup where we made a bunch of permission changes and we didn't document it, and they did an update and we lost
353
00:29:42,899.6666667 --> 00:29:45,719.6666667
Some of those individual permissions and then we went, rolled it out.
354
00:29:45,719.6666667 --> 00:29:53,459.6666667
We had some permission issues, you know, but if we had like all these meetings recorded and stuff like that, and have that, like that would've been super easy to recover.
355
00:29:53,459.6666667 --> 00:29:57,899.6666667
Now granted, it's kinda like you said, we should just do a better job of documenting ourselves.
356
00:29:57,899.6666667 --> 00:30:01,799.6666667
But you know, like you said, we just don't have enough people to do everything we want to do.
357
00:30:01,799.6666667 --> 00:30:07,829.6666667
Let's play it out in a world, let's say the worst of the worst happens and AI starts taking everyone's job.
358
00:30:08,744.6666667 --> 00:30:17,54.6666667
Do you think humans would fight back by basically not recording their conversations? By like doing everything in person? Yeah, talking in person.
359
00:30:17,54.6666667 --> 00:30:18,974.6666667
Never putting anything in Outlook.
360
00:30:18,974.6666667 --> 00:30:23,264.6666667
Never putting anything in office chat, like forcing themselves to go analog.
361
00:30:23,264.6666667 --> 00:30:24,884.6666667
Like it made me think about it.
362
00:30:24,884.6666667 --> 00:30:28,874.6666667
A while back, Israel pulled off this thing where they.
363
00:30:29,504.6666667 --> 00:30:34,934.6666667
Blew up all the pagers of, uh, and I can't even remember the details now, but they created pagers.
364
00:30:34,964.6666667 --> 00:30:35,174.6666667
Yeah.
365
00:30:35,204.6666667 --> 00:30:39,164.6666667
Basically go off at a certain time that would kill certain people.
366
00:30:39,644.6666667 --> 00:30:57,734.6666667
The reason they did that is because they got so advanced in hacking the cellular transmissions of this group, that the group actually went analog, and because they went back to analog pagers, it actually became more difficult to track them and to like intercept their conversations and to figure out what was going on.
367
00:30:58,229.6666667 --> 00:31:02,339.6666667
So let's just say, I don't know, let's say we get, we get into Terminator.
368
00:31:02,339.6666667 --> 00:31:06,479.6666667
We're Terminator Skynet, it's, it's taken over.
369
00:31:07,469.6666667 --> 00:31:10,949.6666667
Is that, do we fight, do humans fight back that way? I guess, I don't know.
370
00:31:11,489.6666667 --> 00:31:24,29.6666667
Yeah, like I see what you're saying and you know, there's instances of that, but I guess, I guess it depends on where we're at, but just the, we could go back to pen and paper any day and you know, if you have some natural disaster or.
371
00:31:25,79.6666667 --> 00:31:27,899.6666667
You get hacked, you may be forced to go back to pen and paper.
372
00:31:28,709.6666667 --> 00:31:31,409.6666667
The reason we don't do that is just because it's so inefficient.
373
00:31:31,919.6666667 --> 00:31:32,309.6666667
Yeah.
374
00:31:32,369.6666667 --> 00:31:38,369.6666667
But could we, you know, think about it, like wouldn't humans make themselves less efficient? I don't know.
375
00:31:38,579.6666667 --> 00:31:44,909.6666667
At the same time, maybe the flip side of that is that makes AI even more efficient, right? Yeah.
376
00:31:45,134.6666667 --> 00:31:45,814.6666667
I was getting like.
377
00:31:47,159.6666667 --> 00:31:48,569.6666667
Five x return.
378
00:31:48,569.6666667 --> 00:31:51,899.6666667
Now everything I'm doing is getting a 10 x return because of the inefficiency.
379
00:31:52,349.6666667 --> 00:31:52,769.6666667
I don't know.
380
00:31:52,769.6666667 --> 00:31:56,129.6666667
I mean, I think there does need to be a discussion on healthy regulation.
381
00:31:56,429.6666667 --> 00:32:01,919.6666667
I don't know where it is, and maybe this is just me being pessimistic about the world.
382
00:32:02,129.6666667 --> 00:32:10,79.6666667
I was talking with my dad about this a while back and it's like we're getting to a point in our world where if someone states something, someone writes an article.
383
00:32:10,919.6666667 --> 00:32:14,729.6666667
I just saw a, a podcast about a guy who says he believes that.
384
00:32:15,314.6666667 --> 00:32:18,884.6666667
You know, he needs to talk about AI because he thinks like the worst of the worst is gonna happen.
385
00:32:19,184.6666667 --> 00:32:26,864.6666667
Are we getting so cynical or so pessimistic about the world that I don't honestly know that I hardly can believe anyone? Yeah.
386
00:32:27,44.6666667 --> 00:32:41,684.6666667
I'm like, I'm trying to, I'm sitting here and I see that thing come up, and then the back of my mind goes to what's his angle? What's his angle? Was he like ostracized by the community? Does he feel like he really deserves more credit for it than he did? And so now he's gonna try to make more of a name for himself.
387
00:32:41,684.6666667 --> 00:32:43,124.6666667
I think about politicians a lot and.
388
00:32:43,649.6666667 --> 00:32:47,519.6666667
A lot of times they say stuff they don't even believe because they just like want more clout.
389
00:32:47,999.6666667 --> 00:32:55,679.6666667
And I've begun to wonder, have we gotten to a point where, like, the selfish desires of an individual person are such that it's hard to understand what to believe?
390
00:32:56,369.6666667 --> 00:32:59,459.6666667
Do we just not, we almost don't believe anything that is said.
391
00:32:59,939.6666667 --> 00:33:10,19.6666667
And so it's hard to know, are we in a lot better place with AI, or are we in a lot worse place with AI? Because I don't know that I can believe anyone that's published anything about it.
392
00:33:10,199.6666667 --> 00:33:11,549.6666667
Does that make sense? I don't know.
393
00:33:11,849.6666667 --> 00:33:12,239.6666667
Yeah.
394
00:33:12,764.6666667 --> 00:33:16,364.6666667
One of the positive things, and I, I've heard, I've heard it both ways.
395
00:33:16,364.6666667 --> 00:33:19,574.6666667
I've heard doom and gloom and I've heard positive things about this.
396
00:33:19,574.6666667 --> 00:33:26,144.6666667
And like I said, I listen to way too many podcasts, but I don't think this is, uh, too controversial.
397
00:33:26,144.6666667 --> 00:33:34,124.6666667
But a lot of the countries, I don't think the United States is impacted right now, but people today are not having as many kids.
398
00:33:34,214.6666667 --> 00:33:41,234.6666667
And in certain countries I've heard, especially with like China where they had the one child rule and uh, even Russia.
399
00:33:41,894.6666667 --> 00:33:44,354.6666667
There's several countries where they just don't have enough young people.
400
00:33:44,984.6666667 --> 00:33:54,254.6666667
And if, you know, even in the US like, you know, I, I think, you know, if you look at the amount of boomers versus, you know, the whatever generation we're on now, like there's just a lot less of them.
401
00:33:55,124.6666667 --> 00:34:01,34.6666667
And you know, for right now, I'd say AI's gotta be looked at as a positive thing.
402
00:34:01,214.6666667 --> 00:34:07,454.6666667
And there was always the talk, you know, before AI, like people were talking about the population dwindling.
403
00:34:07,889.6666667 --> 00:34:12,539.6666667
You know, and it causing issues, especially in certain countries, not so much the US but some of the other countries.
404
00:34:13,349.6666667 --> 00:34:19,529.6666667
And this is the, you know, the invention, the genius that's gonna help, you know, kind of bridge that gap.
405
00:34:19,619.6666667 --> 00:34:26,489.6666667
'Cause, you know, it's no secret everyone agrees that AI can be a force multiplier in some aspects.
406
00:34:26,759.6666667 --> 00:34:27,59.6666667
Yeah.
407
00:34:27,179.6666667 --> 00:34:30,359.6666667
So, like, short term, like I do kind of side on that.
408
00:34:30,359.6666667 --> 00:34:33,629.6666667
Like I, I hate to see someone automated out of a job.
409
00:34:34,214.6666667 --> 00:34:43,154.6666667
But you know, with less younger people like ai, you know, may help us get by and continue to, to carry the torch and progress.
410
00:34:43,454.6666667 --> 00:34:44,714.6666667
I mean, I agree with you.
411
00:34:44,714.6666667 --> 00:34:56,834.6666667
I think, I don't know, we were talking before we even started this, and this has been an interesting conversation by the way, but we've talked before about kind of breaking things down into maybe like short, intermediate, and long term.
412
00:34:57,434.6666667 --> 00:35:00,14.6666667
So short term, to me, I would say is maybe five years.
413
00:35:00,674.6666667 --> 00:35:06,194.6666667
Intermediate would be maybe like five to 15, and then long term would be, let's call it plus 15.
414
00:35:07,34.6666667 --> 00:35:07,904.6666667
Short term.
415
00:35:09,134.6666667 --> 00:35:10,664.6666667
There's definitely gonna be impact.
416
00:35:11,114.6666667 --> 00:35:15,254.6666667
I think exactly what you're talking about is gonna be huge.
417
00:35:15,254.6666667 --> 00:35:27,854.6666667
I think there's still a lot to learn, and I know AI agents are becoming a thing, and in reality, there's been a lot of improvements in AI, I would say, in the last couple years, but it still hasn't been the drastic jump.
418
00:35:27,929.6666667 --> 00:35:30,149.6666667
Yeah, it's gotten a lot more accurate.
419
00:35:30,149.6666667 --> 00:35:32,879.6666667
Like, uh, we've gotten more accurate, we can do more.
420
00:35:32,879.6666667 --> 00:35:34,409.6666667
There's a lot more different types of models.
421
00:35:34,409.6666667 --> 00:35:40,949.6666667
So I think there has been a huge jump, but I think compared to maybe some of the doomers early on, it hasn't been as much as that.
422
00:35:40,949.6666667 --> 00:35:41,9.6666667
Yeah.
423
00:35:41,69.6666667 --> 00:35:43,199.6666667
But it also hasn't been, you know, insignificant.
424
00:35:43,799.6666667 --> 00:35:49,764.6666667
So I think short term, there's gonna be us learning about what really is possible. The intermediate term,
425
00:35:49,889.6666667 --> 00:35:51,179.6666667
I see a lot of what you're talking about.
426
00:35:51,179.6666667 --> 00:35:52,709.6666667
I mean, you're gonna see people retire.
427
00:35:53,579.6666667 --> 00:36:13,319.6666667
And, you know, I think about co-ops especially, but any type of small-medium business organization, sometimes you can't find, I mean, how many times have you heard that? Like, we can't find someone for this role because it's too specialized and they don't wanna live out in the middle of nowhere, or, just last week, that'd be an advantage.
428
00:36:13,904.6666667 --> 00:36:14,624.6666667
Yeah, there you go.
429
00:36:15,14.6666667 --> 00:36:23,984.6666667
Is that gonna be an advantage? Like, maybe we can help someone who doesn't have quite the talent, so we still hire someone, we're just not hiring that expert.
430
00:36:23,984.6666667 --> 00:36:31,4.6666667
We're allowing AI to help us, like, create the expert, or at least offset some of that. Long term,
431
00:36:31,4.6666667 --> 00:36:34,4.6666667
I think there is gonna be some pretty drastic impact.
432
00:36:34,814.6666667 --> 00:36:36,194.6666667
I'll go back to my story.
433
00:36:36,764.6666667 --> 00:36:41,144.6666667
I just, I don't think I'm in a position and I don't think many people, almost nobody.
434
00:36:42,314.6666667 --> 00:36:56,984.6666667
is in a position to tell you what things will look like in 25 years? I think anytime someone tells you what something's gonna be like in 25 years with utmost certainty, you almost can completely discount them, because the world just doesn't work that way.
435
00:36:57,284.6666667 --> 00:37:04,604.6666667
So I don't know where we'll go, but I am a firm believer that like there are problems to be solved that we're not solving now.
436
00:37:04,904.6666667 --> 00:37:08,54.6666667
And so if we can spend more time solving that.
437
00:37:08,894.6666667 --> 00:37:13,964.6666667
Or even if AI helps us solve some of those problems, if we can have those people enact the solutions.
438
00:37:14,624.6666667 --> 00:37:19,184.6666667
Like I, I, you know, and I'm not saying robots, I've seen some of the robotics stuff too.
439
00:37:19,184.6666667 --> 00:37:20,144.6666667
It's come a long way.
440
00:37:20,144.6666667 --> 00:37:27,464.6666667
Uh, there's been some really cool stuff come out in robotics recently, but it's still, it is not the, the same right now.
441
00:37:27,524.6666667 --> 00:37:28,664.6666667
I think we're even farther off.
442
00:37:28,694.6666667 --> 00:37:30,74.6666667
Maybe I'll put it this way.
443
00:37:30,314.6666667 --> 00:37:32,834.6666667
I think we're farther off with robotics than we are with ai.
444
00:37:33,584.6666667 --> 00:37:33,644.6666667
Yeah.
445
00:37:33,644.6666667 --> 00:37:34,154.6666667
If that makes sense.
446
00:37:34,184.6666667 --> 00:37:35,714.6666667
So I still think there's gonna be.
447
00:37:36,194.6666667 --> 00:37:41,54.6666667
There's gotta be a lot of doing the solutions even if AI is doing a lot of analysis.
448
00:37:41,834.6666667 --> 00:37:50,684.6666667
Also, to your point, you gotta be checking up behind it. You need that industry knowledge to verify at least that it's in the ballpark.
449
00:37:51,164.6666667 --> 00:37:51,614.6666667
I don't know.
450
00:37:51,824.6666667 --> 00:38:01,874.6666667
I think I'll settle on what I settled on, which is that the impact is gonna take longer than a lot of people think before it happens, and shorter than many others think.
451
00:38:02,384.6666667 --> 00:38:04,394.6666667
I think there will definitely be impact, but I.
452
00:38:04,964.6666667 --> 00:38:23,894.6666667
I heard in a podcast recently, and it wasn't specifically talking about AI, but it said, human beings, I'm paraphrasing, human beings are terrible at preventing what's coming, but we're experts at adapting to what is happening. Basically, if AI's coming, we're not gonna prevent AI from happening.
453
00:38:24,344.6666667 --> 00:38:28,874.6666667
We're terrible at that kind of stuff, but we're very, very good at adapting to different circumstances.
454
00:38:29,114.6666667 --> 00:38:33,554.6666667
Um, and that's where I think, that's where I'll kind of leave us, at least with me, is.
455
00:38:34,409.6666667 --> 00:38:35,429.6666667
I think we will adapt.
456
00:38:35,429.6666667 --> 00:38:37,409.6666667
I don't know how that adaptation looks like.
457
00:38:37,799.6666667 --> 00:38:41,369.6666667
I don't know what that conversation looks like, but maybe, I don't know.
(38:41):
Who knows? Maybe for my kids' kids, we go back to farming and we just live a simple life. We grow our own food. Maybe that's where it lands, and we just leave all that other stuff alone. I don't know where we head and where we land, but I know one thing: I don't know exactly what's gonna happen.
Yeah.
(38:59):
There's a couple things short term I've heard some discussion on. Have you heard the term vibe coding?
Vibe coding? No, I haven't heard that one.
So it's the term for basically using AI to code, by either speaking to it or telling it what you want to do, and it just generates the code. Now granted, that's one of those things where, if you don't know how to code, how are you double-checking that?
Yeah.
You know, what's actually happening.
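That double-checking caveat can be sketched in code. This is a minimal illustration, not any particular AI tool's API: the `generated_code` string stands in for whatever an assistant produces, and the harness around it is the verification step a human (or their tests) still has to supply.

```python
def check_generated_code(code: str, func_name: str, cases: list[tuple]) -> bool:
    """Run AI-generated code in an isolated namespace and spot-check it
    against known input/output pairs before trusting it."""
    namespace: dict = {}
    exec(code, namespace)  # load the generated function
    func = namespace[func_name]
    return all(func(*args) == expected for args, expected in cases)


# Pretend this string came back from an AI coding assistant.
generated_code = """
def fahrenheit_to_celsius(f):
    return (f - 32) * 5 / 9
"""

# Without checks like these, a non-programmer has no way to know
# whether the generated code is even in the ballpark.
ok = check_generated_code(
    generated_code,
    "fahrenheit_to_celsius",
    [((32,), 0.0), ((212,), 100.0)],
)
print(ok)  # True here; a subtle bug in the generated code would make it False
```

The point is exactly the one raised above: the AI writes the function, but someone who understands the problem still has to supply the expected answers.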
(39:25):
That, and prompt engineering. I do think that's something that will come out of this. Not just going to ChatGPT and being told what to type, but someone being kind of like an AI specialist in the field. I could see that coming in the short term.
Long term, one thing that got a lot of publicity, I think around the 2020 election cycle, was Andrew Yang. I don't know if you remember him. He was an entrepreneur, and he talked about universal basic income. And you know, when I heard it, I was like, that seems kind of crazy. And his thing was, at some point there's just gonna be a bunch of people that are jobless. And this was, I'm pretty sure, before ChatGPT was known or even in existence. At the time, you got a lot of looks like you're crazy for proposing universal income for people. But now, long term, what happens if this many people are displaced? Kind of what you said earlier: I do think companies are gonna take advantage of this, and the rich are gonna get richer and the poor are gonna get poorer to some degree. And I think this will just really exacerbate that, or it can. Anyways, that's just something I was thinking about long term, but like you said, there's no way to know which way that's gonna go.
(40:50):
I agree with you, and I agree on the prompt engineering. As we've been thinking about deploying different things, I've talked to a lot of people that have never used AI before. In our industry, I'm gonna say maybe 10 or 15 percent have even used it, personally or otherwise.
Yeah.
So there's a lot of people still learning. But I think for now, and I don't know how it's gonna go, I'm gonna say for our organization, at least as far as I can influence it, that an organization that takes advantage of AI is going to outperform an organization that doesn't. Where everything lands 15 or 20 years from now, I can't tell you. What I can tell you is the organizations that just decide to put their head in the sand and don't do anything about it.
Yeah.
Are gonna fall behind.
Yeah.
Very drastically.
(41:42):
And to your point about the rich getting richer and the poor getting poorer, I don't mean richer just in financial terms, but richer in thought, or in vision and strategy. So for us, while I'm curious to see what regulation might take place and what impact we might see out of the next generation, agent-based AI seems like maybe it's the next generation of this. I don't know where it's gonna go, but I know that in my role, I've got to help lead us to embrace it. And if we don't, it's not like we're keeping ourselves from the impact. In some ways, it almost will impact us worse by ignoring it.
(42:29):
Yeah, I agree with that. This is one thing where, you know, we're trying to embrace it ourselves, but I'm also pushing our vendors to embrace it. I think that's one of those things you said: those that don't use it are gonna fall behind. And with our vendors, they're not just vendors to us. They're almost like partners, like it or not. So I really implore them to be investigating AI and leveraging it, to make themselves more efficient and also to help us do the things that we need to do or wanna do.
(43:04):
No, I mean, interesting conversation. I think we had a good little episode here. Maybe final thoughts? You got any final thoughts on the AI debate? Did we solve the world's problems tonight?
For those young people, I would definitely think about what you're going into. I do worry about the younger generation much more. It's one of those things where, you know, if you didn't go to college, it was kind of looked down upon. I don't know if that's gonna be the case anymore, when smart people can just leverage AI and learn and do so much with it. And, you know, is college a waste for someone that is looking to get into technology these days? Are they better off just doing a deep dive into AI and trying to leverage it?
Yeah.
Well, on the flip side, I heard someone say you should look more into the trades. And I would say, are you gonna be able to afford someone to fix your plumbing or your HVAC system if you're not doing your other job that pays pretty well?
(44:06):
Yeah.
You're probably gonna be doing all that yourself. It's gonna be interesting. Like I said, we don't have the answers, and it's fun to speculate. At least in my career, and I'm somewhere near the halfway point at this point, I've been amazed with a lot of technology, and I've also been, I guess, disappointed with how limiting it can be. There are still a lot of limits to our technology. Maybe AI is the key to unlock that. I don't know. So I'm curious to see where it goes. I think the best thing to do is have an open mind, and an eager mind to learn. We talk about that when we hire new people: willingness to learn is really one of the most important things. So I'm gonna buy into that, right? Learn all I can and see where these things take us.
Yeah.
(44:56):
One thing, and I think we've made this comparison in the past: AI, in some ways, reminds me of when the cloud came out. There were those people that were all in, like, we're gonna move everything to the cloud. And then there were some that said, I'm not putting anything in the cloud. And there were people out there screaming from the rooftops that in 10 years you're not gonna have any systems local, everything's gonna be in the cloud. And that's not the case. I'm not saying there aren't some organizations out there like that. But also, those people that dragged their feet and fought putting stuff in the cloud? Everyone's got something in the cloud these days. I haven't seen an organization that doesn't. So I think you're right. There's no way to truly predict it, and the answers will lie somewhere in the middle of all this speculation.
(45:40):
But I do like your thought on just trying to embrace it, learn from it, leverage it. I think that's important. You know, the people that did that with the cloud are the people that are more successful. I think with AI, the same thing is gonna come. Just try to embrace it. Try something new. We'll see.
For those listening, we'll see what you think about it. Let us know as we post this out. Start a conversation below. I'd like to hear what everyone else's thoughts are. So as we post this out on social media, if you have some thoughts on what we talked about, or where you think AI is going, or where you think the future is going in an AI-connected and enabled world, drop 'em down below and let us know what you think. Or you can always reach out to us at the show, Off the Wire Podcast. Give us feedback, or give us an idea for a new episode.
Anthony, we tried something new. I think it worked. We'll see what others think as we publish this in just a little bit. So yeah, let us know what you think. This has been another episode of Off the Wire. Thank you, everyone.