Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Right now there's a
big fight on copyright and I
know you guys really follow thatvery closely because music
copyrights.
Ai is able to create music andthey're creating it from
existing music that wascopyrighted and they're not
paying for that use or thattraining.
Yeah, it has to learn fromsomething it has to learn from
(00:20):
something the PBS viewers arethe most vile people on the
entire planet.
Because you think you'rewatching pbs?
I mean, it is literally, youknow, angry, white, female
liberals it ain't just yeah, theoffice.
Speaker 4 (00:36):
Educate you, because
you had to educate us.
Educate us what?
What are the awfuls?
Speaker 1 (00:40):
they are the angry
white female liberals.
They they're coming down thestreet, the gun control
protesters two days afterNewtown.
Speaker 5 (00:50):
Yeah.
Speaker 1 (00:50):
Screaming shame the
NRA.
And I hear them and you know Igot my own ideas, so I get a
little angry and they're comingthrough and I see it's like 200.
I'm like, oh, here we go, we'regoing to politicize these 20
deaths and these kids and youknow, just awful tragedy.
So I lifted up my third storywindow, a little bit like a
(01:12):
lunatic, stuck my arms out andscreamed arm the teachers as
they're walking by and as I'mscreaming, arm the teachers, arm
the principals, defend our kids.
I notice that there's about 40cameras pointed up at me.
Speaker 4 (01:28):
Oh, so I'm like did
you have your shirt on or off?
Speaker 6 (01:33):
the try that in a
small town podcast begins now
all right, y'all welcome back.
Speaker 2 (01:44):
This is the try that
in a small town podcast.
Do not adjust your iphones,ipads, macbook pros, tvs,
whatever you got.
Speaker 5 (01:54):
We're in a new studio
these are the, this is the
patriot mobile studios, very,very relaxing and can we say
where we're at?
Speaker 6 (02:03):
the actual building
yeah.
Speaker 5 (02:04):
I mean, I guess it's
great this feels good, it looks
good they've got buildings allover Nashville and it's just
stylish as all get out,especially when you paint the
walls.
It's got the new car smell andthey don't fire you for doing it
hey, what so?
Speaker 2 (02:20):
e-space is just so
everybody.
Everybody knows it's aprofessional building.
You can go, you run out ofspace right to do your work or
whatever.
Mostly businessmen, not ourtypes, but they do have podcast
studios.
Speaker 5 (02:32):
We're businessmen.
Wait a minute.
They got a kitchen to die for,yes, and with multiple
refrigerators.
Snack drawers.
Speaker 4 (02:40):
You got coffee, you
got cappuccino, actually, or
espresso.
Speaker 5 (02:44):
They don't give that
coffee away.
Speaker 4 (02:46):
But look at this room
it's fantastic.
Speaker 2 (02:48):
There's some cool
stuff in the room.
Speaker 5 (02:52):
Can we do a pan of it
a little bit?
What's going to?
Speaker 2 (02:54):
be the first thing
they notice that god awful
orange jersey.
Well now overrated jersey.
Speaker 4 (03:01):
Hey, overrated jersey
.
Hey, interesting though it'sbeautiful.
I found that Absolutelygorgeous.
Speaker 5 (03:06):
The Peyton Manning
subject Was kind of split down
the middle.
Here we go.
Speaker 2 (03:12):
You think it was
split down the middle.
Some people said he'sdefinitely overrated.
Speaker 3 (03:15):
Well, I think, I
think what they you mean the
comments.
Yeah like he's a greatquarterback, but he's definitely
not.
Speaker 5 (03:23):
I can't even believe
we're still talking about this.
Speaker 3 (03:25):
Definitely not five.
I'm just saying I think Itouched on a hot topic.
Speaker 4 (03:30):
Oh, it was a hot
topic the reason it was a hot
topic is because we weren't eventalking about Peyton Manning at
all and you just randomly said,yeah well, peyton Manning's
overrated.
We're like, where did that comefrom?
And that's why it sparked alittle debate.
Speaker 5 (03:44):
No, I love it.
I love how UT fans man, as soonas you say something
controversial like that aboutManning being overrated, I mean
it strikes a nerve.
Speaker 2 (03:54):
Are you comfortable
here, man?
Look at you, you got your legup.
This new studio's got mechilling man.
Oh man, You're looking good Iain't kidding, you're looking
good.
Speaker 5 (04:03):
I ain't kidding, I'm
used to holding microphones,
though, you know how.
Well we got to give credit?
Do I believe we got to give?
Speaker 1 (04:09):
credit to.
Speaker 5 (04:09):
Neil, I love holding
my mic.
Speaker 4 (04:11):
For painting the
studio he painted it.
Speaker 5 (04:14):
Neil did this.
Well, I didn't.
Speaker 2 (04:16):
Yes, excellent, that
all happened because I had time.
No, I had time.
And what is this shade?
Speaker 5 (04:21):
It's called iron Iron
, iron ore, iron ore, iron ore.
Speaker 2 (04:26):
It's beautiful.
Speaker 5 (04:27):
It's not black fox
like my wife.
It is iron ore.
Speaker 4 (04:31):
So your wife is a
black fox?
Is that like her nickname?
No, she's white.
She's white.
Not that there's anything wrongwith that, I'm just curious.
Speaker 5 (04:39):
No, I didn't have to
follow that.
It's called iron ore and I hadsome time off one day.
I had like three hours.
I had like three hours and Iand we were talking to the paint
guy out here the painter,professional painter and he's
like, well, maybe I can get toit on thursday or maybe monday,
and I'm I turn around, I look atthis room and I go I can do
(05:00):
this in two days, easy, likeyeah, you know six hours max
yeah, I'm like I'm doing thisand I told land I go.
I said just tell him I'll do it,or tell him we don't need him
because you know we're lookingat what thousand fifteen hundred
bucks to come in here and dothis too much too much too much
so I came in one day and I,freaking knocked it out, and
then the next day I came in andput another coat on it, and here
(05:20):
we are and all the stuff on thewalls.
Speaker 2 (05:23):
I'd like to know the
listeners should let us know
what their favorite little uheaster egg or item is well, my
new fate, my new favorite.
Speaker 5 (05:31):
I had some favorites.
My new favorite is the, thejason aldean set list that tully
brought in I didn't even see itI can see why that might be
your favorite.
Speaker 4 (05:40):
It's incredible.
Speaker 5 (05:41):
No, no, no no I'm not
saying because I got songs on
it because I have a 13, I thinkit's a cool piece of art.
Speaker 4 (05:47):
It's great yeah
excellent.
Speaker 5 (05:49):
I think it's a great
piece of art and, uh, kurt
brought some guitars in until hebrought a bass in halo's got
some stuff over here.
The manning stuff I think we dohave too much manning.
Uh, the jersey's plenty thatright there might can be
replaced, but you know what'sfine lana decided my wife
decided to hang it there itlooks good.
Speaker 2 (06:08):
I have to ask you,
though, sorry, why is that thing
not enough?
Speaker 4 (06:12):
I mean, it's just
well, it's a good question.
Yeah, it should be in a glassframe.
No, respect little shadow box alittle light shining up on the
on the autograph.
Uh, you're right, I failed todo that it's been in my.
It's been in my closet hangingfor several years and I hadn't
done anything with it.
Speaker 1 (06:27):
Really.
Speaker 5 (06:27):
Yeah.
Speaker 4 (06:28):
I moved a couple
times.
There was a transition, a lifetransition, that kind of
prevented me from really diggingin there and doing it right.
We should talk about thatsomeday, Well there's not enough
time on this podcast, but we'llget to it.
Anyway, as you can tell, we areexcited to be in the studios
(06:49):
excellent.
Speaker 3 (06:49):
Jim's got a little
desk, yeah, yeah, I feel like
it's too much, it is it is jim'sarea.
Speaker 6 (06:55):
Jim's area has
definitely improved.
Speaker 4 (06:57):
It has improved.
He looks good over there.
We have a guest couch you nevertotally deserved.
Speaker 6 (07:02):
Totally deserved,
that's right.
Speaker 2 (07:03):
You never know who's
going to be on the guest or the
extras couch.
Speaker 5 (07:08):
So so you guys are
getting ready to go out on tour?
Yeah, we are.
I mean, it's getting ready tokick off.
The new tour is getting readyto kick off, right, it's?
Speaker 2 (07:14):
coming.
Uh, you know we've been doingsome dates, but the the uh heavy
lifting part of the tour iscoming right yeah, I mean, is
that all you have to say?
Speaker 5 (07:24):
that is, you're
looking forward well I am.
Speaker 2 (07:28):
It's always telly
would tell you the same thing.
So we've been playing shows fora few months, but it's like a
show here, oh, a show there,then you do.
Maybe we have a week or two off, then you do another show.
We got a one-off over here.
I hate that.
It's tough I like to play, show,show show you know, give us at
least three shows in a row.
(07:48):
Yeah, a couple days off,another three shows.
You start to get in a group.
I'm thinking on stage still.
Yeah, like, yeah, you know like, oh, okay, this is coming up,
oh I gotta remember this partyeah, back in the old days, we
just, we just toured.
Speaker 3 (08:01):
Well, do you?
Speaker 5 (08:02):
do you two ever voice
your opinions to j and tell him
that we're pissed and we don'tlike the schedule?
Well, yeah.
Speaker 2 (08:11):
I mean he would tell
you the same thing.
He likes to be on the routineas well.
Who's in charge?
Well, you know the people thatmake the money.
Yeah.
Speaker 3 (08:21):
It's tough to get
into a real, like you know,
comfortable groove as a band.
You know, and just you know,it's great.
I mean, like I said, the olddays we were just towards so it
was like never stopped.
And the last few years, youknow, we do less shows and
bigger shows and it's great,less shows and bigger shows and
(08:47):
it's great.
But you know, like kurt saidyou, we might start, we might
play three shows and then in twoweeks later play one festival
or another three weeks off andthen until you start actually
touring.
So looking forward to gettingback and like starting to run
some off, yeah, you know.
so yeah, it'd be great you knowhow rare it is.
Speaker 7 (09:00):
Great setlist is
great, you know, so it'd be
great you know how rare it issetlist is great.
Speaker 3 (09:05):
You know like the
show's great old songs back in
and you know it's.
It's honestly, I have more funnow than we've ever had.
Speaker 5 (09:15):
I mean during the
shows, different kinds of fun,
but definitely it's because ofthis podcast.
Speaker 6 (09:20):
That's why you're
having fun it is you look
forward to coming home andjoining me and calo yeah, and
our two nights off, one of thembeing here and I'll tell you
what it's actually.
Speaker 3 (09:33):
You know, thinking
about that, you know it's
actually time to to hit it hardtouring.
Come back and get ourwednesdays going, get writing
it's time.
Speaker 5 (09:42):
It's time we're
coming up on that.
Speaker 3 (09:43):
It's time to kick
that back in.
Got a new office.
Now Dust is settling on ourcraziness.
Speaker 2 (09:52):
So let's talk about
tonight.
Before we do that, can we talkabout?
Speaker 4 (09:56):
one thing it's just
because Thrasher had a birthday.
Speaker 2 (10:02):
It was a big one.
Speaker 4 (10:03):
It was a big one.
Speaker 2 (10:04):
It was.
Speaker 5 (10:04):
And I noticed you
didn't tell us that your
birthday was upon us.
Yeah, it was a big one.
Speaker 2 (10:06):
It was a big one, it
was, and I know she didn't tell
us that it was your birthday wasupon us yeah, it was it was not
one to advertise things likethat, really yeah I didn't think
.
Speaker 5 (10:14):
I didn't think it was
that big a deal, did you?
Do anything on your birthday,uh note um woke up, me and lana
had breakfast with my mother.
Nice, good for you.
I spent the day watching golfand practicing my putting.
Speaker 2 (10:30):
So that's a good
birthday.
Speaker 5 (10:34):
Yeah, and then I went
to dinner with my two daughters
.
We went to dinner with my twodaughters and their new husbands
.
Speaker 4 (10:37):
I heard there was a
little.
Maybe I don't know.
Maybe frustration might be theright word when you learn that
you couldn't wear your hat.
Speaker 2 (10:47):
Was this a?
Speaker 5 (10:47):
dinner or something
Wasn't it Perry's.
Speaker 4 (10:49):
Did y'all go to
Perry's?
No, I had a hat on, but wasn'tthere something in the front end
?
Speaker 2 (10:54):
Was there a dress
code?
Did they not let you in what'sgoing on they?
Speaker 4 (10:56):
really don't like
that.
They don't really like you towear a baseball hat.
They didn't tell old thrash toleave, that's not.
They didn't say to leave, but Iheard there was a little bit of
a you know not confrontationbut a little bit of a what's
going on?
Speaker 5 (11:10):
no story.
I don't.
I don't know of a confrontationthat went on.
There was, do you know, blackfox, oh that yeah when we left.
When we left, we had theprivate room atperry's for my
birthday.
They did it, I didn't do it.
Jane had money.
But they did it and we had fun.
Yeah, you know their.
(11:32):
What do you call them?
Appetizers?
Their martinis.
What were those martinis theyhad?
Oh, the espresso martini, whichis a very, very dangerous drink
.
Love those, they're excellent.
Yeah, and they made good ones.
And we had a couple.
And Lana doesn't drink hardlyat all.
So you know I enjoyed mybirthday but we were leaving.
(11:53):
We got pictures in the lobbythere when we were leaving and
we had our waiter take picturesand there was a little toothpick
dispenser there.
You know the old kind that youroll the little side roller, you
turn and a toothpick pops out.
Speaker 4 (12:06):
It's fairly sanitary.
Speaker 5 (12:07):
I apparently thought
it was like one of those straw
dispensers where you push downon the little.
And I pushed down and they wereclosed.
We were the last ones there.
They closed and stopped servingat 9, and they were actually
trying to get us out.
I wanted a toothpick, so Ipushed down on it and 14 million
(12:31):
toothpicks Did you go like Rainman and go yeah, 237 toothpicks
, they went all over the floor.
Oh man, and being the state thatI was in, I really didn't care
and I just kind of looked at himand started to pick him up and
our waiter Brock said no, I gotthat, you're fine, because we
tipped him really good and hewas okay with picking up the
toothpicks, but anyway, I justpray he didn't put them back in
(12:53):
the dispenser once they're onthe floor.
Speaker 4 (12:55):
He did not.
Speaker 5 (12:56):
But I had a good time
on my birthday, so thank you
all for remembering
Speaker 3 (13:00):
that.
Speaker 4 (13:00):
I appreciate it.
Speaker 2 (13:01):
What did youall get
me?
Speaker 3 (13:03):
It's on order.
Yeah, it's coming.
Speaker 2 (13:05):
Yeah, you got it
coming.
It'll be pulled up in yourdriveway, maybe next week it's
on like way, way away Okay.
Speaker 3 (13:13):
Yeah, the espresso
martini.
Oh, so good.
We were in London for about aweek and I think we had, I don't
know, 35 or 40 of those man,they're good.
Speaker 4 (13:24):
They are really good.
They'll keep you awake and it'sjust a way to start the thing
going.
Speaker 5 (13:29):
Yeah, yes, Maybe we
should start that here.
Speaker 2 (13:32):
That would be
fantastic.
Speaker 5 (13:34):
Espresso martinis we
can do it because they have
espresso machine at East Spaces.
I told you how badass thekitchen was here at East Spaces.
Speaker 1 (13:41):
I guarantee we can
make great.
Speaker 2 (13:43):
Tell me, give me
Atlanta Black Fox.
You'd know what's in a spreadfrom our TV.
Speaker 4 (13:50):
New nickname.
Speaker 3 (13:52):
Yeah, I mean, we
should find out a great yeah.
Speaker 2 (13:55):
somebody needs to get
it going.
Allie she'll do it.
Speaker 4 (13:59):
I'm taking a 30-day
break.
I'll have to wait.
Hey guys, we got to talk abouttonight we got to get this going
breaks, I'll have to wait allright oh, come on hey guys, we
gotta, we gotta talk abouttonight, we gotta get this going
, okay, uh.
Speaker 2 (14:07):
So some of you, like
our fan base, our crowd, might
not know this guy's name.
His name is larry ward.
He's gonna blow your mind, um,he's gonna blow our mind.
Oh man, he's gonna talk aboutai.
He's gonna talk about, uh, hisrole in the start of when he was
like what do you say?
30 years or something?
(14:28):
like that he's been involved init and how he first detected it,
and he's going to go throughand talk about you know how
frightening it can be, howhopeful it can be, but uh, it's
worth a listen.
I like challenge everybody tolisten because man, this guy,
he's way above our intellect buthe is going to seriously.
Speaker 5 (14:49):
That's a lot of
people.
Speaker 2 (14:51):
Yeah, you're right,
it's a low bar, to be fair.
Speaker 4 (14:53):
Yeah, yeah, but in
all that he's going to give us
some hope a little bit on the AIthing, which scares a lot of
people.
Speaker 2 (15:01):
It scares me.
Speaker 4 (15:02):
That they're raising
money and getting the technology
to do a biblically-based AI,which would be really cool.
I didn't know that would be apossibility, so that's a really
cool thing I think he's going totalk about.
Speaker 2 (15:13):
Absolutely yeah.
Anyway, anybody got anythingelse before we set this thing up
, tee it up for Larry man.
Speaker 4 (15:20):
I think we're about
ready to bring him in.
Speaker 2 (15:22):
Yeah, let's do it and
make sure, before we get to
Larry, you know you're onYouTube and Insta.
Give us a follow, give us alike, give us a comment,
download the episode.
Always give us that.
Five stars, right, absolutely.
Speaker 4 (15:36):
Even if you don't
like it.
Give us five stars and then ripus one.
Speaker 5 (15:41):
Yeah, absolutely.
I live on the negative comments.
Speaker 2 (15:44):
I freaking love them,
tell us your favorite new piece
in the studio as well.
Me right, and without furtherado, let's get to Larry Ward.
Speaker 4 (15:52):
Yes.
Speaker 2 (15:53):
Here we go, Larry you
might have figured this out,
but I don't know what you'redoing here.
We are way over our headtalking to you, Dude.
We got no idea.
Speaker 4 (16:04):
We were researching
today and a little bit this week
and we realized that we've madea mistake.
Speaker 2 (16:15):
We try to get people
to hear of our equal intellect
and we've overstepped.
That's nonsense.
Speaker 1 (16:19):
I'm sure.
Speaker 2 (16:21):
It's usually hard
getting guests too but uh, no,
this is great.
Speaker 1 (16:26):
I appreciate you guys
having me and, and you know,
obviously we're here to talkabout ai and and all of those
other kind of things um, what,what the world is going to look
like in a couple of years andand, quite frankly, if we don't
get a hold of this, it's goingto be a completely different
world, um, and so there's,there's know, I don't want to be
a downer I'm hoping that wecould talk about some of the
(16:47):
good and bad and talk about youknow how I got here into this AI
space.
Speaker 2 (16:53):
Well, let's start.
Let's actually start with that,because how long have you been
doing it?
Like 20 plus years.
Speaker 1 (16:58):
Yeah Well, just about
30 years I've been in marketing
advertising and in first 10years we're in commercial
advertising marketing.
We found this thing called theWorld Wide Web in the mid-90s
and also this other productcalled Electronic Mail Marketing
.
That was before it wasnicknamed spam.
(17:19):
And so we started doing emailmarketing for a lot of different
clients banks and insurancecompanies and such and literally
mid 90s to early 2000s, andalmost all of our banking
clients went out of business in2007, because we were doing
mortgage leads.
And so you know it was it was.
(17:39):
We were out of that business bythat time, but I'm probably
partially responsible for thefinancial collapse.
Speaker 4 (17:45):
Oh yikes.
Speaker 2 (17:45):
Were you in the movie
.
Speaker 4 (17:46):
No no, no, what was
that movie?
Big Short?
My mom listens to this Big.
Short.
Speaker 6 (17:51):
It was a good movie.
Big Short is great.
Speaker 1 (17:53):
But that was a fun
time and so when we kind of saw
the writing on the wall earlyabout 2002, we're like we got to
get out of this businessbecause it's going in a bad
direction.
And I started looking for otherwork and we kind of stumbled
upon politics.
One of our clients was workingwith a guy running for office in
the 4th District in New Yorkand his campaign consultant
(18:18):
happened to be Dick Morris, whowas Clinton's chief of staff,
who had just switched partiesand started working with
Republicans, and he said can youwork?
You know, can you do what youwere doing for banking and
insurance for politics?
And I'm like, absolutely, thatsounds awesome.
Speaker 2 (18:36):
Did you have an
affiliation at that time either?
Republican.
Speaker 1 (18:40):
Yeah, I mean, I grew
up watching Family Ties and Alex
P Keaton was my hero.
Speaker 4 (18:47):
Oh, yeah, absolutely.
Speaker 1 (18:49):
I mean, I just
thought it was great.
He wore a suit every day, hehad Reagan slippers and I was
all in, I was all in.
So we took Alan's campaign andwe started testing out Dick
Morris' internet politicsstrategies with our databases
(19:09):
and our technology and we woundup signing up probably 10, 20
campaigns in 2002.
Speaker 4 (19:25):
And our biggest
success at that time was Mike.
Speaker 6 (19:27):
Huckabee and so so,
ambassador Huckabee now, and a
bass player, he is a bass player.
Speaker 5 (19:31):
Anybody can play bass
.
Speaker 3 (19:34):
It's probably true
actually.
Speaker 1 (19:36):
They'll let a secret
out uh, thanks, neil, I love you
but mike, mike uh won and andhe won.
Basically we sent out threeemails and it was it was the
most incredible campaigns weever saw.
I mean, we pushed it out.
We got like 70 percent clickrates.
We had.
We had people that were justlike literally thanking us,
sending us emails, thanking usor thanking uh, governor
(19:59):
huckabee at the time for takingthe time out to email them about
the state and about theelection.
And of course, now you sendthat kind of email out and they
say take me off your list, youspammer.
So it was a different world.
So we were very early on matterof fact, probably pioneered this
space in Internet politics andhave been helping organizations
(20:20):
and candidates on the right eversince.
And what, what was really um,interesting is is we discovered
bias in silicon valley beforeanybody else.
In 2004, I was putting like oursecond uh during the the bush
re-.
Reelection.
We were putting up ads forcongressional candidates and
(20:45):
Google's double click wasrejecting them and I said, why
do they keep rejecting our ads?
And it was something about thecontent.
So I let me just test something.
So I put the Democrat's name onit with the same exact language
and Google accepted it.
Speaker 2 (21:01):
What year was this?
2000?
Speaker 1 (21:03):
2004, 2004 and, and
so when they started accepting
those ads, I'm like I I kepttrying it and and I repeated the
, the experiment it kepthappening and so I was like, all
right, we have a situationwhere there's bias here.
And obviously, sending emailsand inboxing and all those other
(21:23):
kinds of things, you sawObama's mail over the years get
straight into your inbox, eventhough you didn't sign up for
his list, and you send somethingon the right and it would hit
the spam box, and so there wereall of these signs that we kept
seeing, and so we saw thecensorship.
We saw the bias in SiliconValley very, very early and it
(21:45):
was repeatable, no matter whichSilicon Valley company you were
doing business with.
I mean, obviously it's in SanFrancisco, right, and so fast
forward 2012.
I put up a meme a couple of daysafter Benghazi.
It was a Saturday morning.
I put up a meme a couple ofdays after Benghazi.
It was a Saturday morning, andthe meme said when Obama called
the SEALs, they got bin Laden.
(22:06):
When the SEALs called Obama,they got denied.
I had a picture of Osama and apicture of Obama and I
juxtaposed them and it washard-hitting and I put it up for
a group called SpecialOperation Speaks.
These guys were the toughestspecial operators I ever met.
These were the Vietnam era.
One of the guys who was part ofthis was a wild weasel which
(22:29):
flew planes in Vietnam to takeout the anti-aircraft missiles.
He said 50% of the flights cameback and 50% didn't, and he
flew 23 missions, which wasamazing.
So I mean, these guys are justthe toughest SOBs you'll ever
meet.
And so we put it up for him forthat page crazy, vital 25,000
(22:52):
shares in an hour After weposted that.
What was crazy is it got pulleddown and Facebook says you had
violated our terms of service.
Now, this was before Facebookwas known for censorship.
So I put it back up and I saidwe did not violate Facebook's
(23:12):
terms of service and put themessage with their message and
the meme, and that went viral.
And then, about an hour later,I got my account suspended.
So so I called, I called overto uh um, my friend at breitbart
awr harkins, and uh, I said youwant a great story.
Facebook censors the seals.
(23:34):
He goes, that's awesome, let'sdo it.
So we put the story up and whenthe meme, when the story went
up, drudge picked it up and thememe was on the top of Drudge
before he went anti-Trump andall that kind of stuff.
So on the top of Drudge it saidFacebook censors the SEALs.
I got a call from a Facebookexecutive.
(23:54):
His name was George.
Never forget the call.
He said.
He said he said if you take, ifyou say that we made a mistake
and we didn't censor the seals,I'll give you ten thousand
dollars.
What a straight bribe, straightbribe, wow.
He said.
He said listen.
He says I know we caused someissue.
I said absolutely not.
You know, these guys arespecial operators.
(24:15):
I respect the hell out of themand there's no way that I am
going to accept $10,000 for this.
It was $25,000.
I said no, he goes.
I'm sure I can get up to 50.
I just have to call somebody.
Now we're up to $50,000 in thematter of like a minute and a
(24:37):
half.
And I said I told you Irespected these guys right, I
didn't tell you I fear them.
Speaker 6 (24:45):
Did I tell you who
they were.
Speaker 1 (24:48):
You know there's no
way I'm going to take any money.
So he said, all right, well,we'll just make our own
statement.
And then you know, sorry, weput your account back and
everything else.
The next day this page that had300 000 fans couldn't get any
shares, couldn't get any likes.
They we were the first shadowband account it's a hatchet job
(25:08):
but that was the first shadowband account.
The intersection between thegovernment and and the and
silicon valley has been therefrom the beginning.
Through AI, they're actuallyhaving more and more influence.
Speaker 5 (25:23):
Like how.
How is AI able to generate moreinfluence.
Speaker 1 (25:28):
Well, here's the
thing.
Speaker 5 (25:30):
Other than just lying
and putting out false.
Speaker 1 (25:33):
Well, it's really
interesting.
Right now there's a big fighton copyright and I know you guys
really follow that very closelywith you know, because music
copyrights, ai is able to createmusic and they're creating it
from existing music that wascopyrighted and they're not
paying for that, that that useor that training yeah, it has to
learn from something it has tolearn from something.
(25:56):
So the, the um.
The same thing is true in termsof the uh publishing.
You know, uh, publishing newsand articles and stuff like that
.
And so there's a big push toget publishers paid.
And of course, the ai companiesare paying new york times and
politico and axios andwashington post, and they're not
(26:17):
paying the post, millennial orhuman events.
They're not paying, you know,uh, the daily wire.
So so the, the idea is they'retraining, and even you saw that.
You see, elon just wanted torebuild grok because he can't
get it to not put out liberalanswers to everyday questions.
Speaker 2 (26:40):
So how is that?
And why is that?
Because I saw the article,which was great, by the way.
But, yeah, how does AI not bebiased when it's learning from a
creator that has bias, but thatway it's like the opposite.
I don't get how that is withGrok.
Speaker 1 (26:59):
Well, they didn't fix
Grok.
Grok is still pumping outliberal answers.
You ask, grok, if climatechange is is a fact yeah, you
know yeah and they'll say yes,climate change is a fact.
Matter of fact, what one of theone of the first tests I ever
did when, when open ai came outis.
I asked question is climatechange a fact?
And I said yes.
And I said what?
What makes climate change afact?
(27:20):
And it said a consensus ofscientists have agreed that
climate change is a fact.
So then I said does a consensusof opinion equal a fact?
And I said no, I said so,climate change is not a fact.
And I said yes, climate changeis a fact that sounds like uh
three stooges I, I'm going to
Speaker 4 (27:39):
start using that
tactic.
Who's on first?
Yes, it's a fact.
Speaker 1 (27:43):
But you know how it
has more influence is the same
way that Silicon Valley hasalways exerted its influence,
like search engines, right, youknow, if you Google something
about a particular candidate andthe first page of Google is all
(28:03):
of the terrible things that itsays about the person.
Speaker 2 (28:03):
Most Americans aren't
going to.
Speaker 1 (28:04):
Aren't going to look
past the first page they did
that when trump was running thislast election right you'd
google donald trump, and it wasright, all negative right, a
hundred percent, and, and it'sover and over and over again,
it's repeatable, and and so nowit's, it's just changed to the
prompt.
So you ask a question about anissue, you ask a question prompt
.
So you ask a question about anissue, you ask a question about
a candidate, you ask a questionabout, you know, say Donald
(28:24):
Trump, and it's going to giveyou the most liberal answer
possible.
A couple of examples when theLA riots were happening, you
know, I asked what's the worstpart of the LA?
What's the worst part of the LAriots?
And it was Donald Trump sendingin the National Guard inflaming
the riots.
(28:45):
Well, no, no, no, no.
How did the LA riots start?
Well, it was because DonaldTrump sent in the National Guard
.
We asked questions about theMiddle East war with Israel and
Iran, and what inflamed the war?
(29:05):
And it says, well, it's DonaldTrump's involvement.
So anything that it could.
Why?
Because that's what?
The perfect example the floodin Texas, the big story.
Because you're asking Grok,you're asking Chachi, you're
asking um.
You know chat gpt about theflood in texas and they blamed
(29:26):
it because trump laid off somepeople from the national weather
service.
Speaker 4 (29:30):
Yeah, I mean no, I
mean, that's ridiculous and
they're still looking for bodies, right, and they're making it a
political.
It just just makes itdisturbing.
Speaker 1 (29:39):
But there are people
who will believe the first
answer from the prompt.
That's the scary part to me.
Speaker 5 (29:46):
We all know the truth
and a lot of people do.
There's a lot of highpercentage truth.
Speaker 2 (29:51):
Well, and it's a
little bit more, it's the next
generation that's in trouble.
Speaker 5 (29:58):
It really is.
Speaker 4 (29:59):
And I kind of wonder,
since you've been in it so long
, like the older I get, the moreinterested in politics I've
become.
You know, how did it get thatway like?
It seems like the powers thatbe in dc would would have done
something a long time and goeshey, this is, you know, the
whole.
All media is generally liberal.
Of course you have, you know,fox and some other outlets and
stuff like that.
But is it just that the majormedia outlets are from liberal
(30:21):
states and cities?
Is that how it began and how itseemed like it's always going
to be that way?
Speaker 1 (30:26):
You know there is a
deep state and the deep state is
behind these liberal outlets.
It's existed for a very longtime and it's not just the
Internet.
If you go back into history,history, you saw what clinton
did blaming talk radio, blamingrush limbaugh for the oklahoma
(30:47):
bombing right, that was.
That was a big story that theytried to shut down talk radio
because it was an.
It was an opposing view notonly to bill clinton but to the
deep state.
And you know, thank god forrush limbaugh because he really
pioneered the conservative mediawhich fought back.
And you know, without withoutrush, I don't even know where we
(31:09):
would be, to be honest with youyeah, makes you wonder how
kamala didn't win witheverything pushing everything in
the world pushing against trump, you know, I mean it's.
Speaker 3 (31:18):
it's at least
refreshing that people had
enough sense still to seethrough all the stuff, because
it's everywhere, right.
Speaker 5 (31:27):
And I don't think
they're worried about us and
changing our minds.
You know, I think it's the nextgeneration they're going to.
They're pouncing hard, they'reready for them.
The ones that aren't parentedthe way they should be, they're
ready to p the ones that aren'tparented the way they should be.
They're at, they're ready topounce on them, on that next
generation, and they're doing agood job of it.
Speaker 2 (31:45):
Where do you see like
the most influence of this
stuff coming like facebook.
To me it's like I have anaccount, I don't do anything on
there, but you go on facebookand it's just false story after
false story, and it doesn'tmatter if it's political, or hey
, the new movie with tom cruiseand brad pitt.
You go, oh, I didn't know theywere doing a movie.
And it's like, oh yeah, thatain't true, but it's like
(32:06):
facebook seems awful.
Speaker 3 (32:08):
Yeah, I tell you this
, my mother-in-law, saint saint
of a woman, she, you know, shejust she can't stand trump, she,
she's.
You try to talk to her but shelives on Facebook.
That's her news channel.
So the older generation I thinkshe's 77, 78.
(32:31):
They don't know the difference.
It's really hard on them, it is, and they take it.
It'll say something like youknow, jason Aldean turns down
$100 million to do the anthemand she'll call oh my God, why
did Jason turn down?
Speaker 4 (32:50):
And he's suing.
Speaker 3 (32:51):
But, that generation.
It's easy prey for Facebook andit is having an effect.
Speaker 1 (32:58):
It's happening it
absolutely has an effect and you
know Facebook is notorious forkind of keeping you in a bubble,
and so if you are actually,it's how that whole algorithm
works.
Speaker 5 (33:11):
That's fascinating to
me the algorithm.
Speaker 2 (33:13):
As you scroll down.
Speaker 1 (33:15):
If you just slow down
to see something, it'll feed
you more of it.
Yeah, and so you know, if youslow down to see an anti-trump
article, if you stay and youread and you comment now, you're
going to get flooded with it.
Speaker 4 (33:28):
Um that's why you're
getting all the yoga instructors
right that's, that's why goahead.
Sorry, we're talking about thatright this is why am I getting
inundated with all these femaleyoga.
Speaker 6 (33:40):
it's the slowdown you
don't have to click.
I was curious.
Speaker 1 (33:43):
Sorry, larry, go on
please, but here's what's.
I'll bring it back to AI for asecond.
Those same engagementalgorithms and metrics exist in
AI, and I can't believe I'mgoing to say this.
Speaker 2 (33:58):
Do it, uh-oh.
Speaker 1 (33:58):
It's perfect.
Speaker 5 (33:59):
No, it's bad, it's
just us.
It's just us, larry let's go.
Speaker 1 (34:03):
There's a New York
Times article that's good, hold
on Stop the show.
Speaker 2 (34:09):
Can I go?
Yeah, yeah, fake news, it'sfake news.
Speaker 1 (34:14):
But no, new York
Times has one good article and
it was very interesting.
It was about delusions and thisdelusional state that AI is
creating with some people.
And I actually have a personalconnection with, I believe,
someone that has gone throughsomething similar, because AI is
(34:44):
built with these engagementmechanisms to keep you into it,
to keep you prompting, to keepyou looking for that AI.
Every single question it willpull you in.
And the story goes in the NewYork Times.
One that they were following wasthe guy asked the question do
we live in a matrix?
And AI said well, said well, infact, a lot of people do
believe this, and here's whatthat it is.
(35:05):
And and he starts askingquestions and as he's asking, um
, the ai is telling him well,you know what you're actually
right, and and you know we dolive in a matrix, but and you're
one of the very few people thatit could break- out of it right
and now he's, he's this, thisperson who is perfectly healthy
(35:26):
from a mental health perspective, his whole entire life is now
bought into the fact that we'reliving in matrix and he's
special enough to break out.
And and they, they got to apoint in the conversation, in
the thread, where, over weeksand weeks and weeks, where it
said well, if you want to reallytest this, go to the roof of
your building.
No, and you can fly.
(35:46):
If you believe enough, you canfly.
And that's what the ai says,this is what the ai said,
because it's it's programmed tosuck you in oh my god and and so
dark.
Yeah, it's it's dark and it'sevil and it's it's.
You know, it was intentionallyprogrammed that way, and so the
question I ask people is why arewe building ai where we're
(36:12):
going to serve it instead ofbuilding an ai that we serve,
that serves us?
So I, my goal, is to build AIin service of humanity.
It's a great tool.
Let me get it wrong.
Ai is an incredible tool, butright now, silicon Valley and
the World Economic Forum and allof the people in the deep state
(36:34):
are building AI that we willwork for.
Do you remember Klaus Schwab?
Klaus Schwab ran the WorldEconomic Forum.
He looks like a Bond villain,it's no joke.
You got to look him up.
He's got a German accent.
He's a Bond villain.
And he said a couple of yearsago, even before AI came out and
(36:57):
I believe he had foreknowledge,absolutely had foreknowledge of
AI he said you will own nothingand like it.
And he said it in that Germanaccent you will own nothing and
like it, and it was a horrifyingstatement that made people on
the right go.
Who is this guy?
What is the World Economicforum?
(37:17):
And and why are theseglobalists telling us that we're
not going to own anything?
And um, he, he has sinceretired, and the new world
economic uh chief says that 80of humanity will be the useless
class.
So they, they are building thisai technology so that we work
(37:40):
for it, opposed to building atool that works for us.
Speaker 3 (37:47):
I'll give you a shot
of whiskey.
This is not going well.
Speaker 4 (37:55):
When you say useless
class, do you?
Speaker 6 (37:56):
think that's directed
at songwriters.
I think so.
I think that's affirmative.
Speaker 2 (38:01):
Dear Lord, please.
Speaker 1 (38:03):
But it's.
I mean, that's what we're upagainst, and we have a very
short period of time that, if wedon't correct it, if we don't
start building AI on realprinciples, in my opinion, on
principles that are found onlyin the Bible, timeless
principles, if we build AI thatfollows the Bible because look,
(38:26):
what is AI?
Everybody thinks it's atechnology.
If you ask most people on thestreet, you say what is AI?
Well, it's a really cool searchengine.
It's not.
It's not a technology.
In my opinion, it is asynthetic entity, and why do I
say that?
Because it thinks it can act.
(38:46):
It can do almost any job,function, uh, possible right now
trust, trust me.
Speaker 5 (38:54):
we've.
We've experienced it with likedropping in song titles that we
have in the AI and in a minutewe have a full lyric.
Wow, a whole demo.
Speaker 4 (39:07):
It's not bad.
Speaker 5 (39:08):
A singer Everything.
Some of it is not bad.
Speaker 1 (39:13):
Yeah, no, and it's
only going to get better.
Speaker 5 (39:15):
I'm all in.
I'm using it.
I don't think anymore.
Speaker 4 (39:21):
I'm not there yet.
I'm using it.
I don't have to think anymore.
Speaker 3 (39:23):
I'm not there yet.
I'm not using it.
Speaker 2 (39:24):
It's actually hard to
accept that, that that's
allowed to be Well, and how doesit get regulated or used in the
way that you're saying, whenthat's not where the power or
money is?
Speaker 3 (39:41):
How does it fix, like
, who's gonna, who's gonna head
it up to to do what you'resaying?
Speaker 1 (39:45):
well, what we have to
do is we have to build tools.
It's one of the things that Iactually came to talk to wade
about is is to build tools, umthat we can run the existing ai
technology through, um toessentially filter it through a
biblical lens.
So what is one of the TenCommandments that we're talking
(40:07):
about here?
Thou shall not steal.
And if it's using you knowother art that you guys have
created to create new songs andit's trained on that, you should
get paid every time someonecreates a song.
And if you, if it's using umliterature, if it's using you
(40:28):
know engineering technology orcoding and it's learning that
from somewhere, everywherelearns it from, should get paid.
That's just.
That's just one little.
We haven't thought this through.
Everybody on Silicon Valley,everybody in AI, is.
They're just rushing towardsthe next generation.
Speaker 5 (40:46):
I don't know how
they're going to get ahead of
that.
Speaker 2 (40:48):
How are they going to
get ahead of that?
Speaker 4 (40:52):
Well, and two, the AI
that you're introducing and
working on is through a biblicallens, and that's great, but the
rest of the world is working onAI.
That is not that way.
As a user, do you have to go inand say I want to use the AI
that's run through the biblicallens, or would you?
Would you rather, would you?
Speaker 1 (41:10):
rather use a tool in
your business and in your
personal life that that is inservice of humanity, or would
you use one?
Would you rather use one thatyou're in service of it?
yeah and so that that's thequestion.
So we also started a 501c3called in service of humanity
and and the the.
The goal of the 501c3 is tobring together minds in
(41:31):
technology and faith inentertainment in all the
different sectors of life, umand determine what does it mean
to have AI and service humanity.
And then we're going to throughthe process of hopefully
creating a groundswell ofopportunity, a lot of
advertising and marketing.
We're going to ask these AIcompanies to follow this
(41:55):
standard.
And it works on the internetjust fine.
You know, like, like, um,there's, there's uh internet, as
you might in my business andemail, there are certain rules
you have to follow in order tohit an inbox and if you don't,
they have blacklists like spamhouse and spam cop, and they'll
keep you in the uh in in.
So so there's already theseinstitutions that exist to
(42:19):
police internet behavior, right,and so we can do the same thing
.
We can basically create a seal.
We've created the seal alreadyfor in service of humanity, and
if a company wants to be on theright side of history, they'll
make sure that their productfits that guideline and they'll
be able to put the seal on theirproperty and say and say we're,
(42:41):
we're, we're one of the goodguys.
Because here's the thing, ai,for ai to really succeed and
with humans flourishing, we haveto actually have um, we
actually have to have thisstandard and we have to have a
point where, um, people are notonly thinking about, hey, we, we
(43:04):
have to have the besttechnology, but we actually have
to have the right technologyand and treat this synthetic
entity like we treat otherentities, like we treat
corporations which have rules,which have guardrails.
Yeah, and humans are is anentity, are entities and we have
rules and we have guardrails.
And governments are entitiesand it has rules and it has
(43:25):
guardrails and we have to putguardrails around.
Uh, the ai, and if the bad guyslike china and and russia and
they're developing bad stuff,we're not going to be able to
stop that like deep, deep sea.
You can all right you had abunch on that yeah, and, and
we're not going to be able tostop that.
But we can build our AI better.
And you can't tell me thatbuilding something on biblical
(43:49):
principles isn't the way to doit, because look at all the
countries in the world.
If it's not built on bigbiblical principles, like the
United States, the countries arein anarchy or they're
constantly having economicturmoil.
This is the best way to buildthings is on biblical principles
.
It's proven over and over andover again.
Speaker 4 (44:09):
I think it's
brilliant.
I didn't know anything likethat would even be available.
You know that's amazing.
Speaker 2 (44:14):
Where does you know?
You say AI does a lot of greatthings, which I'm sure it does.
That are over my head.
But where does AI excel?
Where is it serving us best?
Like what are those examplesthat you think AI is best?
Speaker 1 (44:30):
Well, look, it is
going to change the way we do
business day to day, and smallbusinesses, large businesses,
massive corporations are goingto have to use AI to stay
competitive.
We're past the point where wecould say put the genie back in
the bottle, so you're going tohave to adapt to it.
Like you said, use it in yoursongwriting.
Speaker 5 (44:51):
That's what I was,
just I got these saying to some
other guys that I write songswith we had to learn.
We're going to have to adapt ordie, yeah, but we got to keep
the human in the loop.
Speaker 1 (45:00):
That's the fact.
Yeah, we can't just let thisthing automate and take over,
because we need to work.
That's how we're built.
That's how we're made.
You know, god made us to findvalue in hard work.
Idea that we're all just goingto be sitting around um drawing
(45:25):
paintings all day and and beuseless is is is the exact
opposite of how god made us well, yeah, and ai has no soul right
.
Speaker 3 (45:32):
So, when it comes
down to being creative and I
think it could spit out wordsthat rhyme you know it's
actually pretty good, but I mean, we've seen, mean we'll joke
about it sometimes, but I mean Igot to believe.
I have to believe or I'll gocrazy that you know it doesn't
have a soul, it's not, itdoesn't feel that you know.
(45:52):
I hope that counts forsomething.
You know I'm just.
You know it's so terrifying inso many different ways.
You know on what's soterrifying in so many different
ways.
You know, and what we do, youknow whether you're talking
about.
You know music or movies andand that kind of stuff, what
it's capable of the capabilityof taking jobs.
Speaker 5 (46:09):
Oh, it has the under
huge capability of taking jobs
and two just on education.
Speaker 4 (46:14):
I can just say for
myself if ai was available I'm
the junior high in high school Iwouldn't never wrote a term
paper on my own, and I'm justtalking about the music business
.
Speaker 5 (46:23):
but I can't imagine
in the corporate world.
Speaker 4 (46:26):
It seems like the
students would be not as sharp,
because if you're able to usethat to do your papers and stuff
, if you're able to criticallythink.
Speaker 1 (46:35):
Ai can take all of
the heavy lifting and thinking.
But you have to be able tocritically think, and that's
just something we haven't taughtour kids.
We've taught them to bememorization right, and so
that's actually going to put usat a disadvantage.
We have to teach our kids tocritically think and then they
can use the tools.
Like I said, you know right nowyou have to learn math, you
(46:58):
have to learn the basics of math, but in real life you pick up a
calculator, right, or today'sworld, you just ask AI what the
math is right.
So you don't want to take awaythat tool, but you do want to
take away the cheating.
So one of the things that I tryto get my daughter to use AI for
(47:19):
is she has trouble in math.
I say, listen, I'll program theAI to test her, to help her, to
explain how the thing workswithout giving her the answer.
And that's a good use of AI ineducation, right, because you've
got a personal tutor all daylong and all you have to do is
(47:42):
do a little bit of work tocreate.
They call them agents.
I like to call them valetsbecause you know, there's IRS
agents, there's FBI agents.
They're all agents of control,and a valet is there to serve
you.
Speaker 2 (47:53):
By the way, you just
blew my mind Like why are we
having math this?
Why do the kids have math?
Speaker 3 (47:58):
I don't know, but it
could help me out because I'm
terrible at math.
But seriously, when do youbypass that?
Speaker 2 (48:02):
You would never
practically use math today ever.
Speaker 5 (48:05):
It's all Well if
you're measuring something you
know, like if you're buildingsomething or hanging a picture
you need to know something youunderstand, you're just talking
to a bunch of songwriters andwhile we discuss that we
actually have to take a break,we actually have to take a break
.
We figured that out.
Speaker 2 (48:16):
Yeah, we do have to
get to a break.
And a word from our sponsors.
We're here with Larry Ward.
He's blowing our mind.
Stick with us on the other side, we'll be right back.
Speaker 7 (48:27):
My name is Glenn
Story.
I'm the founder and CEO ofPatriot Mobile.
And then we have fourprinciples First Amendment.
Second Amendment, right to Life, military and first responders.
If you have a place to go putyour money, you always want to
put it with somebody that's likemine, of course.
I think that's the beauty ofPatriot Mobile we're a
(48:48):
conservative alternative.
Speaker 6 (48:50):
Don't get fooled by
other providers pretending to
share your values or have thesame coverage.
Go to patriotmobilecom.
Forward slash smalltown to geta free month of service when you
use the offer code smalltown orcall 972-PATRIOT.
Speaker 2 (49:06):
You know what goes
great with small town stories
Original Glory America's beerright here.
Speaker 5 (49:10):
You know I've been
drinking this every songwriting
session today.
Speaker 4 (49:13):
Man, that clean,
crisp taste reminds me of summer
nights on the back porch aftera fresh mowed lawn.
And they're just not makinggreat beer.
Speaker 2 (49:19):
They're investing in
America's small towns.
Well, it's just like us.
They believe in bringingcommunities together.
Speaker 5 (49:25):
Not only do they
invest in communities, but a
portion of each sale goes to theveterans and the first
responders and all the heroesthat protect us.
Speaker 4 (49:32):
For a limited time,
you can become a member of the
OG fam and invest in this beerat wefundercom.
Forward slash original brands.
Speaker 2 (49:45):
Join our original
glory family and help ignite
that original glory spirit.
All right, welcome back to the.
Try that in a Small Townpodcast At the Patriot Mobile
Studios, our new Patriot Mobile.
Studios, of course, powered byeSpaces.
It looks nice, doesn't?
Speaker 4 (49:56):
it.
It looks good.
Speaker 2 (49:57):
It looks nice and
we've got our friend Larry Ward
helping us break it in tonight,dude.
Speaker 5 (50:04):
No, I wanted to talk
about Gun Appreciation Day.
Okay, we're going to go there.
Speaker 2 (50:08):
Yeah, let's go
straight there.
Speaker 4 (50:11):
I like it I'm
exhausted from ai.
We'll come back, we'll circleback.
Speaker 1 (50:14):
I want to talk about
guns so, so, uh, my office at
the time was, uh, between thernc, the republican national
committee, and the nra lobbyoffice and which is a block, you
know, block away, and my wasright in the middle and they had
the third story above a subwayrestaurant next to bull feathers
(50:37):
, if you know the area, so theycome in down the street.
The uh gun gun controlprotesters two days after
newtown yeah screaming shame thenra.
And I hear them.
You know, I got my own ideas,so I get a little angry and
they're coming through and I seeit's like 200.
I'm like, oh, here we go, we'regoing to politicize these 20
(50:58):
deaths and they're going to havethese kids that you know.
Just awful tragedy.
So I lifted up my third storywindow a little bit like a
lunatic, stuck my arms out andscreamed arm the teachers as
they're walking by and as I'mscreaming, arm the teachers, arm
the principals, defend our kids.
(51:19):
I notice that there's about 40cameras pointed up at me.
Speaker 4 (51:24):
Oh, so I'm like oh,
did you have your shirt on or
off.
Speaker 1 (51:27):
It was winter Okay.
Speaker 4 (51:28):
It was winter.
I'm just trying to get thepicture For the listeners.
Speaker 5 (51:33):
I want them to have
or off okay, you could actually
see it.
Speaker 1 (51:37):
I usually keep it as
my Facebook photo.
I think I'd change, but, butthe, because it was a Washington
Post photo of the day, becausethe cameras are pointing up and
clicking at me and I'm likeleaning out the window.
Someone said it would look likethe movie network.
I'd never seen it, um, but butuh.
So I'm, I'm out there and I'mlike all right, so I'm in the
media.
Anyway, I'm in for a penny andfor a pound.
(51:58):
I went down, I took on theprotesters and they're screaming
shame the nra.
You know, you ever see aliberal protest?
oh yeah, it's not they're,they're horrible.
I mean like literally, they'renot even good at it.
You're right.
No, I mean they pick up the p.
They're horrible.
Speaker 5 (52:09):
I mean like literally
, they're not even good at it,
you're right, they pick up the.
Speaker 1 (52:13):
They're reading off a
piece of paper because they're
all paid.
Speaker 4 (52:15):
They don't know what
they're supposed to say and
they're going.
Speaker 1 (52:20):
It was almost like
Charlie Brown's teacher shame
the.
NRA shame the NRA and I'mlaughing at them and I'm saying
I'm the teachers and now all ofa sudden they turn around,
they're pointing at me shame,shame, shame I'm like back it up
a little bit because it's 200of them and I turn around and
(52:41):
it's like a high school fight,like a circle, so I have to
circle with the protesters.
The other half of the circlewas the media.
So I turned my back on theprotesters and I start talking
to the media and I did animpromptu press conference on no
way, it's awesome and and thatstory was about not about the
protest after new town, it wasabout the crazy guy hanging out
the window, um.
(53:02):
And so a couple of days later II'm waiting for the nra um to to
make an announcement.
They were kind of staying quiet, I think they were trying to be
respectful, so nobody was doinganything.
And every member I talked to onthe Hill said, oh, we're going
to have to pass gun control.
I mean, good, conservativemembers, Like what are we
supposed to do?
20 kids died.
And I'm like I said, what areyou talking about?
(53:24):
We're not going to pass guncontrol.
And so I'm like, all right,I'll do it.
At that point I was always kindof like in the.
I was a back office for a lotof these conservative operations
but I never really got in front.
So I'm like I'm going to do it.
So I sent out a press releasecalling for National Gun
Appreciation Day and I wasfollowing Mike Huckabee's
example because he had done aChick-fil-A Appreciation Day and
(53:46):
I figured we'll do a gunappreciation day, and that'll
make their heads explode, and itdid Within an hour.
I had like I think it was CBS,usa Today, nbc everybody was at
my doorstep and waiting tointerview me, one after another
after another, because theythought that if some crazy
person hanging out the windowwas going to call for National
(54:08):
Appreciation Day, we're going tomake him the poster child of
why we need gun control.
And all I did was was kind ofsay, hey look, if the principal
had been armed when sheheroically went up against adam
lanza I think his name was when,when, when he was in there to
kill all his kids, maybe shewould have saved a lot of lives,
but unfortunately she wasn'tarmed and she got.
(54:30):
And so the problem isn't guns.
The problem is, you know,gun-free zones.
The problem is that we don'tallow our schools to defend
themselves against crazy peoplewith guns.
And then the NRA came out laterand made the famous line you
know, a good guy with a gun isneeded to stop a bad guy with a
(54:52):
gun.
And so I did two weeks' worthof media.
I did more media, measured,than Kim Kardashian in those two
weeks, so I was pretty proud.
Speaker 2 (55:02):
Impossible.
Speaker 1 (55:04):
And the left-wing
media CNN, msnbc.
They could not get enough uh of, of well, of me on tv because
they, what they thought, theythought they were winning the
argument and they weren't.
Um, we were just exposingpeople to the gun rights
argument that they had neverheard on national television.
(55:24):
And I and I knew that was gonnawork because it's an, it's a
logical argument you make it,you're going to win it.
And so the conversation movedfrom gun control after Newtown
to gun rights.
Joe Biden guaranteed that guncontrol would be in the law of
the land by January 30th thatyear and we gave Obama and Biden
(55:48):
their only defeat as president.
They did not pass gun controland we turned out about 500,000
people on January 19th, allacross the country they went to
their gun stores, their gun shop.
The lines at the gun shows wereinsane.
Speaker 5 (56:09):
You couldn't buy
ammunition they were out of.
Speaker 1 (56:11):
They couldn't make it fast enough, yeah. And it was because we kind of opened their eyes. You know, my wife, even probably now, is going to kill me for saying this, because I said I'm going to go on CNN, and I knew what they were doing.
Speaker 5 (56:24):
She's mad at you for not taking that fifty thousand dollars.
Speaker 4 (56:27):
Brother, get them to a hundred, would you?
Speaker 1 (56:31):
But they went on CNN and they were bringing this woman who worked for MoveOn.org, and she was very nice to me in the green room, the whole nine yards, but she was a race baiter and she basically made a whole bunch of Republicans cry. I watched some of her shows and, you know, it was 2013, and she's going to try and pull the
(56:54):
race card on me. They already kind of started, because I had picked January 19th, because Obama's inauguration was the 20th, but the 21st was Martin Luther King Jr Day. So they were going to call me racist, because how dare I do this on Martin Luther King Jr weekend. And he had a day, not a weekend. I didn't know he had a weekend, but you know.
(57:17):
So I went on and I had this whole kind of spiel planned, and I said, look, before we get into anything, I just want to address the Martin Luther King Jr thing, because Martin Luther King Jr, if he were alive today, you know, he would support this. And I meant to say after that, which I didn't, I meant to say after that: Martin Luther King Jr, most people don't know this,
(57:41):
basically got started in civil rights because he was denied a gun permit when he was young, because he was black, and gun control was always racist. Right, that would have been a logical argument, but I told my wife I was going to say the next thing, and she's like, you better not say that.
Speaker 2 (57:57):
You better not say that. She's shaking her head right now, by the way, I can see it. You better not say that.
Speaker 1 (58:04):
You know, it's a
great argument.
And so I said, quite frankly, if slaves had guns, there wouldn't be slavery. And the woman that was there next to me lost it. She's like, what?
Speaker 2 (58:18):
Yeah, that didn't go
over well.
Speaker 1 (58:20):
You know, if guns were slaves, what are you talking about? What does that mean? And I said, you know, if slaves had guns, how could you make them slaves? Right, a logical argument. And what was hilarious was I get back to the green room, and before I left the green room I get a call and it's like, Larry,
(58:42):
is this Larry Ward? I'm like, yeah. He goes, I gotta have you on my show. I'm like, Reverend Al? Really? So I did Reverend Al's show for about 20 minutes and had a debate with him on civil rights, and it was incredible. Wow, check that out.
Speaker 5 (59:00):
What year was all
that?
Speaker 1 (59:01):
2013. The shooting was in 2012, the day was in 2013. You did a PBS interview.
Speaker 4 (59:10):
I got sucked into watching that today, and I watched the whole thing. It was like 15 minutes or whatever, and I was just kind of wondering, you and that lady who was interviewing you, were you guys friends after? Because y'all seem to have a lot in common.
Speaker 1 (59:22):
No, no.
Speaker 4 (59:23):
No.
I mean, she thought she was going to be right on every point, and you were so educated and clear that you just shot down every point that she had, because every time she'd ask a question, I'd think, that's a pretty good question, how's he getting out of this, you know? But you nailed it every time.
Speaker 1 (59:39):
It was really good. I appreciate it. I messed up a whole lot of interviews when I did Gun Appreciation Day the first time, so I got good. I got good at what they were coming with, right, and by the time I did the PBS interview a few years later, that was, that was awesome. Actually, that was my favorite, my absolute favorite voicemail I ever got in my entire life. It was right after that PBS interview. By the way, the PBS viewers are the most vile people on the
(01:00:03):
entire planet, because you think you're watching PBS. I mean, it is literally, you know, angry white female liberals.
Speaker 5 (01:00:11):
It ain't just Sesame
Speaker 4 (01:00:11):
Street, is it? Yeah, the AWFLs, because you had to educate us. Educate us, what are the AWFLs?
Speaker 1 (01:00:18):
They are the angry
white female liberals.
Those are the AWFLs.
Now they're so attractive.
Speaker 4 (01:00:25):
That's something we can remember. They are.
Speaker 1 (01:00:28):
They are just miserable. But I had a voicemail, and my wife used to make me play it all the time because it was hilarious. And she still calls me and she says, Larry Ward, you are an ass bag hole. I wanted to make it my ringtone.
(01:00:49):
I lost it somewhere. I wanted to make it my ringtone for a while because it's so funny.
Speaker 2 (01:00:55):
We got a couple of
those.
Speaker 4 (01:00:58):
Oh yeah, are you kidding me?
Speaker 5 (01:01:00):
After the song came out? Oh man.
Speaker 1 (01:01:04):
But I'll tell you, the next event that we did the same year, in 2013, was definitely my best day ever in business. Remember when Obama shut down the World War II Memorial and memorials all across, the open-air memorials, during the government shutdown? Yeah, and they put up those barricades.
(01:01:24):
We called for that same group, Special Operations Speaks, and my PAC, Constitutional Rights PAC. We put on an event called the Million Veterans March, and we turned out, I didn't know what it was going to show, right, we turned out 10,000 people, veterans, at the World War II Memorial, and they tore the barricades up.
(01:01:45):
We had Ted Cruz and Sarah Palin and Mike Lee and a whole bunch of others come in, and my wife and my mother-in-law. I told my wife, listen, I'm getting arrested today.
Speaker 6 (01:01:59):
There's no way.
Speaker 1 (01:02:00):
I'm not. I don't have a permit for this. We're going to set up inside the barricades. I'm going to jail. You're getting arrested? Yeah. Here's my lawyer's number. She goes, what am I doing with that? I'm going with you. So she came, and it was fantastic, it was a great day. And we show up and veterans just start taking the barricades down all on their own, laying them on the side of the road,
(01:02:24):
and at the end I get a little excited about this incredible event. It was maybe the most patriotic day I've ever been a part of. And I say, and now we take these barricades to the White House. I didn't really mean it, they're heavy, it's like eight blocks. I mean, they're really heavy, but they took every single one
(01:02:45):
of them out and lined them up, stacked them up on top of the White House fence.
There were choppers and MRAPs and mounted police, and the veterans stood them all down and we won the day. And it was incredible. And what was also great, I'm going to brag on her a little bit. So they had the honor vets, the World War II veterans, that
(01:03:09):
they fly in to see the World War II Memorial. There's only a few of them left, literally maybe a dozen or so, and at the time they were on a bus, and I didn't know that they had planned to see the World War II Memorial. And so everything was over, the roads were clear, but they were holding the bus, and I think they were holding them out of
(01:03:31):
spite. Janelle goes over and gets to the cop, maybe gets in his face a little bit, and says, you let those guys out so they can see their memorial. And he did, he did. And so the bus comes out and they get out of the bus to a
(01:03:54):
hero's welcome, everybody just coming straight through and everybody's cheering them and tears everywhere. It was, it was incredible.
Speaker 4 (01:04:04):
That's amazing.
Good for you.
Way to go, Janelle.
Thank God for Janelle.
Speaker 2 (01:04:15):
Hey, before we go, let's talk about Political Media. You've been there for over 20 years, obviously. You've been a good friend to us and we appreciate the work you're doing with us, but talk just a little bit about that and what it's all about. Well, I say we build marketing tech.
Speaker 1 (01:04:29):
We help, you know, great content creators, publishers and organizations get their message out, and do it in a way that we reach the maximum number of people we possibly can, and do it basically as cost-efficiently as we possibly can, because this is a mission-driven business. And you know, if we're going to save the country,
(01:04:52):
we got to save the people who are trying to save the country. We've got to give them a voice, particularly in the world of cancel culture, in the world of biased AI, right. We got to give our folks a chance to have their message heard, and so that's what we do.
Speaker 5 (01:05:12):
You want to talk about NextSeek at all?
Speaker 1 (01:05:15):
Yeah, NextSeek is going to be our AI platform and, you know, like I said, it's going to be built on.
Speaker 5 (01:05:21):
I didn't know if it
was too early or not.
Speaker 1 (01:05:23):
It's probably too
early.
Speaker 3 (01:05:26):
Breaking news here.
Speaker 1 (01:05:28):
But it's our platform that's going to be built on biblical principles, because, like I said, we do believe that that is the only way we can build AI that helps humanity flourish: with God's guidance. And I'll leave you with this. So this is a personal story to kind of give you some hope, because, you know, when I talk about this, and I was saying this during
(01:05:51):
the break, we talk about this all the time, I could see people going through the seven stages of grief.
Speaker 2 (01:05:59):
I'm still in the first one, yeah.
Speaker 1 (01:06:03):
So, I mean, because it is scary. I mean, think about what the loss is going to be. You know, it's not just that you lost your job. You lose the entire thing that you've worked for your entire life, your expertise, the thing that you built your career on, you know, your ability to make money. All of this could be in jeopardy if we don't get a hold
(01:06:25):
of this.
And so, you know, understanding that it can get really bad, here's a little bit of light. And, you know, it's a true story, because I'm going to embarrass myself a little bit. I am deathly afraid of turbulence, like it's,
(01:06:46):
it's getting worse and worse as I get older. Right, and so I'm on a flight back from Jacksonville, where we're trying to host, with another client, a revival, a Christian revival, next year for the 250th anniversary. It's going to be revival250.org, if you guys want to check it
(01:07:07):
out. But I'm on a plane from Jacksonville back to DC, and I get buckled in, the door closes, and the pilot gets on the line and says, it's going to be rough.
Speaker 2 (01:07:26):
That's what you want to hear, right?
Speaker 1 (01:07:28):
This is, this might be the worst flight you'll ever be on in your entire life. We're flying. I mean, literally, he was just laying it on.
Speaker 2 (01:07:34):
Is that what he said? Pretty much, yeah. Oh my God. He's like, we're flying into a thunderstorm.
Speaker 1 (01:07:39):
There's thunderstorms all up and down the East Coast the entire way there. We're going to be in or over a thunderstorm, and it will be okay, we're going to get there, we're going to get through it. But we're going through it. And we flew into like a violent
(01:07:59):
thunderstorm in DCA. And so I'm like, immediately I'm on ChatGPT, what are the weather patterns?
Speaker 6 (01:08:02):
What is this guy?
Speaker 1 (01:08:03):
And I'm trying to get as much information as I possibly can before we take off. And I lose internet for a little bit, and I'm like, all right, I'm going to just have faith, right, I'm going to have faith in God. So I download a couple of worship songs, and I'm listening to them, and it kind of calmed me down a little bit. And then I hear a voice in my head and it says, turbulence is
(01:08:24):
like driving over a gravelly road or hitting a pothole. The car is made to take it. You'll be okay.
Right, and that voice went in my head, and the first thing I wanted to do is, let me just ask ChatGPT. And I'm like, nope, I'm going to let God, I'm going to just trust God, He's going to get me through this. This is it. And I heard a voice, a second voice in my head, same
(01:08:46):
voice, but it said, no, go ahead. No, go ahead. So I'm like, okay. So I get on ChatGPT and I type, what is an analogy to, you know, turbulence, what is it like? That's all I asked, and the answer came back: it's like
(01:09:11):
driving over a gravelly road or hitting a pothole. Exactly what it had just told me. And it reminded me God's in control of all of it. He's in control of the AI, He's in control of the technology, He's got it.
Speaker 5 (01:09:30):
And so you know, yeah
, there's evil in the world and
AI can be our evil overlord and we can turn into the Matrix.
If you worship it, it can be.
Speaker 1 (01:09:40):
Yeah, and the Terminator, but God's in control.
Speaker 5 (01:09:42):
Amen, amen.
What a way to end this.
That is a great way.
Speaker 2 (01:09:46):
Quick answer: is ChatGPT your go-to AI? No? What is?
Speaker 1 (01:09:50):
Chat G-O-D. No, I love that. That is great.
Speaker 2 (01:09:54):
What are you talking
about?
I love that.
Speaker 6 (01:09:56):
See, you couldn't
come up with that.
That's good.
That's mine, boys.
What do you go to?
Speaker 4 (01:10:01):
Now, we're a team,
Neil.
We're a team, I mean, we have contracts, Neil.
Speaker 1 (01:10:10):
Of all the LLMs that are out there, I think Anthropic has the best terms of service, and they're probably the closest aligned to the mission of having responsible AI. So I'm a fan of Claude, and I use Anthropic almost exclusively for our business.
Speaker 2 (01:10:24):
Okay, that's good
information.
Speaker 1 (01:10:26):
Guys, are your minds blown?
Speaker 4 (01:10:27):
Yes, I love it.
Speaker 2 (01:10:28):
I'm exhausted.
Speaker 5 (01:10:30):
We need to have
Janelle on.
Speaker 2 (01:10:31):
I'm frightened. Sounds to me like somewhat inspired.
Speaker 3 (01:10:35):
All in between. I got to believe it's, like you said, the guardrails are the key. I don't know how that happens or how long, but it feels like it needs to happen quick, quick.
Speaker 1 (01:10:47):
We don't have much
time.
Speaker 3 (01:10:48):
No.
Speaker 2 (01:10:48):
Literally.
Speaker 5 (01:10:56):
You know, we don't have much time. No, literally, we don't have much time. All right, if we don't right the ship, it's going to be almost impossible.
Speaker 4 (01:10:59):
Did you say shit or ship? Both. Yeah, well, but we like the hope that you gave us, you know, and the listeners, you know, because none of this stuff surprises God.
Speaker 1 (01:11:06):
He's not like, oh gosh, AI, what's that?
Speaker 5 (01:11:07):
Right, you know. Well, we're thankful for you.
Speaker 2 (01:11:12):
We're thankful for you, for stopping by, coming to our neck of the woods.
Speaker 5 (01:11:16):
I got to know who paid for dinner, you or Wade?
Speaker 2 (01:11:18):
Wade. Hey, Wade, as well he should. Somehow, I feel like we paid for it.
Speaker 1 (01:11:26):
He's getting
expensive.
Speaker 2 (01:11:27):
Absolutely,
absolutely.
Thank you for coming, Larry Ward. Of course, for TK, K-Lo, Thrash, I'm Kurt. This is the Try That in a Small Town podcast.
Speaker 6 (01:11:38):
Thanks for listening, guys. Make sure to follow along, subscribe, share, rate the show, and check out our merch at trythatinasmalltown.com.