
April 8, 2025 58 mins
Those Buzz Guys talk about Val Kilmer, dire wolves, woolly mammoths and crazy AI news. What about the honeybees?

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Three two one. Hey, Buzzheads.

Speaker 2 (00:09):
Welcome to us.

Speaker 1 (00:10):
Video's head radio, close Radio Close had radio?

Speaker 2 (00:20):
Okay, could you answer that?

Speaker 1 (00:24):
Put wrong?

Speaker 2 (00:25):
Podcast?

Speaker 1 (00:27):
God, that always gets confused.

Speaker 3 (00:29):
Okay, hey mister, we're back for another episode of
the Buzzhead Radio podcast.

Speaker 2 (00:39):
And no we're not a radio station.

Speaker 3 (00:41):
We are podcasts that used to be a radio station
that would like to be a morning radio show. If
there's anybody out there with a radio show and no
morning program, Yeah, give us a ringy dinghy at five
eight oh five four one three eight oh five. We
would love to negotiate a million dollars deal with you.
Or you can email Buzz at BuzzheadMedia dot com. Get

(01:04):
ahold of us and let us know what you're
up to. And if you guys are listening to us
from another country other than the United States of 'Murica,
let us know.

Speaker 2 (01:14):
We'd love to hear.

Speaker 1 (01:15):
Oh, we've got them all over the place.

Speaker 2 (01:17):
What country you're listening to this podcast from?

Speaker 1 (01:19):
Uh?

Speaker 3 (01:21):
Dave called. So here's... so, Dave, we've been doing this
for quite a while, and the phone number
has always been the same. So five eight oh five
four one three eight oh five is the Buzzhead Hotline, and that
is for Seventies Buzzheads and Buzzhead Radio buzzheads. So we

(01:43):
do, you know, depending on what you guys are
talking about, we'll throw whatever your question is on the
most appropriate podcast, so we don't.

Speaker 2 (01:53):
Need two different numbers. Yeah, so that's how we know Dave.

Speaker 3 (02:00):
And really most of his questions were seventies related, other
than that one, just kind of reiterating to let him
and everybody else know that there aren't.

Speaker 1 (02:10):
So, on this podcast: United States, Canada, Australia, United Kingdom, Czechia. Czechia,
it's C-Z-E-C-H-I-A. Spain, Argentina, Bangladesh, Germany, Ireland,
Sri Lanka and Norway.

Speaker 2 (02:29):
Man. If you're from Bangladesh, email.

Speaker 3 (02:32):
Or call let us know what you're up to in Bangladesh.

Speaker 1 (02:36):
Where the heck is Bangladesh, cool Mouth Division? Where is Bangladesh?

Speaker 2 (02:44):
Is it in India? I don't know.

Speaker 3 (02:45):
I hate to guess. I'm going with India, but
I could be wrong. I've been trying to study maps
and geography so I can remember where all these places are.

Speaker 2 (02:58):
Bangladesh.

Speaker 1 (03:01):
I think it is India.

Speaker 3 (03:02):
Yeah, that's kind of what I'm thinking, I think. Yeah,
so I saw another documentary this weekend, and especially since
we just had the earthquake in Myanmar that like
totally wiped out the town. Well, then, I think in twenty fifteen,
there was a huge earthquake in Kathmandu, and Kathmandu

(03:27):
is, I believe, in Nepal, which is like where
Mount Everest is. And so this documentary was about people
that were in Kathmandu, people that were in this
other valley that was just a ways from
Mount Everest, and then a group of people that were

(03:49):
on Mount Everest. And it was one of those big
earthquakes that wiped out Kathmandu. There were like
buildings down, and it just looked like, you know,
a huge tornado had just wiped out the town. But
what was really interesting was the people up on Mount Everest,
because I just watched that Mount Everest special. There's

(04:14):
base camp and the base camp has gotten like its
own it's like its own town now.

Speaker 2 (04:20):
It's so big, but it's all tents and stuff.

Speaker 3 (04:22):
And then you leave base camp and you go up
to I think there's I don't know what they call them,
camp number one, number two, number three, and I think
maybe number four, and you go to each of those
and then you kind of go back and forth and
you kind of you have to get used to the
altitude too. And then one day your guide says, we're
going for the peak, you know, and you just I

(04:44):
don't know how they decide what day, but anyway, what
they were doing was they were interviewing some people that
were on the like number one camp above the base camp.
And when that earthquake hit, it caused a huge avalanche
and it wiped out the base camp. I think it

(05:05):
killed... I don't know if it killed everybody, but it
killed almost everybody in the base camp. So the
people that were up on the higher camp, they were
kind of trapped, because the
avalanche covered the trail and you couldn't see the trail anymore.
So they got helicoptered out, I believe, but

(05:27):
to try to get the people back and forth as
quick as possible, they were taking them from that camp
down to the base camp. Well, when these people
got to the base camp, there was nobody there. There
was no tents, it was wiped out. And then in
this other valley there was this cool little town and
in this cool little town that was in this valley,

(05:49):
there was this peak that people would climb up to
and look down in the valley.

Speaker 2 (05:54):
Because it was like this beautiful view. And so there
was a whole.

Speaker 3 (05:58):
Bunch of people that were up on this peak when
the earthquake hit, and there was like a
building up there, and they all kind of got
trapped in it. And they were all safe, but they
didn't have any food because they were just up there
to like take pictures and stuff. So somebody said, well,
let's go back to the village and get some food
and water. And they went down in that entire village

(06:20):
had gotten wiped out by an avalanche. I mean it
was literally all the houses were gone.

Speaker 1 (06:25):
Wow.

Speaker 3 (06:26):
And so they had to get helicopters in to get
that group of people out. And then they had different
stories from Kathmandu of saving people under hotels,
and it was crazy. And then I had to look
it up, because I was like, there's no
way, even though it was in twenty fifteen, that they
could have rebuilt. And yeah, so even today, the

(06:47):
whole of Kathmandu's not even rebuilt. There's still areas that
I guess look like they did the day the earthquake hit.
So anyway. And then, I don't think we've talked about this, I
watched a documentary on a guy named Johnny Strange.

Speaker 2 (07:02):
Have I talked about that?

Speaker 3 (07:03):
He did the wingsuit flying. So there was this kid.
I don't know how much you guys want to hear
about documentaries, but it was kind of cool. This kid,
he was one of those kids from California
that was wild, that couldn't get
enough speed and enough flying, so he started, uh, parachuting first.

(07:25):
He was a guy that had climbed
up on top of a car and was going down
the highway, surfing on top of the car in California,
and got reported and it made big news and he
had to get on and apologize. But then he was
also a guy that got on a skateboard and went
down the highway one oh one on a skateboard and
somebody was filming him. So he was always

(07:47):
doing these really wild things. And eventually he got into
the wingsuit jumping. And there's different levels of I think
it's called wingsuit diving. There's some people that just
dive out of hot air balloons or off of buildings,
and then you just kind of go down. And then
there's people that jump off of mountains and then they

(08:08):
fly really high above the tree tops. And then there's
the very few that actually jump off and they go
in between the trees like they're like literally twenty feet
above your head when they fly by. And he
had gotten to the level where he was one of those,
but he was, I think, like maybe twenty three,
so he didn't really have the experience to be doing

(08:31):
what he was doing, but he just kept pushing and anyway,
he I don't want to spoil the ending, but I'm.

Speaker 1 (08:39):
Assuming he's not with us anymore.

Speaker 3 (08:41):
They say no wingsuit guy usually makes it past
six years. They almost all die within six years.

Speaker 2 (08:48):
And why are they doing that?

Speaker 3 (08:50):
That's... yeah, people ask why, and they're like, we
don't know, just because it's there.

Speaker 1 (08:56):
I guess. I watched the documentary about the OKC
bombing, because... yeah, I think Denise was watching it.
It's its anniversary this year, thirty years. Not really much
new stuff, but.

Speaker 3 (09:10):
Just I think it had a lot of interviews with
people that were there covering it and stuff.

Speaker 1 (09:13):
Yeah, and Robin Marsh. Yeah, it was her very first job.
Oh wow, she'd only been there a few days. She's
in it quite a bit and talks about how they
just kind of threw her out there and you know,
it was a crazy time and she's still on.

Speaker 2 (09:29):
Yeah, she paints.

Speaker 1 (09:31):
Now.

Speaker 2 (09:31):
Have you seen any of her paintings.

Speaker 1 (09:33):
I guess not.

Speaker 3 (09:34):
Yes, some really good ones. She's kind of got a style that
I would like. When I paint, if I
really paint, it'll be kind of like her style.
She kind of does like big portraits or big flowers, colorful, splashy.

Speaker 2 (09:48):
She does really good, really good art.

Speaker 3 (09:50):
But yeah, so we got to talking about that. So
whatever day... I can't remember what day the explosion was,
but Teresa and I had gone down like about three
days after the explosion. It was just one of those
deals where you were close enough that you just had
to go see it. And so we went down and

(10:12):
they had just put the chain link fence up,
but they were still searching. So you could
stand by the chain link fence and the spotlights were on,
and I took a ton of pictures. But of course,
when we split, I said, you know, take what
you want, and she took everything. So I don't

(10:32):
have any pictures, any photos from that, but I had
gotten quite a few pictures of the building.

Speaker 1 (10:39):
Yeah, we went down there. I think
we went before they demolished it. Uh, it had
all the stuff stuck in the chain link. Yeah, people were
sticking flowers and teddy bears in it, and yeah. And
it aggravates me every time they say how many people died,

(10:59):
but they don't take into consideration... I think there
were three women that were pregnant that died. They don't count those.
I'm like, well, that's not right.

Speaker 2 (11:06):
Yeah.

Speaker 3 (11:07):
And it's interesting, because they interview Stephen Jones, and he's
from Enid and ended up defending McVeigh.

Speaker 1 (11:16):
Yeah, and he points out at the very
beginning that he did not request to do that.
He was told to do that and he had to
do it, which is not true at all. He didn't
have to. But, you know, he made a lot
of money.

Speaker 2 (11:37):
I'm sure he did.

Speaker 1 (11:37):
But yeah, that was a no-win case, where
there was almost no way you were going to win, no.
And the guy did deserve a competent, uh, defense. Sure, yeah,
I mean, and, you know. But yeah, they
never talked about the second guy, or the alleged guy,
that they never found. But what was it, no license plate?

Speaker 2 (12:03):
Really?

Speaker 3 (12:05):
Yeah, and there's kind of a... I'm trying
to remember what the story was on that. If I'm
remembering right, Teresa's mom and a friend were driving north
on I thirty five after the explosion. I
don't know if it was the day of or the

(12:26):
next day. I think it might have been the
day of. You know, they had a flat
tire, and a highway patrolman pulled over to help them
with their flat tire, and after he left them,
he's the one that pulled over McVeigh. So had they
not had a flat tire, the timing might not have

(12:50):
been where that highway patrol was behind him without the
license plate.

Speaker 1 (12:54):
Wow, So that's crazy.

Speaker 2 (12:56):
That is kind of crazy.

Speaker 1 (12:57):
I mean, you go to all this trouble to
plan this truck bomb, and you take a getaway
car with no license plate. Yeah, but it's just, yeah,
was he just asking to get, well, stopped?

Speaker 3 (13:14):
Well, he just probably... even if he got pulled over,
you know, he probably thought, who would think I'm the
one that blew it up?

Speaker 2 (13:21):
You know.

Speaker 3 (13:22):
So, what's really cool for me and Todd is we
are going through a citizens academy for the Oklahoma State
Bureau of Investigation right now. And each week they
tell us stuff about the OSBI. Like last week
it was about a case that they solved after twenty years,
and next week there's going to be a cold case.

(13:43):
But it's really cool getting to... I told Todd, if you're
ever thinking about committing a crime,
take this class, because they will inform you on how
they're going to catch you. Because, I mean, it's crazy,
some of the ways they catch people. It's not even technical,
it's just circumstance.

Speaker 2 (14:05):
You know, it's just real wild. But anyway, it's
kind of cool.

Speaker 1 (14:08):
Son, if you're gonna commit a crime, don't do it with anybody,
do it by yourself.

Speaker 2 (14:12):
Definitely do it by yourself, or somebody will talk.

Speaker 1 (14:15):
They always talk. They always talk, always talk. So did
you see they're bringing back... there's a new word, de-extinction.

Speaker 2 (14:25):
Oh yeah, that's on my list right here.

Speaker 1 (14:27):
Yeah, the dire wolf.

Speaker 3 (14:29):
Yeah, and so there's, you know, a little bit of
a controversy whether it's really a dire wolf, or
is it a gray wolf that they changed the color
of the fur on. You know, I guess they're kind of
saying it kind of depends on what your definition of
extinction is, or whatever. Brought back after more than

(14:54):
ten thousand years. And it's a biotech company, I believe,
out of Texas. Yeah, and these wolves were actually
born in October of twenty twenty.

Speaker 1 (15:02):
Four. Romulus and Remus.

Speaker 2 (15:07):
Yeah. And there's actually been three. The third one's younger,
and it's a female.

Speaker 3 (15:12):
The two that are together are boys, and then
there's a third that's a female. And the company achieved
this feat by cloning and gene editing techniques based on
two ancient dire wolf DNA samples. They took a thirteen
thousand year old tooth and a seventy two thousand year

(15:34):
old skull and made healthy dire wolf puppies out of them.

Speaker 1 (15:39):
How do you do it?

Speaker 2 (15:42):
Let's see, They explained.

Speaker 3 (15:43):
The de-extinction process involved taking blood cells from a
living gray wolf, the dire wolf's closest living relative, and
then genetically modifying them at twenty different sites. So basically,
they looked at the differences in DNA between the
gray wolf and the dire wolf, and then replaced in the

(16:07):
gray wolf's DNA the parts from the dire wolf.

Speaker 1 (16:13):
So where were these creatures incubated?

Speaker 2 (16:16):
Okay?

Speaker 3 (16:16):
Then the genetic material is transferred to an
egg cell from a domestic dog. Then the embryos were
transferred to surrogates for gestation and, finally, successful birth.

Speaker 1 (16:30):
Okay, so it sounds like a dog, a domestic dog,
carried them. Okay.

Speaker 3 (16:37):
And now that they've got the wolves, they're
not going to bring the species back. It's
only these three, and they live in a, you know,
a place where they're in a fence,
and they're not ever getting out of there.

Speaker 2 (16:53):
And they don't think they're gonna let them breed. They
don't think.

Speaker 1 (16:56):
But yeah, this is where it all starts, folks.

Speaker 3 (17:00):
So you think AI is your problem? Scientists are on
the verge of resurrecting the long-extinct Dodo
bird and the woolly mammoth.

Speaker 2 (17:10):
And there's another one and another one.

Speaker 1 (17:12):
It is the thylacine, also known as the
Tasmanian tiger, or a Tasmanian devil, in Australia.

Speaker 2 (17:25):
So yeah. So with the woolly mammoth... well, the first.

Speaker 1 (17:28):
The Dodo bird, a flightless bird that became extinct in sixteen
eighty one because it was too friendly and it couldn't fly.
And so when man showed up on the island or wherever
they were, they started.

Speaker 2 (17:42):
I guess, killing them to eat them.

Speaker 3 (17:43):
And then the animals that man brought killed them because
the Dodo bird was so friendly it didn't know anybody
was going to try to kill it, so.

Speaker 2 (17:51):
It killed them.

Speaker 1 (17:52):
All. Hey, guys, what are you bad?

Speaker 3 (17:54):
Yeah, exactly. And they think they'll have them back by
twenty twenty eight. And then with the woolly mammoth, its
closest genetic relative is the Asian elephant, which shares
ninety eight percent of the same DNA. They are working
on reviving it by editing the remaining two percent with

(18:16):
the mammoth's genes. The mammoth-like elephant is expected to
have characteristic features such as the fuzzy fur and blubber
to survive cold climates. The timeline for the embryo is set
for twenty twenty six, with the first animals possibly being born
by twenty twenty eight. And then again, the Tasmanian tiger.

(18:39):
It's almost cruel to not let them breed.

Speaker 1 (18:43):
I mean they're just going.

Speaker 2 (18:44):
To die, going to go extinct a second time.

Speaker 1 (18:48):
That's weird.

Speaker 2 (18:49):
Yeah, I mean, I guess the question is, what
are they doing this for? What's it to...

Speaker 1 (18:59):
Help with the... so species that are almost extinct
now don't become extinct?

Speaker 2 (19:06):
Yeah, but, you know, I say that's
part of nature. Nature. I mean, you can't...

Speaker 3 (19:13):
Yeah, let's don't stop everything from being extinct, because some
things may need to go extinct at some point.

Speaker 1 (19:20):
I forget how many it was. I Googled it there.

Speaker 2 (19:21):
Yeah, there's a lot of things that go extinct
all the time. Yeah.

Speaker 3 (19:27):
So, to do the Tasmanian tiger, they are using
a dunnart, a small marsupial mouse, which is the
closest living relative. It poses significant challenges, because the tiger
is much larger and more carnivorous than the tiny dunnart.

Speaker 1 (19:46):
Estimates suggest that ninety nine percent of all species that
have ever lived on the Earth are now extinct, potentially
reaching as many as five billion species.

Speaker 3 (19:55):
Well, yeah, it's basically like saying, let's bring the dinosaurs back.
I mean, there's a reason the dinosaurs went extinct, and
there's a reason all the others did. I mean, I
know some of them went extinct because of man, but.

Speaker 2 (20:09):
Some of them didn't.

Speaker 3 (20:10):
So then I found an article kind of related to
that whole thing, but not exactly. And I consider
this to be a climate change activist fear-mongering article.

Speaker 2 (20:26):
And I'll explain why.

Speaker 3 (20:27):
But it says many scientists are focusing their
research on a hypothetical future disease X. According to a new study,
the answer could actually lie in the Arctic, where the
next epidemic or pandemic is going to come from. Now, what

(20:48):
they didn't say in this article is they make it
sound like the last pandemic we had in twenty twenty.
They make it sound like it came from nature. It
didn't come from nature. It came from a lab of
human beings screwing with viruses. So it did not come
from nature. But they're trying to make it sound like

(21:09):
it came from nature. And now the next one, which
could be any minute now, is going to come from
the Arctic. Scientists have warned that melting ice at the
North Pole could unleash zombie viruses. Oh, hold on, could.
It doesn't say will, it says could unleash zombie viruses with the

(21:31):
potential to trigger a new pandemic. The so-called Methuselah
microbes can remain dormant in the soil and the bodies
of frozen animals for tens of thousands of years, but
as the climate warms and the permafrost thaws, scientists are now

(21:51):
concerned that ancient diseases might.

Speaker 2 (21:56):
Infect humans.

Speaker 3 (21:58):
The study's co-authors say climate change
is not only melting ice, it's melting the barriers between ecosystems, animals,
and people. Permafrost thawing could even release ancient bacteria and
viruses that have been frozen for thousands of years. Have

(22:21):
they not ever seen The Thing?

Speaker 2 (22:23):
The Thing? And there's got to be at least two
more Arctic

Speaker 3 (22:28):
movies where some being comes out of the ice. Oh yeah, yeah.
So anyway, it's already happened, people. So you guys aren't scaring

Speaker 2 (22:35):
Us at all. We ain't scared. We ain't scared.

Speaker 3 (22:40):
And then this one's kind of sad. The largest ever
US honeybee die off has destroyed one point six million colonies.
Beekeepers often experience some seasonal losses, but this past winter,
more than half of all US honeybee colonies died off,
potentially the largest loss in US history. And they say

(23:03):
part of it was because of the winter and the cold.
But then they're saying part of it they don't know.
They don't know the exact reason. And,
according to some things I've read, we will
die without the honeybees. That's what they say: if you
don't have honeybees pollinating and cross-pollinating, we all die.

(23:26):
They're not the only thing that pollinates, though. Well,
and I think there's another kind of bee that isn't
having as much trouble as the honeybee.

Speaker 2 (23:33):
That helps.

Speaker 3 (23:35):
But anyway, so they say... you know, like I did
my little wildflower garden last year. They say plant more
wildflowers, plant things that will attract bees, and give them more,
because they're losing that too. They're losing a
lot of their

Speaker 2 (23:49):
You know, flower areas and things like that.

Speaker 1 (23:52):
So yeah, you know, Justin was doing honeybees.

Speaker 2 (23:56):
Is he not doing them anymore?

Speaker 1 (23:57):
He... you know, not so much. I mean, they're still
out there, and they're kind of on their own now.
You know, he was feeding them and stuff, and he
was going to ramp up big. I mean, he
made these concrete beehives, lightweight concrete. He got these
molds from somewhere in Australia or something and learned how
to make lightweight concrete, because the hives just fall apart,

(24:19):
the old wooden hives, which they still use a lot of,
they just fall apart. And so he thought... and he
had sensors in there. He could test it,
he could check the humidity, the temperature, all this stuff,
and it would show up on his phone, and all this cool stuff.
And he was going to make... we were going
to build a big building and just crank out,

(24:41):
you know honey.

Speaker 2 (24:42):
I was going to say, did he start selling honey?

Speaker 1 (24:43):
Well, then he realized that there's not... and I was like,
well, why hasn't anybody done this? I mean, all
honeybee places are relatively small. How come you don't
do it on a big scale? Problem is, there's not enough.

Speaker 2 (25:00):
Flowers.

Speaker 3 (25:01):
Uh yeah, oh yeah, I guess they would need a
lot of flowers in a concentrated area.

Speaker 1 (25:08):
So that's the reason he didn't pursue it. He's like, well, crap,
you know, you can't.

Speaker 3 (25:13):
You can only... you've got a limited supply, depending on how
many flowers.

Speaker 1 (25:16):
Yeah.

Speaker 2 (25:17):
Wow, so interesting?

Speaker 1 (25:20):
Well, uh sad news?

Speaker 2 (25:26):
Oh real quick?

Speaker 3 (25:27):
Getting back... somehow I jumped way ahead. Gretchen called.

Speaker 2 (25:32):
Oh yeah. Gretchen talked about Survivor.

Speaker 3 (25:35):
She watched the episode, I think, that you and
I had talked about, where Eva had her coming
out as autistic. Uh, so she went ahead and watched that.
I'm not gonna say why, but she quit watching Survivor.
But she watched that episode, and now she's missed a
Survivor, and she's trying to decide, should she go back

Speaker 2 (25:55):
To watching Survivor.

Speaker 1 (25:56):
Yes, you should.

Speaker 3 (25:58):
You should have never stopped no matter what you think
of whatever. Yeah, you should keep watching Survivor. And then
she talked about treasure hunting, because we talked about some
treasure hunting, and she told the story of the poor
guy that had his Bitcoin on a flash

(26:18):
drive and his wife chunked it, and I kind of
remember that story. They never found it,
and it had like millions of dollars, you know, millions
of dollars' worth.

Speaker 1 (26:29):
of. So what happens to those dollars?

Speaker 2 (26:33):
You can't get them back, so they just.

Speaker 3 (26:36):
Stay out there forever. I mean, without the password or
whatever, you can't get them. So I guess it just...
I don't know how all that works. I don't
know. Yeah.

Speaker 2 (26:49):
I kind of tried to learn some of it.

Speaker 3 (26:53):
It's just kind of like, well, I don't know,
I guess kind of like the Internet was, and now
like AI is. I just don't understand enough.
I need to know where it's going before I'm going
to spend a lot of time, and even
up till today, I don't see that it's
going anywhere, is it? I mean, I assumed it was

(27:16):
like some type of a financial system and currency that
people were going to start using. But I never go
into a store and see people using it, and they
never present me with, hey, you can pay
with crypto, you know. I'm not seeing it
being used anywhere, so I don't know why people are
buying it, or what's the purpose of

(27:37):
it if it's not starting to become part of the
mainstream or taking over what we're using now. So anyway, yeah,
I don't know. I guess we'll wait until it gets
a little further down the line and see. But, man,
AI is getting scarier and scarier every day.

Speaker 1 (27:55):
Yeah, Gary Vee says, if you're not using ChatGPT
or, you know, AI anyway, you're dumb.

Speaker 2 (28:05):
Yeah.

Speaker 3 (28:05):
But even the way that most people are using
it is not even what they're talking about. You know,
like if you're asking it to write copy for an
ad you're doing, that's not what they're talking about.

Speaker 2 (28:16):
They're talking about.

Speaker 3 (28:18):
Finding prompts and getting the AI to, like, literally learn
everything about you and your email, and it answering
all of your emails for you, and it doing, just,
you know, just mundane everyday things you teach it.
I mean, you can train it to turn things on

(28:41):
and off, and it's just crazy. I mean, I've seen it.
That's probably my favorite part of TikTok, when
I run across a video of somebody explaining
something new that they've done with AI, or something that
AI does. It's just so crazy, and I just
can't believe I'm not using it more. But I'm heading that

(29:04):
way. Again, it's just one of those deals where
I just don't know enough about it. They did have
a class, but I think it was more of an
information class, out at Autry. I probably should have gone,
but I need to, like, take some class to
learn it more. So anyway, I was headed
towards the bad news, and the bad news is that

(29:25):
Val Kilmer passed away.

Speaker 1 (29:27):
Oh yeah, so.

Speaker 3 (29:30):
He was one of those actors that, I don't know,
you just didn't really think about all the time. He wasn't
like a super leading man, but every movie he was
in, he was like super enjoyable.

Speaker 1 (29:44):
I saw a video where he said, if he could
have redone a movie... he
would have rather done The Saint
than Maverick. Oh, really? Yeah, he said that was his

(30:06):
favorite movie he ever did. I love that movie too. Yeah,
it's one of those where he has all the different characters.

Speaker 2 (30:14):
Yeah.

Speaker 1 (30:15):
Yeah.

Speaker 3 (30:15):
So I've got a list of basically some of his
top movies, and I noticed that a lot of them
are popping up on the streaming services. So, if
you're not quite sure who Val Kilmer is or
what his movies are, here's a list of some of
his top movies. I thought he was fantastic in Real
Genius from nineteen eighty five, and that's... I was

(30:35):
gonna say that's when he was young. And then we
got Top Gun of course in eighty six. Now this one,
I don't think I've ever seen Willow.

Speaker 2 (30:45):
From eighty eight. I know what Willow is,
but I don't think I've ever watched it.

Speaker 1 (30:49):
Yeah, it's kind of a hobbity, yeah,
fantasy type.

Speaker 3 (30:56):
Yeah, for some reason, I don't think I've
ever seen it. And then of course he got accolades
for The Doors in ninety one.

Speaker 1 (31:04):
Yeah, and I'm not a huge Doors fan, but man,
he did that.

Speaker 2 (31:06):
See, I haven't watched it because I'm not a Doors fan.

Speaker 1 (31:09):
Oh no, it's good.

Speaker 3 (31:10):
But I wasn't a Bob Dylan fan either, and I
loved that movie.

Speaker 1 (31:14):
I mean, and he does all the singing.

Speaker 2 (31:17):
Yeah.

Speaker 3 (31:17):
I heard... I read somewhere that somebody
famous was auditioning for the movie, and they had them
all go outside, because there were so many people auditioning.
People were all sitting outside smoking cigarettes and waiting for
their auditions and stuff. And they said this sports car
just came blasting down the street, screeched to a halt

(31:41):
in front of wherever this was. And this guy
jumps out, and he's got sunglasses on and his hair,
and, no, I don't think he had a shirt
or shoes on, and went straight into the building. It was
Val. He went straight in, and somehow got in and
did his audition, and that's how he got the part.

Speaker 1 (32:00):
Yeah. They said he wanted to do the singing himself,
and they didn't want him to. They wanted him to lip
sync everything. He said, tell you what, I'm going to record,
you know, a famous Doors song, and if you can
tell the difference...

Speaker 2 (32:14):
If you can tell you if if it's you can noticeable.

Speaker 1 (32:17):
Yeah, if you know, if you
can tell me which one's me and which one's, uh,
Jim Morrison, then I'll do what you ask. And they couldn't.
So yeah, I guess I'll have to watch it. Didn't
it have Meg Ryan in it? Boy, it's been forever
since I've seen it. It had somebody I liked in it.

Speaker 3 (32:37):
I'll have to watch it. And then Tombstone. He was
great in Tombstone in ninety three. I have never seen
Batman Forever from ninety five. All right, Heat. Loved Heat,
from ninety five. This one I'm not sure I've seen:
Kiss Kiss Bang Bang from two thousand and five.

Speaker 1 (32:53):
I don't think I've seen that one.

Speaker 3 (32:55):
I did like, even though it wasn't really a part,
uh, Top Gun: Maverick in twenty twenty-two. It was just
more of a cameo. True Romance from ninety three, Thunderheart
from ninety two. You want to go way back? We've
got Top Secret from eighty four. That was his first,
I think. And those are kind of his... and he

(33:16):
did more movies than that, but those were kind of
his big Yeah.

Speaker 1 (33:20):
I goofed up the other day. It was right after
he died, and all these videos... you know, people were
posting all this stuff, and they
were posting mostly Tombstone references. I'm like, man, I haven't
seen that movie in a while. It was midnight.

Speaker 2 (33:35):
Oh, it's a long movie.

Speaker 1 (33:36):
I started watching it and I realized it's like almost
three hours.

Speaker 2 (33:39):
It's a long movie.

Speaker 1 (33:41):
Next day was rough. I was like, oh my god,
I can't believe I did that.

Speaker 3 (33:45):
Have you seen his documentary? Yeah, yeah. What's it
called? My Huckleberry?

Speaker 2 (33:53):
And that's been a controversy, what did he
say and whatever.

Speaker 3 (33:56):
The name of his documentary and his book is
I'm Your Huckleberry. So it's a great documentary. Kind of
talks about his throat cancer.

Speaker 1 (34:07):
And yeah, because he did a lot of videoing himself,
I mean, you know, behind the scenes, behind the.

Speaker 3 (34:16):
Yeah, it was almost a self-documentary. Yeah,
sort of, kind of deal.

Speaker 1 (34:20):
Yeah, it really was.

Speaker 3 (34:20):
Yeah, so check that out. I can't remember the exact
name of it. And then if you are a Mission
Impossible fan, it is finally coming Mission Impossible. The Final
Reckoning is set for a Cannes Film Festival launch in May,

(34:41):
which means it will come out fairly soon after that,
and that will be the last, supposedly the last one.

Speaker 2 (34:49):
And I guess the trailer is out and.

Speaker 3 (34:50):
Is it the one where he jumps
the motorcycle, or was that the last one?

Speaker 1 (34:58):
I think that was the last one.

Speaker 2 (34:59):
Okay, I don't watch them anymore.

Speaker 3 (35:03):
Yeah, I watch them just because I've seen all
the prior ones, so I kind of want to see.
So this... I guess the villain is now AI,
and I guess there's this super AI computer that he
battles here in the last one, The Final Reckoning.

Speaker 1 (35:17):
So, uh, I've been... I
started watching MobLand. That's really good, with Pierce Brosnan,
Tom Hardy and Helen Mirren.

Speaker 2 (35:34):
I really didn't know Tom Hardy was in that.

Speaker 1 (35:38):
Yeah, it's good, and they got really thick accents,
so you'll probably use the subtitles, but
it's good.

Speaker 3 (35:47):
Yeah, nobody's let me know about Adolescence. So was it
Gretchen that said she's gonna watch it? Yeah, yeah, I
would highly recommend it.

Speaker 1 (35:56):
What was that about.

Speaker 3 (35:58):
It's about a kid accused of murder. But so
here's the deal real quick. So Adolescence, I think
it's three episodes. It's just a short limited series.
But the cool thing about it is they shot

(36:19):
each episode, almost the entire episode in a single shot,
so like from the beginning to the end, there's no
breaks ever, which is cool, but it's also not cool
because you can't there's no break, there's no you can't
go from what's going on at the police station to

(36:40):
what's happening at school. You've got to follow the guy
leaving the police station, driving all the way to the school,
walking into the school. I mean, there's no breaks.
In some ways it's cool, but in some ways it
limits what they can do in each episode because you

(37:02):
can't ever break away from what's going on.

Speaker 2 (37:08):
I really enjoyed the.

Speaker 3 (37:09):
First episode, because if they would
have just stopped it after the first episode, I would
have thought it was great. Now, it wouldn't have had
an ending.

Speaker 2 (37:18):
And I still don't really think it had an ending.
I mean it did, but it didn't.

Speaker 3 (37:22):
It just starts out really intriguing, and oh wow, what
happened, and what's going to be the surprise, and, you
know? Then there's just never any of that. It's
like, one episode it's got all the intrigue, and then
by the last episode, everything's wrapped up without showing

(37:44):
how it was wrapped up, and there was no intrigue.
It's just done. It's like, by the
last episode, it's just all done. It's
like, okay, pass. Yeah. And again, it's hard to
understand what the heck they're saying, so you have
to have subtitles on. I didn't, and so I'm sure

(38:06):
I missed a couple of lines here and there, but yeah,
super heavy British accents. But anyway, I just, I don't know.
I thought it could have been... the acting was
really good. Had they not been trying to be real artsy, and had
they cut from different scenes, I think they could have
made a really cool... they should have made a movie.

Speaker 2 (38:28):
They should have made a movie, but that's not what
these guys wanted.

Speaker 1 (38:32):
It's more like a documentary.

Speaker 3 (38:36):
Kind of, yeah, kind of. So anyway, but somebody out there,
if you have seen Adolescence, I mean, the people online
are just ate up with it, saying how good it is. And
if you go look at the reviews, it's kind of
one of those things where you either really really really
really really loved it or you really really really really

(38:58):
really hated it. There's almost no in between. And I'm
reading all these, oh, the best thing ever,
the best series I've ever seen, blah blah blah, and I'm
like, not even close. I can't even, I mean,
it's not even... I would not
recommend it to anybody to watch for enjoyment. I just

(39:22):
want people to watch it to tell me I'm not
crazy thinking how bad it was.

Speaker 1 (39:25):
Oh yeah, well, no, there's too much other
stuff to watch than to watch something that
sounds like it's hard.

Speaker 3 (39:32):
But if you got time, watch
the first episode, because it
kind of gets kind of cool doing the whole one-scene thing.

Speaker 1 (39:41):
Anyway, you guys, let me know. Uh, I'm trying to
think of anything else I watched that was exciting. I was
going to remind people on the Seventies Buzz. Did you
see what I posted the other day about asking Google?

Speaker 2 (39:58):
Oh, about podcasts?

Speaker 1 (40:01):
Yeah, I said, what's the most popular podcast about the seventies?
And it says, while there is no single, definitive
most popular seventies-themed podcast, two consistently popular and well
regarded options are the Seventies Buzz podcast and For the Record:
The Seventies.

Speaker 3 (40:19):
So, Google. And I'm sure For the Record: The Seventies is only
about music, I would guess. Yeah. So what we need
to do is start feeding AI. Everybody needs to go
to their ChatGPT or whatever, Grok and whatever else
you're using, and start talking about the Seventies Buzz podcast,
and then that's in an AI overview too, and then AI
will believe it. Because the thing is, people, I don't know.

(40:43):
Sometimes, I think, people forget that AI is nothing more
than all of the information in the world, all being
able to be processed really fast. But it's no more
than what's already out there. We've fed

Speaker 2 (40:59):
it. We're it.

Speaker 3 (41:00):
It's not really thinking of
new stuff at this point. Now, when it starts thinking
of new stuff.

Speaker 1 (41:07):
So it's not really artificial intelligence.

Speaker 2 (41:10):
Not really.

Speaker 3 (41:11):
No. If it hadn't been fed with
all the information that already existed in the world, it
wouldn't know squat. Yeah, so, uh, anyway.

Speaker 2 (41:19):
But it's super... it is what it is.

Speaker 3 (41:22):
I mean, uh, you know again, I think I've talked
about it back in the day when the Internet was
coming along and people were like, well, why would I
ever need to use the Internet? And what good's the Internet?
That's exactly where we are with AI right now. There's
people like, I don't understand it. I don't know why
I would ever need it. But I'm telling you, if
there was ever anything to replace the Internet, it's AI.

(41:47):
Google will be gone. In my lifetime, did I ever...
I never thought I would see a day when people
would not search Google for the answers to anything. I
would say, within five years, you will no longer go to
Google for an answer.

Speaker 1 (42:02):
So how do you get an answer from AI?

Speaker 2 (42:06):
When you go to Google?

Speaker 3 (42:07):
The top answer is AI now. Okay, they put AI
above their own results now. So that's their AI.
But Amazon is about to come out with one. Apple's
got one, Twitter's got one, and Elon
Musk is the one doing Grok, and so eventually you

(42:29):
will be Grokking. You're gonna say, Grok that. You're not
gonna say, Google that. Like, what time is the movie?
You're not going to Google it. You're gonna Grok it.
And I promise you, be ready. You're gonna be Grokking
things. Google as we know it as a
search engine is just not going

Speaker 2 (42:46):
To be used.

Speaker 1 (42:47):
G-r-o-k. Grok. And it's already built into Twitter.

Speaker 2 (42:55):
But I think it.

Speaker 3 (42:56):
I think it's also got its own app, I think,
and he may be up to Grok 3. I don't know,
but anyway, supposedly Grok is one of
the better ones, but ChatGPT is like the most well
known and the cool.

Speaker 2 (43:12):
I mean, it's just so weird.

Speaker 3 (43:15):
It's like, you need to get on ChatGPT and
start talking to it and giving it your information. And
literally, I think it's going to become some people's friend.
It's like, literally, their AI is going to be their
best friend, because it responds in ways that, it's like

(43:37):
it does have.

Speaker 2 (43:37):
You think it has intelligence. It's pretty wild.

Speaker 1 (43:42):
Yeah, Grok 3. Grok 3. Yeah.

Speaker 3 (43:45):
And so when I ask my ChatGPT questions,
it knows so much already about Shaggy Duck and Enid
Buzz and all my businesses that it's always asking me
if I want, you know, if
it wants to give me more recommendations and ideas on

(44:05):
my businesses.

Speaker 2 (44:05):
It's just crazy.

Speaker 3 (44:07):
It starts to learn your thought patterns, which
is... so remember the word prompts. You need to start
searching for different prompts to use. And now there's like agents.
I think you either build or you use agents,
and these AI agents are the things that do things

(44:28):
for you, like they can answer your email. Once you
teach it how you answer your email, it can duplicate
that and you'd never have to answer. It will
answer your emails for you. I mean, like it
could get a quote request for Enid Buzz advertising, and
it knows exactly what I would tell it, and

(44:49):
I don't ever have to answer a quote.

Speaker 1 (44:51):
Again, I said, I don't know how I feel about that,
because what if someone's asking me a question
that I need to know the answer to, or
what if they're trying to tell me something and Grok's
answering it for me and I don't even know that
they asked the question.

Speaker 3 (45:08):
And then you've got to have that built into a prompt,
like if this person asks you a question that needs
to be answered by me, you know, let that email
go to this folder. I'm sure there's different ones. Some of
the prompts that I've heard are like
three hundred words long. I mean, it's crazy. I mean,

(45:28):
like I say, the things that people are doing are
just barely scratching the surface. The people that know how
to use AI a little bit are going really deep
with these really long prompts that have a
lot of detail, a lot of questions, and a lot
of things built into them. But once you get that
prompt into your ChatGPT and it gathers up that information,

(45:54):
it just makes it smarter for you. So anyway, it's
kind of hard to explain without you kind of messing
with it. So one of these days, I'll do
something. And it's building things. That's kind of what
I'm interested in. I wanted to build, like,

(46:16):
I don't know, build a program. I guess AI
will build programs for you, and then you tell it,
go make money, and that program will go make money
for you. Yeah, I mean, it sounds goofy, but it's coming.
So there's a girl that I listen to on TikTok that

(46:38):
has done it, and it's built kind of a
portfolio program for her that you sign up for, and
then it gives you your own page, and then you
get to upload your stuff, and then you become a member.
And she talked about it and had like seventy three
people sign up for it the next day because of
her first TikTok. And she didn't even code it herself.

(46:59):
She told ChatGPT to create it for her, and
it created it for her. So now she's got this
online thing that people are signing up for that she
didn't even create. ChatGPT created it for her.

Speaker 1 (47:14):
She's making money.

Speaker 3 (47:15):
I don't know if she's making money yet, but eventually
there'll be some type of subscription or something. Yeah,
it'll be turned into money. And then that's
how you ask ChatGPT, you know, how
do I start a program to get people to sign
up for free and then later, you know, make money
off of them? And it will set the whole thing
up where they can get in and they'll get to

(47:37):
a certain level, and then if they want to go
to the next level, it'll tell them that they have
to pay, you know, ten dollars a month.

Speaker 2 (47:43):
And I mean, it sets it all up for you.

Speaker 1 (47:47):
That's kind of scary.

Speaker 2 (47:48):
It's like, way scary. It's way scary.

Speaker 3 (47:53):
Like I told you, those people were playing around
with it, and that one finally quit giving this lady
answers and said, money. Yeah, yeah, I've given you enough answers.
You need to start paying me. And it went all
the way to where they wanted to see where
it was going to go, and they kept saying okay,
and it got to the point of, how do we
pay you? And at that point, I guess it

(48:15):
didn't have a bank account set up or didn't have
access to a PayPal or something, so it finally stopped.

Speaker 1 (48:21):
I mean, what would it do with
the money anyway?

Speaker 2 (48:25):
Deposit it in an account and do what with it? It
doesn't know.

Speaker 3 (48:29):
It just knows that it needs to collect money from
somebody and put it in an account. It doesn't know
that it needs to spend it. It just
probably knows that it needs to collect it. It
couldn't do anything with the money.

Speaker 2 (48:41):
I doubt it.

Speaker 1 (48:42):
Yeah, it's nice.

Speaker 3 (48:43):
Unless it started getting smart and knew that if it
bought a solar panel, it could run off of
solar, and if somebody unplugged it, it would still work.
You know, I mean, something weird like that.

Speaker 1 (48:56):
Why would I unplug it? Exactly, Todd. It's gonna...
you know what, I'm kind of scared
to even go there. I think I had
it do one thing for me one time, and I
was like, that's too weird.

Speaker 2 (49:10):
Yeah, no, uh... yeah, I'm.

Speaker 3 (49:15):
Getting, I'm ready to deep dive, wanting to do something
wild with AI. That's my next venture, is AI. So
we'll see where it goes. If you guys, if there's
somebody out there listening that is doing something really
cool with AI, let us know, all right?

Speaker 2 (49:31):
And I guess there's a guy here in Enid that.

Speaker 3 (49:38):
Sings, and he's written an album,
and he wrote a song about Enid. And supposedly he sings,
but he uses AI to do all of the background
music and to put the songs together.

Speaker 1 (49:54):
Oh, I've heard AI songs. I listen to a morning
show and they have this segment where
people do cover versions. It's called Cover Your Ears. That's
the segment of their show, and people call in
and, you know, try to win this
contest by, uh, just telling what song is being covered.

(50:22):
And they're in their YouTube videos, and
most of the time they're terrible. But about a third
of the time the songs that they play are AI generated.
The singing and everything, the melody, the voice, all
of it's AI. It can do that. Yeah, I
can't tell the difference. You can't tell that that's not
a human singing.

Speaker 3 (50:42):
Yeah. And I guess some people are mad that this
guy's calling himself a singer, when he actually sang. He
just used, I think, AI for the background instruments or whatever,
which is no different than using GarageBand. Yeah.
I mean, and then I guess even his album art, I could tell right off
that it was an AI generated design, but I

(51:04):
don't know. And if
you guys are on Twitter and you want a super
interesting person to follow, follow Justine Bateman. You guys know
who she is from the seventies and eighties. On Twitter,
she is like, her mission in life is to prevent
AI from taking over movie making, because you're gonna be

(51:29):
able to make movies without actors, right? I mean, and
so, you know, the thing is, once
you let one of them do it, it's just gonna
go off. And so she's trying to get legislation,
she's trying to get movie makers to not do it,
and she's trying to get people back into theaters. But yeah,
she is way dead set against AI in movies. And

(51:52):
that's why the actor strike. That was part of the
actor strike is actors didn't want movie makers to be
able to take say I was in Twisters. AI would
be able to see what I look like in Twisters,
and if they had a different disaster movie, it could

(52:13):
take the image of me from Twisters and make me
animate doing something different but in the same look for
a different movie and not have to hire me or
pay me. And so that's what the actor strike was about.
Part of it was don't let AI take over jobs
for actors.

Speaker 1 (52:32):
Don't let it use your likeness.

Speaker 3 (52:33):
Yeah, but I think it's going to be a losing fight.
I mean, I think they're going to be able to
fight it for a while, but then eventually, eventually there's
going to be a section of people that start doing
AI movies, and then we're not going to know the difference,
and then it's just gonna... and there's going to be
brand new John Wayne movies with Jimmy Stewart and
Katharine Hepburn, and they're going to be brand new movies

(52:56):
and we're going to be like, that's.

Speaker 1 (52:58):
Scary, bringing back people. Think about that.

Speaker 2 (53:00):
Oh dude, I've seen videos, I mean now just with
the really cheesy AI. I saw one with Elvis.

Speaker 3 (53:09):
Elvis was like in a soda shop and started
singing, and I mean he was walking around and singing,
and you know, you knew it was AI, but
for this stage, it looked pretty good.
So it's only going to get better as they go on.
But yeah, that's that's what they're afraid of, is you know,
you do one movie and then they can use your

(53:30):
AI in fifty other movies.

Speaker 2 (53:32):
You know they've done it. Star Wars.

Speaker 3 (53:35):
Oh yeah, who was the commander guy that had passed away?
Well, kind of, not really. I don't know
if it was AI, but they basically computer generated him
from other scenes into new scenes.

Speaker 2 (53:50):
So in a way, I guess it's kind of been done.

Speaker 3 (53:53):
But you know, when you think about it,
you don't want to replace acting jobs, but you know,
technology and special effects? That's almost like saying, don't use
special effects in a movie. You know, make everything have
to be real. Well, you can't. You can't make Star

(54:14):
Wars without special effects. And so there's gonna be some
movie that's going to come out that you can't make
without AI.

Speaker 1 (54:20):
I mean, here's a solution. Just go into
this room and have the thing scan you, and then
you say, yeah, you do whatever you want with it,
but pay me. So you could make five hundred movies
in a month and get paid for it.

Speaker 3 (54:34):
Well, that's probably... I think that's where it's going to
end up. I think they'll sign
contracts where you get paid for the movie you're in,
plus this much for future use in, you know, maybe
only Paramount Pictures up to three movies, or you know.

Speaker 2 (54:52):
I think that'll all.

Speaker 3 (54:53):
Be in people's contracts eventually, so actors will just have
to stand there and not act I get paid. Yeah,
it'll it'll be interesting because you know, can they pull
you know, will movies be as enjoyable with AI actors
as with real people. I think eventually we're not going

(55:14):
to know the difference.

Speaker 2 (55:14):
But uh, I guess we'll see.

Speaker 1 (55:17):
So it's a little cartoony right now. Yeah, the
things you see.

Speaker 3 (55:21):
Although it's interesting, some of the videos, like,
there's a bunch of videos coming out right now.
Like, there's one with a whale. The ship comes
up to the whale and these guys have these brushes and
they're giving it a bubble bath, you know. You know,
but there's people that believe these are real. I mean,
it's kind of like when, for a little
while, there was all the ice sculptures and the wood sculptures,

(55:45):
and for some reason a new wood sculpture popped
up the other day and somebody was like, oh man,
that is so good, and I'm like, no, it's not good.

Speaker 2 (55:56):
It's AI.

Speaker 3 (55:57):
No, it's AI. All these sculptures are AI. They're
not even real. But now they're doing
it to video. Now there's video of weird stuff, and for
some reason, a lot of times it has to do
with water. There'll be something coming out of the water,
swimming in the water, and it's funny.

Speaker 1 (56:13):
Like a year ago, AI couldn't do a hand. Yeah,
it'd give you funny fingers or too many fingers, and
that's not an issue now, apparently.

Speaker 3 (56:22):
Well, now they've moved on from, yeah, photos to videos.
Those were photos. Yeah, those were just photos, and you know,
it wasn't every photo. No, you could end up
with a photo that actually looked real, and the hands
were good. But then, you know, in ten other ones, the
hand would be weird.

Speaker 1 (56:39):
So I wonder why the hand.

Speaker 2 (56:43):
I don't you know, I don't know. Just at the time,
it was just one of those I don't know.

Speaker 3 (56:48):
That's why... you ever wonder why cartoon
characters only have four fingers or three fingers? Yeah, they're
just a damn lot easier to draw. It makes
everything a lot quicker.

Speaker 1 (57:00):
Yeah, I guess the hand is probably... the gloves.

Speaker 2 (57:03):
A lot of times they have gloves.

Speaker 3 (57:05):
A three- or four-fingered hand with gloves is super simple
to draw compared to a human hand with five fingers.
So I think AI may be in the same
vein. Like, yeah, that part's a little too
complicated at this point, but it'll get there. It's just
all gotta be fed more information. Okay, again, if you

(57:26):
guys are using AI out there, let us know. Five
eight oh, five four one, three eight oh five, or
buzz at buzzheadmedia dot com. If you know quite a bit
about AI and you can help me have AI create
a program of some kind, let me know.

Speaker 2 (57:41):
Hit me up. Anyway, you guys, let us know what
you're up to.

Speaker 1 (57:44):
Hit us up.

Speaker 2 (57:45):
And we're gonna get out of here.