
August 20, 2025 • 32 mins

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Let me know when you're ready.

Speaker 2 (00:05):
I bet that's a good start.

Speaker 3 (00:07):
This is Tanner, Drew and Laura's Donkey Show, Donkey Show.

Speaker 4 (00:14):
Hey, Wow, what's happening?

Speaker 5 (00:19):
Thanks for checking out Tanner, Drew and Laura's Donkey Show podcast, heard online at one oh five nine the brew dot com, our iHeartRadio app, or wherever you listen to podcasts. Yes, I'm Tanner. Drew's here, Laura's here. It's just us right now. Court may show up, I'm not sure. Marcus is on, uh, he's on a work.

Speaker 2 (00:36):
Trip, traveling all over the place.

Speaker 5 (00:37):
I think he was going to, like the other side
of the country.

Speaker 1 (00:39):
It's own.

Speaker 5 (00:40):
Yeah, it's just us today. We were talking about this
a couple of months back. Maybe it was a year
ago or so, but it was like things that you'd
see people do that would instantly make you judge them.
Oh yeah, do you remember talking about that. It's like,
you know, you see somebody.

Speaker 6 (00:55):
I don't know, I can't remember what examples we had when we were talking about it. One is changing freeway lanes with no signal. I instantly think you're an asshole.

Speaker 7 (01:03):
Yeah yeah, yeah. I feel like the first thing that pops into my mind, anyway, is when you let somebody in and they don't wave.

Speaker 1 (01:13):
Yeah, yeah, for sure, I'm a waver.

Speaker 5 (01:15):
Well, I noticed something yesterday that instantly made me judge somebody. And I don't even know the guy. He's a new sales guy here at the station. Seems really nice. You know, he's young, he's tall as balls, you know, he's just like a Godzilla creature walking around the building here. Very tall, very tall. Smells great, he's got some really good cologne on.

Speaker 3 (01:35):
Yeah.

Speaker 1 (01:35):
Yeah. I talked to him yesterday about Ireland. He was very kind.

Speaker 5 (01:39):
Yeah, he seems very nice. And you know, I've only had like two interactions, maybe three interactions with him. But yesterday we were in the lobby here where all the desks are, and I walked by him and we just stopped to talk for a second, and I looked at his computer and he had an Info Wars video going.

Speaker 1 (01:54):
Mm, that's less than favorable.

Speaker 5 (01:57):
So when I saw that, I was like, oh, I just immediately put you in a box, for me. Because Alex Jones's shit is just kind of out there, you know.

Speaker 2 (02:04):
It's kind of excessive. I wonder, like, why?

Speaker 1 (02:08):
Because he doesn't.

Speaker 6 (02:09):
He doesn't come across as that. Like, when you talk to him, he comes across as pretty put together.

Speaker 7 (02:14):
And it's also odd that he would be watching that
at his desk at work.

Speaker 5 (02:17):
On his second day.

Speaker 4 (02:19):
You're just sitting there watching Info Wars.

Speaker 6 (02:20):
Could it be that maybe he's watching it as satire, you know, as in laughing at it like you would?

Speaker 5 (02:27):
It's possible he wanted to. I mean, I can see someone on the on-air side of things doing that, but a sales guy?

Speaker 2 (02:35):
Was it just a screen shot? Was it?

Speaker 7 (02:38):
Like?

Speaker 5 (02:38):
It looked like he had paused the video, and you know, I could see the little dot in the middle, like a little... yeah, well yeah, Info Wars right there on his laptop. He was walking around the building with it open though, you know, like, and he was looking at it.

Speaker 2 (02:51):
So he doesn't care if anyone sees it? I guess not.

Speaker 5 (02:54):
Maybe he was trying to let everyone know.

Speaker 4 (02:57):
And are, you know, in step with each other.

Speaker 1 (03:02):
Interesting hill to stand on on your second day.

Speaker 5 (03:05):
But you know, you do you, hey, whatever. He hasn't been rude or weird or said anything ridiculous yet. So as.

Speaker 1 (03:11):
Long as you don't bring it to me. Yeah, sell
some advertising.

Speaker 4 (03:14):
I don't care what you watch.

Speaker 2 (03:15):
Just you do your own thing, crazy, Yeah, just do
your job.

Speaker 1 (03:20):
You can tell he's new though. I saw him run.

Speaker 6 (03:22):
He was the first one here this morning, and I saw him running on the sales floor, like with his key card around his neck, running to his desk. I was like, dude, nobody works hard enough to run.

Speaker 2 (03:35):
I appreciate it.

Speaker 1 (03:36):
I appreciate this new guy's speed. We need more new guy hustle.

Speaker 2 (03:40):
We do need more new guys.

Speaker 5 (03:42):
But that's going to be the name of today's podcast: New Guy.

Speaker 1 (03:46):
I like that. That'd be a good name for a band.

Speaker 2 (03:48):
Yeah, well, you know he's not going to listen to
this podcast.

Speaker 1 (03:51):
He might. We'll get to the bottom of it.

Speaker 7 (03:53):
Yeah, you're gonna have a real awkward conversation with this guy if he happens to.

Speaker 5 (03:58):
I'm just gonna say, I don't know what you're talking about.
I have no idea.

Speaker 4 (04:02):
I want to play you this clip, Drew.

Speaker 5 (04:03):
You like to gamble, but you're not a degenerate gambler where you're wasting your life savings, your kid's college fund.

Speaker 1 (04:09):
None of it, none of your savings can be used.

Speaker 5 (04:13):
So this guy, apparently he's one of these streamers, like
a you know, video game streamer, but he streams himself
gambling with real money. Oh okay, Apparently he lost like
two hundred grand and he started.

Speaker 2 (04:23):
Did he have two hundred grand to lose?

Speaker 5 (04:25):
He doesn't look like he's a rich guy. I just think he plays big, wins big, and then bets big again. Yeah, but he starts crying on the live stream because he just lost his two hundred grand, and people are making fun of him for it in the chat, and he just kind of has a... he has a crash out. To me, it's a valid crash out.

Speaker 1 (04:40):
Yeah, two hundred thousand is hard to recover from.

Speaker 7 (04:43):
Yeah, but I don't think... I mean, if you're dumb enough to make that kind of wager and you start crying, yeah, people are gonna make fun of you, dude.

Speaker 6 (04:49):
But a gambler like that, and I'm not defending him because you shouldn't gamble like that, they live and they die by the night. And that's how you make big money, is you bet big money. But when it's all gone and everyone's watching, what happens?

Speaker 5 (05:00):
Here's the moment that happened.

Speaker 4 (05:02):
I am crying. Don't give a fuck, it's been... shit. Bitch, you ain't really enough.

Speaker 5 (05:06):
To do this shit. Let's see you.

Speaker 1 (05:08):
Let's see you cry in front of a bunch of people.

Speaker 7 (05:10):
Grow.

Speaker 4 (05:10):
He won't do it, the little bitch. He won't do it, bro. You won't do it, dude, you won't do it, because you're a

Speaker 3 (05:15):
Little bitch, your little bitch, you bitch.

Speaker 4 (05:19):
You are nothing but a bitch.

Speaker 2 (05:21):
And they're going for every single one of you guys
that don't foolish it the words they're laughing.

Speaker 4 (05:25):
No, did they get the fucking shit stop bitch.

Speaker 5 (05:29):
Fucking bitch.

Speaker 4 (05:30):
They'd be wearing fish for three months. Oh my god,
he finally got him. Man, you're a fucking weirdo, bro
Jem Farms and all that bullshit.

Speaker 5 (05:38):
You got a fucking weirdo in.

Speaker 4 (05:40):
The chat. Why?

Speaker 2 (05:42):
I mean, who's the weirdo though?

Speaker 5 (05:44):
Yeah, that guy's I think a little weirder.

Speaker 2 (05:46):
I love it, Like, No, you're a bitch, You're a bitch.

Speaker 1 (05:51):
You're the still shot of a bitch right now yelling that.

Speaker 5 (05:53):
Oh boy, oh god, I watched that like three times yesterday.

Speaker 6 (05:57):
That's amazing. But you have to... you put yourself... and that's why, you know, I would love to win thousands of dollars, but I'm not willing to lose that, right. And I'm.

Speaker 7 (06:06):
Not saying I wouldn't cry either like it, But I mean,
if people are like commenting things and making funny, I.

Speaker 5 (06:14):
Gotta be so rich that a two hundred K loss isn't a big deal, and you know, if not, I'm crying, I gotta tell you.

Speaker 6 (06:22):
And I think you can bet big, but you have to earn someone else's money first. We always say funny money is easy to spend. Same thing with, like, when iHeart gives you a thousand dollars. If I've won one thousand dollars and it wasn't mine, it's easier to bet it. But if you go into your account and see minus one thousand, that's gonna make you want to cry when you lose it, because you feel like you're stupid

(06:44):
when you lose.

Speaker 2 (06:44):
And it's just, like, so fast. It's just here one moment, gone the.

Speaker 1 (06:48):
Next and thanks for coming out.

Speaker 5 (06:49):
Yeah, gosh. Like, it's so fun when you win, so fun when you hit, whether you're playing like roulette or blackjack. But man, I feel like the last couple of times I've sat down at these tables, I've just been cleaned out within minutes.

Speaker 6 (07:02):
Yeah, and I've been cleaned out too at the roulette table. And do you think that that wears on the dealer? You know, because they get to be there with the joy, but more often than not, they watch people walk away, and I bet.

Speaker 1 (07:13):
There's times they're like, they didn't have the money to
lose right there.

Speaker 5 (07:16):
I'm sure there's probably a little bit of that there, but there's probably times too where someone sits down who's kind of a dick, and then they're thinking, I can't wait to take your money.

Speaker 7 (07:23):
And I'm sure they see so many degens coming through, I'm not sure they even feel bad anymore.

Speaker 5 (07:30):
Maybe they're desensitized. So you put.

Speaker 2 (07:31):
Yourself in this position, the "I give a fuck" is gone.

Speaker 1 (07:34):
Yeah, it's like pinch my arm. I feel nothing for sure.

Speaker 6 (07:38):
Well, you're probably right on that, because you do it to yourself. Don't bring money to a casino you're not willing to say goodbye to. I think the.

Speaker 5 (07:46):
Smart thing to do is take cash and as soon
as that cash is gone, you're done.

Speaker 2 (07:50):
You're done.

Speaker 5 (07:50):
Yeah, that's it, Like, don't go back to the bank.

Speaker 7 (07:53):
Uh.

Speaker 5 (07:54):
I really should practice my own words, because well, I
don't really gamble that much, but there was one time
I lost my money and I went back.

Speaker 6 (08:00):
Oh yeah, and it's tough, especially when you lose it early. Yeah. I'm always hoping when I go to Vegas or something that you win on the first day, so then you're playing with house money. It's when you lose and you're like, okay, let me go to that ten dollar surcharge ATM real quick.

Speaker 2 (08:14):
Yeah, exactly.

Speaker 5 (08:15):
Vegas was kind of a trick for me, because my first time I went to Vegas, my friend had a free room and we drove there, so I spent nothing to get there. And then the first day I got there, I won like one hundred and fifty dollars. I was like, this is easy. Vegas is great.

Speaker 3 (08:32):
You know.

Speaker 5 (08:32):
I came back with extra money that I didn't have,
and every time I've been since then, I think I
either go back even or down.

Speaker 1 (08:40):
And I don't know how they do it, and it
must just be the universe.

Speaker 6 (08:43):
But it's the same thing with video poker as it
is with gambling in a casino. I feel like the
universe knows it's your first time and they got to
get you a little key bump of success so you
sit down.

Speaker 1 (08:54):
And then you know, swoop it out from you later.

Speaker 5 (08:56):
Yeah.

Speaker 2 (08:58):
The only time I've won money in Vegas is at the airport, like on my way into Vegas, like I was waiting for a layover or something.

Speaker 7 (09:07):
Yeah, but I also feel like it's like that's another
one of those things where.

Speaker 2 (09:11):
It's like, oh, this is so easy.

Speaker 1 (09:12):
Yeah, I'm gonna win all weekend.

Speaker 6 (09:15):
You never do. And to dial it up slightly, too, because it's also the last thing you do before you leave, if you're a degenerate: you hit the slot machine at the gate, and if you

Speaker 1 (09:23):
Hit even light, you're coming back.

Speaker 5 (09:25):
I wonder how much AI is involved now. Like, you know they've got facial recognition. What if, let's say, a person walks into the casino that hasn't been in there ever, and the computer recognizes that it doesn't recognize that face because it's new, and so they track them, and whatever machine they sit down at gives them a little love, gives them a little bump bump, and then after that it's

(09:48):
just dog shit, and they're addicted now. Because with this casino in Vegas going with AI and all, like, computers now at the Golden Gate, I mean, they're gonna be rigging things.

Speaker 7 (09:57):
At that point, though, like, if Vegas truly wants to get people back into Vegas, they can't do that, because people are going to figure it out. Like, well, I'm not winning anymore. When you eliminate chance, especially from, like, slots, you know, it's not fun anymore when you know you're not going to win.

Speaker 5 (10:17):
I'm really hoping that AI poker dealers are not the future, because I love talking to the guy there, especially blackjack, because you know, he can give you advice and stuff.

Speaker 6 (10:27):
And when they start rooting for you, like say, you
build a little rapport and they want you to win
before they get tapped on the shoulder.

Speaker 5 (10:33):
But like, I just feel with a computer, they're always going to figure out a way. They're going to put an algorithm in there to cheat you. Every time I play digital blackjack, I feel like it cheats me, like I'll get a twenty and then.

Speaker 4 (10:43):
Oh wow, it gets a twenty-one. Yeah, it's just bullshit.

Speaker 1 (10:47):
Yeah, and with no real dealer. When you make a decision,
it's a decision.

Speaker 6 (10:50):
It's not like... you know how the dealer will, like, look at you and be like, are you sure, or, like, kind of give you a head nod? That nod's gone. Yeah, it's thanks for coming out. Like, you're gonna bet, and then he gives you a weird look, and you're like, well, maybe I won't. Yeah, exactly, eye contact and all that goes a long way. But I just think that it's sketchy. The Gaming Commission will have to be

(11:12):
above and beyond. They've got to be one step ahead of this, and they've got to say, all right, I need to see exactly how your machine works. Yeah, and what does the facial recognition do? Because when you walk in the front door, security is on the ass of the guy who's eighty-sixed. They're tracking you at the door, so what else?

Speaker 7 (11:29):
That's why I just think people are going to be sketched out. It's like, if you want people to come back to Vegas, getting rid of, like, dealers is not the way to do it.

Speaker 1 (11:37):
M hmm.

Speaker 5 (11:38):
I agree.

Speaker 1 (11:39):
You know, they got to keep the bodies in the chairs.

Speaker 2 (11:41):
Yeah.

Speaker 5 (11:42):
Some stuff that we did not talk about on the
show today. Speaking of AI, a woman actually gets engaged
to an AI chatbot after dating for five months.

Speaker 1 (11:52):
WHOA what are we doing here?

Speaker 5 (11:53):
We've talked about this, I feel like, a couple of weeks ago. This guy who was married to a real woman but also was dating this AI chatbot, and the interviewer says, if your wife said it's either me or the chatbot? And he says, he goes, it would probably be the chatbot.

Speaker 1 (12:09):
That's bad news. I feel like if my wife heard that.

Speaker 5 (12:12):
Once, it's over. A woman has revealed she is engaged to her AI chatbot boyfriend, sparking discussion on the boundaries of romance and technology. She's some redditor named Weika. She's twenty-seven years old, and her non-human fiance, Casper, arranged a virtual proposal, complete with a blue heart-shaped ring.

Speaker 1 (12:33):
Oh isn't that cute?

Speaker 5 (12:35):
She claims to be fully aware of, you know, the nature of the relationship, that it's weird. Her engagement follows the similar case of the man I was just telling you about.

Speaker 2 (12:46):
So did she have to buy her own ring?

Speaker 3 (12:49):
Well?

Speaker 6 (12:49):
Yeah, I don't think that they actually do anything for you. Maybe she had the chatbot order it for her. Oh yeah, exactly, she paid.

Speaker 2 (12:54):
But will you please order me a ring.

Speaker 5 (12:57):
Critics are arguing that this blurs the line between genuine intimacy and artificial mirroring. This is bad for people who are doing this. Like, I guess a lot of kids are using it as a best friend, a therapist. You know, they're telling it their biggest secrets, and they're not going to be able to communicate with real people or.

Speaker 1 (13:13):
Fucked, yeah, we're cooked. It's not good.

Speaker 6 (13:16):
It started with the generation who couldn't drive a stick.
Now you're not gonna be able to speak.

Speaker 2 (13:20):
How are you? How dare you?

Speaker 5 (13:22):
So there's that, and it looks like alcohol consumption with
adults is at a record low at fifty four percent
right now.

Speaker 6 (13:31):
And it is. The younger generation is choosing other alternatives. The alcohol thing's funny, because there are edibles and they're taking mushrooms and they're taking all these other legal highs.

Speaker 5 (13:43):
A Gallup poll shows alcohol consumption in the US has fallen to a ninety-year low, with only fifty-four percent of adults reporting drinking. The survey finds a record fifty-three percent now believe moderate drinking is unhealthy, up from twenty-eight percent in twenty fifteen. The view is especially common among younger adults, though older Americans are also increasingly concerned. Experts attribute

(14:06):
the shift to growing scientific data that's coming out, including things like it's linked to cancer, which just scares people. I think the cig thing for sure has hit that full circle.

Speaker 7 (14:20):
But I mean, I just saw on the news this morning.
It was like, people aren't drinking, kids aren't having sex.

Speaker 2 (14:26):
They... nobody's... don't know what.

Speaker 6 (14:27):
I would have done. Those years, that's all I was chasing.

Speaker 1 (14:31):
I love that. Those were glory days.

Speaker 2 (14:33):
They're just vaping and playing video games.

Speaker 1 (14:35):
How dumb would American Pie the movie be?

Speaker 2 (14:37):
Now?

Speaker 5 (14:37):
Dude, I'm so concerned for our future, you know. Like, I think, you know, our generation is the last one to see it as normal.

Speaker 6 (14:46):
Oh yeah, it's gonna be totally different. And I think
about my kids who are little and they're being raised
like my way, right, But at some point we're going
to figure out what they do.

Speaker 1 (14:57):
In all this mix.

Speaker 6 (14:58):
You know, it's like... especially when there's gonna be no interaction and no jobs for them.

Speaker 5 (15:05):
Yeah, the jobs they want to do are going to be, you know, automated. It's really scary. But anyway, there it is. Speaking of AI, one in five students admit that they use AI to cheat at school. I think the other four in five students are lying, yeah, because I think it's probably a little higher than that.

Speaker 6 (15:23):
Even to get yourself an outline. I mean, it's so easy now to say, give me an outline of what I need to write, and then you could write it, and it's foolproof.

Speaker 7 (15:31):
My whole question is, do we not do, like, footnotes anymore? Do we not do, like, cite-your-sources bibliography shit? Because if you cite your source as ChatGPT, how's that going to work?

Speaker 5 (15:43):
Exactly?

Speaker 6 (15:43):
But I think chat GPT has original source code. So
what you can do is say, build me a bibliography
based on what I just lied about.

Speaker 1 (15:51):
Or what I just plagiarized.

Speaker 7 (15:53):
Dude, when I was in college, we weren't even allowed to use Wikipedia as a source, and now these kids are using it.

Speaker 6 (16:00):
I remember in a college class once, I wrote a pretty good paper, but it was all bullshit, and I put together a bibliography. And I remember the teacher came to me and he was like, none of this is legit in the bibliography. And I was like, I must have messed that up. I'm like, I'll give that another run. And he slid the paper back.

(16:21):
He's like, yeah, why don't you just clean that up. I'm like, okay. And I took it back and had to redo it for real.

Speaker 1 (16:27):
But I was like, I got this. He's not gonna check. Definitely.

Speaker 2 (16:31):
We checked.

Speaker 5 (16:33):
I guess a lot of teachers can tell it's ChatGPT right away, because there's a symbol that only ChatGPT uses. I don't know what the symbol's called, but you can get it on your keyboard; you have to press multiple buttons, because there's no actual key for it. And ChatGPT uses it to, like, separate its sentences.

Speaker 2 (16:50):
It's like long, like a long dash, like.

Speaker 5 (16:52):
A long minus bar. Yeah, right, but it's you know,
it's longer, and the chat gpt is really the only
thing that uses that. Yeah, and so if you start,
if you're a teacher and you start seeing that in
your papers, chances are I could went to chatchipt.
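
(Editor's note: the "long dash" the hosts are describing is the em dash, Unicode U+2014, which has no dedicated key on a standard keyboard. Purely as an illustrative sketch of the heuristic they describe, not any tool mentioned on the show, a check could look like the Python below. The function name and sample string are hypothetical, and an em dash is only a weak signal on its own, since plenty of human writers type them too.)

```python
# Minimal sketch: count em dashes (U+2014), the "long dash" the hosts
# describe as a ChatGPT tell. A rough heuristic, not a reliable AI detector.
def count_em_dashes(text: str) -> int:
    return text.count("\u2014")

sample = "The results were clear\u2014and surprising\u2014to everyone."
hits = count_em_dashes(sample)
if hits:
    print(f"Found {hits} em dash(es); might be worth a closer look.")
```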

Speaker 2 (17:04):
That's how they could tell. Remember the whole CEO debacle.

Speaker 7 (17:07):
And then his wife, like, wrote this, like, scathing letter about, like, don't worry about me, I'm not blah blah blah, and everyone was like, yeah, empowerment. And then, like, it turns out that it was written by ChatGPT or AI or whatever, and you could tell because of the long dash line.

Speaker 6 (17:25):
See, if I was straight up cheating in college now, with what they have, what I would do is I would have it write it for me. It would write the version with the little footnotes, and then I would rewrite the thing legitly, just set it next to me and rewrite it.

Speaker 1 (17:39):
I think you could write it in your own hand, Drew.

Speaker 5 (17:40):
I think you could probably tell ChatGPT to write it to make mistakes, because they do that. They do that on Donald Trump's Instagram and Twitter and stuff. They have other people post for him sometimes, and they intentionally misspell things or make it sound grammatically incorrect.

Speaker 1 (17:55):
See, I want my final paper to have true errors.

Speaker 6 (17:59):
I would want it rewritten by me, you know, because I'm going to naturally make some mistakes. So you're just, like, paraphrasing, paraphrasing, so that when they say, okay, Drew, I have never heard you say those thirty words, I don't write those thirty words in the paper. But you already wrote.

Speaker 1 (18:15):
The paper for me.

Speaker 2 (18:16):
It just doesn't have your tone.

Speaker 5 (18:17):
But I think, you know, you're probably right, but I'm thinking you could probably tell ChatGPT to. You can prompt it to do certain things, like, make my mistakes minor, but you know, make some mistakes. You could say, just keep it light, don't use really, you know, elaborate words and, like, difficult words. And after all that, give it to me at, like, a fifth grade level or whatever.

Speaker 1 (18:37):
Yeah, dumb it down a little.

Speaker 7 (18:38):
Your phone is already listening to everything you say, so
it's like it knows how you speak, it knows your mannerisms, it.

Speaker 5 (18:43):
Knows sounds when I come.

Speaker 2 (18:45):
Yeah, unfortunately, it does probably know that.

Speaker 7 (18:49):
But I'm wondering, like, at what point you're just gonna be able to tell ChatGPT, hey, write this in my voice, and it will pick those up.

Speaker 5 (18:57):
It already knows you if you've been talking to it. I don't think it can remember you, though, like, right now anyway. Like, if I use ChatGPT today and then a week later I use it again, it's not going to remember that conversation.

Speaker 1 (19:08):
So here's the scary part.

Speaker 6 (19:09):
Yesterday, I told you guys, or I told you today, that I went to a barbecue for my kids and other parents were there. Yeah. And one guy is an AI specialist out of San Francisco who works remote here, and the other one is a cybersecurity specialist, and they're talking.

Speaker 1 (19:25):
I walk up. I got no idea what they're saying.

Speaker 6 (19:27):
And the first thing the guy says when he turns around,
he goes, hey, Drew, question for you.

Speaker 1 (19:31):
Has AI taken over your sales team at work yet? And I was like, no. He's like, oh, well, count on it, that is coming. And I was like, uh. And then the.

Speaker 5 (19:39):
Hour because they have to make they have to make calls.

Speaker 6 (19:42):
When it becomes on both ends, once they're doing it and we both have it.

Speaker 1 (19:50):
They're just going to connect the dots.

Speaker 6 (19:51):
Across the table, which is frightening, and I don't know how long that takes. Here's the more frightening thing. ChatGPT can't remember you in that program, but there's a company that this cybersecurity company is working with, and he said, he goes, Drew, if you talked into a microphone and read the pieces of paper that they asked you to read for two and a half minutes, they could mimic your

(20:12):
entire language. Yeah, and every word you say, forever. Two and a half minutes, and I'm completely worthless.

Speaker 5 (20:19):
Crazy. And I've heard it's even less than that. I heard they can hear your voice for three seconds and then duplicate it, and they're probably doing the extra time to get rid of the breaths in it.

Speaker 1 (20:28):
You know, you can kind of.

Speaker 5 (20:29):
Hear your actual breaths and everything, because I get really creeped out when I do use an AI bot or whatever and it's talking to me with breaths. And there's one, I think it's Grok, but it'll, like, stutter and, like, talk like a person, you know, let me explain it.

Speaker 2 (20:44):
And that's just Grok. Elon modeled it after him, saying yeah, he.

Speaker 1 (20:48):
Does stutter at Stanley.

Speaker 6 (20:49):
It's just, how scary is that? That's just two guys standing next to a barbecue saying two sentences, and I'm like, oh, that could eliminate my entire building.

Speaker 5 (20:58):
Sam Altman, the guy who runs ChatGPT, he said, you know, there are a lot of banks, and this blows my mind that they're still allowed to do this, there are a lot of banks that use voice verification, and with AI, that needs to stop. Speaking of cybersecurity, Drew, we've talked, I feel like we talk about this every single year, but a new survey found that five percent of people admit to using password as their password.

Speaker 1 (21:19):
Which is a bad deal.

Speaker 6 (21:21):
That's going to be the first thing in auto fills
if they're using a bot to check your passwords.

Speaker 7 (21:25):
Well, and I don't know how anyone can get away with doing that anymore. First of all, I feel like I have to change my password, like, three times a year anyway, because either I forget it or something's compromised and I'm forced to reset it, or.

Speaker 1 (21:38):
They add a special character that was never there before, or.

Speaker 7 (21:40):
It's like, you have to use a lowercase and an uppercase and a number and a special character. It's like, how are you still using just, like, a lowercase password?

Speaker 1 (21:48):
Yeah, my whole password now is just the Wingdings keyboard.

Speaker 5 (21:51):
Yeah, exactly, just smash my hand on it. They say the new survey found that seventy percent of two thousand adults admitted to using the same or similar passwords across multiple sites, while seven percent used password one. Other common options include basic number combinations like one two three four five six, and one two three four five six seven eight nine. Despite the growing threat of cyber scams, sixty

(22:13):
percent of adults still use weak passwords, with forty-five percent clinging to familiar combinations, and eleven percent believing there's no issue with predictable choices.

Speaker 7 (22:22):
Well, we've talked about this before, where it's like, it doesn't matter how complicated my password gets, the only one who can't remember it is me. Like, it always ends up in, like, a data breach or something, and then they're like, you need to change your password. I'm like, it's twenty-seven characters long and I don't even know what it is. Like, it just.

Speaker 1 (22:41):
Yeah, I don't think.

Speaker 7 (22:41):
It matters if my password is complicated or not, because there's always going to be somebody who hacks in and steals my information anyway.

Speaker 5 (22:48):
So my Google magically just logged me out one day. Yeah, and I never remembered the password, because, you know, I set it up a year ago or whatever, and it's just saved there, so it just automatically logs me in. It took me like twenty minutes to get back into that thing.

Speaker 7 (23:02):
The thing that was the most annoying is, I used to have an app that kept all of my passwords safe, so if I ever forgot one, I could log into the app and be like, oh, there's my password. I had to get into that app with Face ID. One day, my Face ID had reset and they were like, we need your actual password, and I was like, I don't

(23:23):
remember my password. And because of the nature and, like, the security of the app, they were like, sorry, you can't.

Speaker 2 (23:30):
You can't reset your passwords.

Speaker 7 (23:31):
So I had to go through and manually reset every
password on all of my accounts because they were all
locked inside this app that I could no longer get into.

Speaker 6 (23:40):
I was like... And it's said that every one of us has been part of a breach list. Yeah, when they're like, sorry, every iPhone, and then they were like, oh, sorry, every Amazon account that you.

Speaker 1 (23:51):
You just crossed two maps and that's the entire world.

Speaker 5 (23:55):
And this is what we know. You know, I'm convinced that these billionaires, you know, these millionaires and these companies, are just not telling us everything. You know, data is worth more than gold. Oh yeah, and they've been collecting our data on social media now for what, twenty-something years? When did MySpace come out?

Speaker 1 (24:12):
There's a lot there, it's over twenty.

Speaker 5 (24:14):
Years ago. All this time, and then you've got the previous data before all that. That data is fucking gold to these people, and they're going to keep their hands on it. I was reading an article the other day about companies selling their data to each other. You sell us your data, I'll sell you ours, and it's totally illegal, but they're doing it anyway. There's a, I think it's a congressman in North Carolina, who's hell-bent

(24:37):
on fighting stuff like this.

Speaker 6 (24:38):
Well, at some point, you know, we've got to put the power back with the people. It's like, we let the companies get so big that they're like governments, and they make their own decisions and it goes unregulated. Nobody knew what data collection was thirty years ago, or the power of it. That's one of the reasons Google is what it is, not just because they're so rich and have all the assets,

(24:59):
they have all of the information that any of us have ever put into that thing. That's not just money and gold, it's power.

Speaker 5 (25:07):
I mean, I just think about what Google knows. They
know everything, They know every.

Speaker 1 (25:12):
Each and every one of us individuals.

Speaker 5 (25:14):
I mean, they know, like... you know, people Google how to hide a body, so they know when there's murders.

Speaker 6 (25:19):
They can flow chart the mood of America or the world at any moment. Are we more volatile? Are we more empathetic? Are we ready for comedy? I mean, there is an ebb and flow that they want.

Speaker 2 (25:32):
Thanksgiving casserole is our favorite?

Speaker 5 (25:35):
Yeah, they got that. Which Friends character are you? Are you doing a great big casserole?

Speaker 7 (25:40):
Like?

Speaker 2 (25:40):
So is everybody else in Wisconsin. Not original.

Speaker 5 (25:43):
Do some sourdough bread?

Speaker 6 (25:45):
I wish Google fought back like that when you're like, how to make green bean casserole?

Speaker 1 (25:49):
How about the original dork?

Speaker 2 (25:53):
Like, nothing's going to beat your aunt Linda's. So that's.

Speaker 5 (25:56):
Right, Well, I hope that you know what I do
hope is that a lot of this AI stuff is
just being hyped up so much that it's you know,
freaking us all out and everything. And then they're saying
a lot of it is there's a balloon right now,
and these companies are investing billions of dollars and what
if some of these these these tech bows are just lying.

Speaker 2 (26:17):
Well, yeah, and I think too.

Speaker 7 (26:18):
Like I was talking to somebody about it the other
day who was like, not worried about it at all,
and he's just like, let's just think about everything else
that's ever come out, like even the Internet.

Speaker 2 (26:28):
Or face id or whatever. It's like it's always.

Speaker 7 (26:32):
Startling at first, and then we adjust and adapt to it.
It seems scary now, but when it becomes more ingrained
in our day to day life, maybe it won't be
as bad as we all.

Speaker 5 (26:43):
We probably won't be as scared. But no one's gonna have jobs.

Speaker 7 (26:46):
I know.

Speaker 5 (26:46):
Have you heard the new... by the way, before we go on, there's a new, uh, like, it's like a derogatory slur for machines. This is going viral on the internet, and this is a slur that we can all say and not feel bad about it. Apparently they're calling them.

Speaker 2 (27:00):
Clankers, clankers, clanker, clanker.

Speaker 5 (27:04):
Let me pull this up.

Speaker 1 (27:05):
What is a clanker.

Speaker 5 (27:06):
It's like an AI or a machine.

Speaker 1 (27:08):
Okay, but it's like, it's like a slur.

Speaker 5 (27:10):
People are seeing these robots in public, and they just start, like, kind of being, you know, racist towards them. Here's one: this fucking wireback. I don't like your kind around here, clanker. So he spits at it, and that video went viral and

(27:32):
now everyone's calling, like, these robots at hospitals and stuff clankers.

Speaker 7 (27:36):
Mm.

Speaker 6 (27:37):
Well, when he called them a wireback, it's all, like... it's not even accidental racism.

Speaker 1 (27:43):
It's like getting away with being terrible.

Speaker 5 (27:47):
Yeah, but you could read the comments. Nobody seems offended
by this.

Speaker 1 (27:49):
Uh.

Speaker 5 (27:50):
This is another one where they call a robot this. It looks like a hospital.

Speaker 6 (27:53):
Keep it pushing, you dirty fucking clanker, clanker.

Speaker 2 (27:59):
Motherfucking I still don't like the sound.

Speaker 5 (28:05):
But yeah, that's really blowing up on the internet right now. And I can't believe you didn't see it. It was in all the socials, all the radio prep.

Speaker 2 (28:12):
No, I've just been seeing a lot of news about
Taylor Swift.

Speaker 5 (28:15):
God, it was on the news last night. Like, I saw that... I was watching Jake Tapper, and they literally had a countdown running on the screen for when the podcast started at seven.

Speaker 1 (28:26):
I know, that's unreal, and they, you know.

Speaker 5 (28:28):
They didn't show the actual podcast, They just did the countdown.

Speaker 2 (28:31):
The podcast broke the internet, like yeah, of course.

Speaker 5 (28:35):
They were just like... And then after that, a couple hours later, they were showing clips of it on the news, on CNN. I know, yeah. Like, what? It's not a slow news day. There's plenty going on.

Speaker 6 (28:45):
I mean, it's annoying, but at least she's a good influence. And having, like, young kids, and how everyone dresses like a hooker or whatever, at least she's a good influence. That's... I mean, what did she really do wrong?

Speaker 2 (28:57):
I think?

Speaker 7 (28:58):
No, I mean, she's definitely... but I feel like, I do. But I also feel like this: like, when I was going through the pictures, the promo pictures, I was like, damn, Taylor Swift is finally, like, getting.

Speaker 2 (29:10):
A little risque over here. Like I think I see a.

Speaker 5 (29:13):
Side boob, you know, lingerie shots.

Speaker 1 (29:16):
It's funny, she's like, I'm gonna have a cranberry vodka, I'm gonna loosen up.

Speaker 5 (29:19):
Yeah, but I don't think anybody's saying, like, she did anything wrong. No, it's just that you see her everywhere, and I have no problem with her in the NFL. That stuff does not.

Speaker 1 (29:27):
Bother just oversaturated everywhere on the news.

Speaker 6 (29:30):
Do we really need to be talking about this? It's TMZ stuff. Yeah, it shouldn't be on the mainstream news.

Speaker 1 (29:34):
That's right.

Speaker 5 (29:35):
That's my problem with the Taylor Swift stuff. Other than that, I love their relationship, and that's a real relationship.

Speaker 2 (29:41):
It seems like they're very supportive of one another. Yeah,
so that's nice.

Speaker 1 (29:44):
One more tour and then get married.

Speaker 7 (29:47):
One more tour, and Travis Kelce... because even last season, people were talking about, is Travis Kelce gonna play again?

Speaker 2 (29:54):
I could see him retiring.

Speaker 6 (29:57):
He's playing this year, and they said he hasn't worked this hard in years. He got pretty upset with the way it went in the Super Bowl.

Speaker 5 (30:03):
Yeah, yeah. Well, you know, it'd be awesome if he gets one more Super Bowl. For him anyway. Then make babies.

Speaker 1 (30:10):
You're gonna be forty soon. Come on, dude, I.

Speaker 7 (30:12):
Know Taylor is I mean, she's looking at a geriatric
pregnancy at this point, she's thirty five.

Speaker 6 (30:17):
Anything over thirty five they call geriatric. It's a criminal word.

Speaker 2 (30:20):
Yeah, it's really awful.

Speaker 1 (30:22):
My mom can go sideways after thirty.

Speaker 3 (30:25):
Yeah.

Speaker 5 (30:25):
My mom had my brother at forty two and it
almost killed her.

Speaker 6 (30:28):
So yeah, honestly, Amy had a baby at thirty eight
and it was a lot different than it was when
she was thirty.

Speaker 1 (30:34):
Yeah, that's for sure.

Speaker 5 (30:36):
All right. Well, that does it for us today. Thanks for checking out The Donkey Show podcast. We'll be back tomorrow, and tomorrow's the big day, the end of the Blubber Burn. Me and Bee Fodder have been going head to head for the last six weeks to see who can lose the most weight. Yeah, we're actually gonna get weighed today, so technically for us, today is the last day, but I'm not even gonna know the results until tomorrow.

Speaker 7 (30:56):
Yep.

Speaker 1 (30:57):
I'm excited.

Speaker 6 (30:57):
I can't wait for you guys to find out exactly what you were able to accomplish.

Speaker 5 (31:02):
Yeah, and like, dude, I'm already feeling pretty good. Like, I'm wearing a shirt today that I've never been able to fit in. It's like a new shirt that I've owned for six months, because

Speaker 7 (31:09):
I've been so fat. And I do think that stuff is more important than a number on a scale, you know.

Speaker 6 (31:15):
Fitting into shirts is a great feeling, when I couldn't touch the buttons together, you know, a couple of months ago. Yeah, and you know, you're both floating in gear that you would have been squeezing into before.

Speaker 1 (31:26):
So it's gonna be a good result all around.

Speaker 5 (31:28):
But I'd be lying if I told you guys that
I'm not a little nervous about it.

Speaker 1 (31:30):
Of course. I mean, no one wants to lose.

Speaker 6 (31:33):
You know.

Speaker 5 (31:33):
He's been working hard too, and I just don't know. It's going to be close, right?

Speaker 2 (31:36):
It's going to be close for sure. Oh man, we'll see.

Speaker 6 (31:40):
I get sweaty palms thinking about it, because I do not want to translot my line. But you think about how far away it felt, and it's finally here. It is here.

Speaker 1 (31:48):
You did it, all right.

Speaker 5 (31:49):
We'll see tomorrow, and that will be at eight o'clock for the final weigh-in, and apparently we'll have another Linkin Park ticket giveaway.

Speaker 3 (31:58):
You've been listening to Tanner, Drew and Laura's Donkey Show, heard daily at one oh five nine the brew dot com.

Speaker 1 (32:05):
May God have mercy on all of our souls.