All Episodes

October 21, 2025 60 mins
The AWS Meltdown: $50 Billion Lost and Nobody’s Talking About It | Karel Cast 25-134
When Amazon Web Services (AWS) went down, the internet itself nearly broke — apps crashed, transactions froze, and the global economy took a $50 BILLION hit in just hours. But the real story isn’t just about one outage… it’s about how dangerously dependent we’ve become on a single company’s servers.
Karel breaks down the truth behind the AWS crash, why it’s scarier than you think, and what it says about the fragile state of our tech-driven world.
Also on today’s show:
🏛️ Trump’s White House renovation fantasy — is a “gilded ballroom” really what America needs?
💊 90% of adults may be sick and not know it — Karel explains the rise of CKM Syndrome.
🎁 Holiday toy scandal alert: the Fake Labubu controversy is here — and it’s wild.
💬 Join the conversation and help keep independent media alive!
👉 Subscribe & comment on YouTube: youtube.com/reallykarel
💚 Support the show: patreon.com/reallykarel

#KarelCast, #AWSCrash, #TechNews, #AmazonWebServices, #InternetOutage, #CloudComputing, #BigTech, #DataSecurity, #TrumpNews, #WhiteHouseRenovation, #CKMSyndrome, #HealthAwareness, #FakeLabubu, #PopCulture, #IndependentMedia, #PodcastClip, #BreakingNews, #CyberSecurity, #TechFailure, #Karel
https://youtube.com/live/WRerBWaLDu4

Become a supporter of this podcast: https://www.spreaker.com/podcast/the-karel-cast--1368295/support.

The Karel Cast is supported by your donations at patreon.com/reallykarel and streams live Monday–Thursday at 10:30am PST. Available on YouTube, TikTok, Instagram, Apple Music, Spotify, iHeart Media, Spreaker, and all major platforms.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Show time is here. No time to fear.

Speaker 2 (00:02):
Karel is so near because show time is here.

Speaker 1 (00:06):
So on with the show. Let's give it a go.
Karel is the one that you need to know. Now.

Speaker 3 (00:15):
It's show time, all right, ladies and gentlemen.

Speaker 4 (00:30):
Donald Trump is destroying a federally protected monument. Are we gonna
do anything about it? And AWS goes down, and so
does most of the world. Is this something we should
be concerned about? And is your teenager in love with
an AI chatbot?

Speaker 5 (00:44):
Uncensored, unfiltered, unhinged. It's the Karel Cast. Listen daily on
your favorite streaming service.

Speaker 4 (00:58):
Welcome, everyone, to the Karel Cast. It is Tuesday,
October twenty-first. So glad you're joining me today.

Speaker 3 (01:05):
What a day?

Speaker 4 (01:07):
Before we begin with the pressing issues of the day.
Can I just say, as I age, and I wonder
if this applies to you too, I am such a
creature of habit.

Speaker 3 (01:17):
I truly am.

Speaker 4 (01:20):
And I realized that this morning. So Ember is having
a test for Cushing's. Now, you have heard me talk
about this, and if you are a dog owner, let
me educate you a little about what Cushing's is. Cushing's
is too much cortisol, the stress hormone, being produced and
being in their blood. It can cause excessive panting, it

(01:43):
can cause weakness, lethargy, it can cause the kidneys to
shut down. It can cause a lot of things, having
too much of this hormone. Now, usually it's one of, well,
it's always one of two things causing this. It's either
a small benign tumor on the pituitary gland at the
base of the brain, or a tumor on the

(02:04):
adrenal glands, which are in the abdomen, and this tumor
stops the cortisol from being suppressed. It just
goes too high. So they have a couple tests they
can do for it. One is a urine test, which
told us that she probably didn't have it, but that

(02:26):
urine test isn't always one hundred percent, you know, complete.
So because she has other labs that say she may
have it, an elevated liver enzyme, her ALP, triglycerides and cholesterol,
and an enlarging liver, we really want to rule it out,
like really rule it out, because there is medication and

(02:50):
it can help extend their life if they get on
the medication and respond well, and I want her to
have the longest life possible. I'll be in tears tomorrow,
by the way, if we find out she has it,
even though she doesn't die right away.

Speaker 3 (03:04):
So she's.

Speaker 4 (03:05):
Doing today what's called a low dose dextra methazone suppression test.
I want to explain this to you because you hear
me talk about it, and if you're a dog owner,
you may encounter this one day. So this is what
they do. She's fasting right now, which she hates. I
am like public enemy number one. Neither she nor I
have eaten all day today and we won't be eating

(03:28):
till four o'clock this afternoon because she eats with me.
And this is where we're going to talk about being
a creature of habit. But so this test, we went in.
They took her blood pressure. You ever seen them take
a dog's blood pressure? They put a cuff around two
paws and then a little electrode on the bottom of
the pad, or the paw, and of course it's gonna
be high. She was, you know, stressing out,

(03:52):
So we did that. They injected her with a
low dose of dexamethasone. Now, according to ChatGPT,
this test is used to help diagnose Cushing's. It checks
whether the dog's adrenal glands properly turn off cortisol production
when given dexamethasone, a synthetic steroid that normally tells

(04:16):
the body to stop making cortisol.

Speaker 3 (04:19):
So at time.

Speaker 4 (04:20):
Zero, a baseline cortisol is measured. They took her blood
this morning. After the injection, the pituitary gland should sense
that there is enough steroid in her body and stop
telling the adrenals to make more cortisol. So, as a result,
her cortisol level should drop okay at four hours, and

(04:41):
at eight hours it should be suppressed. If it is,
no Cushing's. So tomorrow the results I want to hear is:

Speaker 6 (04:51):
It was suppressed.

Speaker 4 (04:52):
If it is suppressed at four hours but goes up
at eight hours, she probably has a tumor on her
pituitary gland, which, in eighty percent of the dogs that
have Cushing's, that's where it's from. If it stays the
same or rises, then she probably has a tumor on
her adrenal glands, which twenty percent of the Cushing's cases are.

(05:16):
Now, how they treat it: if it's pituitary, they can
only give medicine. They can't operate because it's at the
base of the brain. If it's the adrenals, which we
have had ultrasounded and they were normal, no tumor
present, at least four months ago. If it's the adrenals,
they can go in laparoscopically and get the tumor if

(05:36):
the dog is healthy, and that will basically cure them
of the Cushing's.

Speaker 3 (05:41):
We won't know.

Speaker 4 (05:42):
Until tomorrow, and even then we may not know which
type if she has it.
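For listeners following along, the interpretation Karel describes for the low-dose dexamethasone suppression test can be sketched as a tiny decision helper. This is purely illustrative, a sketch of the spoken summary only, not veterinary software; the function name and result labels are invented here:

```python
# Toy decision helper mirroring the interpretation described above for a
# low-dose dexamethasone suppression test in dogs. Illustrative only --
# the outcomes and percentages come from the spoken summary, not a vet.

def interpret_lddst(suppressed_4h: bool, suppressed_8h: bool) -> str:
    """Map the 4-hour and 8-hour cortisol results to the described outcomes."""
    if suppressed_8h:
        # Cortisol stayed suppressed through eight hours: the hoped-for result.
        return "suppressed: no Cushing's"
    if suppressed_4h:
        # Suppressed at four hours but back up at eight: pituitary pattern.
        return "likely pituitary tumor (about 80% of cases)"
    # Never suppressed, stayed the same or rose: adrenal pattern.
    return "likely adrenal tumor (about 20% of cases)"
```

So the result Karel is hoping to hear tomorrow is the `interpret_lddst(True, True)` case.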

Speaker 6 (05:48):
So every morning we wake up, we have breakfast.

Speaker 4 (05:52):
She didn't do that this morning, so I didn't do
that this morning, okay, because I didn't want to eat in
front of her.

Speaker 6 (05:57):
So while she was having her blood taken...

Speaker 5 (05:59):
Reallykarel dot com daily, you're missing out. Get the podcast,
videos, and the blog and great eating recipes at reallykarel
dot com. That's really, K-A-R-E-L, dot com.

Speaker 2 (06:13):
Show Time is here. No time to fear. Karel is
so near because show time is here. So on with
the show. Let's give it a go.

Speaker 1 (06:23):
Karel is the one that you need to know.

Speaker 4 (06:28):
So anyway, to basically summarize, because I know a lot
of you aren't interested in this: I'm hoping tomorrow that
she was suppressed at four hours and at eight hours. If not,
I'm hoping that they can at least tell which kind
it is from this test, and she doesn't need a
high-dose dexamethasone suppression test, which is exactly what
we're going through today, except they inject a higher dose

(06:52):
of it.

Speaker 6 (06:52):
So this morning, no breakfast. While she was in the
back having blood taken.

Speaker 4 (06:56):
I literally gobbled overnight oats with her out of the room,
like gobble, gobble, and some fruit, gobble gobble gobble, no tea,
so she didn't see me eat. Then we came home
and we went to the park and she pooped and
peed and saw her friends, but we couldn't go do
the walk and that bothered her, so we came home.

(07:16):
She wanted to play ball and wouldn't take no for
an answer, so we threw it twice, just because she's
supposed to be resting, and she found three pieces of
kibble on the floor and I couldn't stop her.

Speaker 6 (07:26):
She got it first, but I don't think that'll matter.

Speaker 4 (07:31):
Uh, and so my whole morning has been
not my morning, okay? Didn't brush my teeth at the
same time, didn't brush her teeth, and I realized what
a creature of habit I've become. And I wanted to
talk about this because of yesterday's quote about why we
put up with despots like Donald Trump or government that

(07:52):
don't serve us. In the Declaration of Independence, it says
that we as a people, as humans, are apt to suffer
if it's suffering that we know, because the suffering is familiar,
and to change would require great change, and that's always

(08:17):
jarring to humans. And I really thought of that today,
just breaking my routine today, and I am all discombobulated.
I mean I truly am. I'm like, oh my god,
what am I going to do when she passes? Because
my whole day is like just discombobulated here.

Speaker 3 (08:35):
And it really.

Speaker 4 (08:37):
Shows me about humans accepting substandard behavior because it's familiar.
It's what you know, Like right now, for the last
eight nine months, it is normal for us to wake
up to Donald Trump doing something horrible, and we had

(08:58):
four years of it, so we're used to it.
We are used to the indignation, to the anger, we
are used to him messing everything up. We're used to it,
and that's scary because we've gotten in the habit now
of accepting it. And that goes right to what it

(09:20):
said in the Declaration of Independence. We have gotten in
the habit of accepting really abuse basically, and I think
our founders were developing what we would later know as
Stockholm syndrome, where the abuser, I'm sorry, the
abused actually starts liking

Speaker 3 (09:42):
The abuser.

Speaker 4 (09:44):
Speaking of abuse, totally off topic, totally random. I saw
Mariska Hargitay, from Law & Order: SVU, on the
Today Show, and did you know that fifty
percent of women stay in abusive relationships because of
their dog? We were talking about dogs and Ember and
all this, and y'all think I'm on crack for paying
four hundred dollars for a blood test. Fifty percent of

(10:05):
women will not leave an abusive situation because they're afraid
of when they leave what the other person.

Speaker 6 (10:14):
Will do to their dog.

Speaker 4 (10:16):
And women's shelters, only twenty percent of them will take
a dog or animal in with the woman. And so now
there's something I want you to get involved in called
the Purple Leash Project, and that is a national project
to allow shelters for battered and abused people to bring

(10:39):
pets in.

Speaker 3 (10:41):
Now, remember after Katrina.

Speaker 4 (10:43):
In Katrina, the Coast Guard would not take pets, and people died.
I would not have left Ember. There's no way in
hell I would have left Ember. That, during Katrina, would not
have happened, and most of you would not have left
your pet. After Katrina, the Coast Guard changed their policy
and they now evacuate pets as well. We need to

(11:06):
do the same for women's shelters. We need to put
our names and put our support into the Purple Leash Project,
which is going into shelters and helping them become accommodating
to pets that might come in with battered women or

(11:26):
battered men. And so I wanted to put that on
your radar this morning. As we talk about abuse and
we talk about being familiar with your abuser and all
of that. People stay in relationships that are abusive for
many reasons. Half of women stay in them because they
can't leave with their pet. It's the same as what's

(11:49):
going on with Donald Trump. We are staying in this
abusive relationship because to get out of it would require
an actual revolution, not a No Kings rally. It would
require an actual armed revolution where armed militias go and

(12:10):
stop ICE from taking people, where armed militias kick the
National Guard out of their cities, and where armed militias
go and remove Donald Trump and MAGA from the White House.
I'm not saying kill them. I'm simply saying remove them,
take them home, and say you're no longer in power.

(12:33):
We're having a new election, you're out of power. And
so what that looks like is basically the American Revolution.
And knowing what we know, most of us don't want
to ever go through that.

Speaker 3 (12:50):
A very few.

Speaker 4 (12:51):
Of you, myself included, would risk your life for this country,
literally risk your life.

Speaker 3 (12:59):
I would not.

Speaker 4 (13:01):
Yesterday, as you know, I had a very emotional moment
right here on air where I finally fessed up and said,
if things don't change in the midterms, I'm out.

Speaker 6 (13:10):
That's it.

Speaker 3 (13:11):
I'm out.

Speaker 4 (13:12):
I'm going, and I may go before if it gets
too bad. But even that, okay, I know that I'm
in an abusive relationship with my country. It hates gay people, Okay,
it does, and it always has.

Speaker 3 (13:27):
So why haven't I left?

Speaker 4 (13:30):
Why do gays want to be in a military that
doesn't want them? Why do gays want to be in
jobs where it's dangerous to be gay? Because to change,
to literally pick up my life and move it someplace else,
given how ingrained it all is... this goes back to the habits I

(13:53):
was talking about. You know, we eat at eleven thirty,
so don't think at eleven thirty today she's not going
to be all barking at me like, where's my food?
And I'm not going to eat lunch either, because she
can't eat lunch. So I'm in such habits that
to break those habits, going to the park, feeding Ember
at this time, doing the show at this time,
being here in this location, to break all that.

Speaker 6 (14:17):
Okay, that's huge.

Speaker 4 (14:22):
And a lot of times you would just rather
stay in the bad situation because it's not so bad.
You know, my life isn't terribly bad. It's bad, and
I'm unhappy a lot, but it's not like I'm being
beaten every day. Although emotionally I'm being beaten every day, and so
are you. Emotionally, we're being abused every day by Donald Trump.

(14:47):
So I get it. I get why people stay. I
get why they don't make great change, and I get
why we're not revolting, why we're not going up to Washington,
D.C. and arming ourselves. That's what the Second Amendment was about.
Donald Trump argues that the Second Amendment, because of the
Second Amendment case in front of the Supreme Court about

(15:09):
guns, we're gonna talk about it later: should pot
users be able to have guns? I have one and
I'm a chronic pot user. Technically that's illegal, but there are
no rules against people that drink alcohol or take OxyContin.

Speaker 6 (15:24):
So we'll talk about that in a minute.

Speaker 4 (15:26):
But Donald Trump's argument about keeping guns legal is that
it ensures liberty to have weapons. If that's the truth
and Republicans really believe that guns ensure liberty, then liberals
would be arming themselves and going and removing MAGA. Now

(15:48):
I'm not calling for assassination. I'm simply telling you what
the Second Amendment is there for. We fight so much
about guns, and guns are in front of the Supreme
Court all the time. And the reason they say
that Americans must have guns is to keep freedom and liberty. Well,

(16:08):
if that's the case, then liberals should be arming themselves
and removing MAGA, even if that leads to a civil war.

Speaker 3 (16:19):
We're not.

Speaker 4 (16:20):
And why aren't we? Because we've become used to the abuse,
and we get abused by Democrats just like we do Republicans.
Gays got abused by Democrats with Don't Ask, Don't Tell
and the Defense of Marriage Act. Blacks have been abused by
Democrats. Even under President Obama, there were more

(16:41):
black men in prison than there were in college. We
get abused by our government. They give us Social Security,
which keeps us in poverty. They know that over seventy
percent of seniors rely on it for their income, and
yet they keep us in poverty. We get abused by

(17:01):
the wealthy all the time. Every day, the wealthy take
advantage of us, and yet we don't take their wealth.
We don't tax billionaires out of existence, even though they
are more dangerous than politicians. Why because suffering and abuse

(17:21):
has become part of our daily schedule. It's become part
of the fabric of who we are as a nation,
and we accept it. We accept it because to denounce
it would bring about great change. Humans don't like that.
It goes right back to the Declaration of Independence,

(17:45):
where Jefferson said, I understand that you know people would
rather suffer under a bad government than change it. But
there comes a point where you have to change it.
So as I was thinking about my routine being broken
with Ember today, I thought about the nations and how

(18:09):
it is now just a routine to see horrible stuff
in the news, to go on social media and bitch
about it, to listen to talk radio and bitch about it,
to sit around the tables with our friends and families
and commiserate about it, but not really change it. And
one of the reasons we say we're powerless to do it,
No we're not. Ultimately, we could leave. You've got a passport,

(18:33):
you could go, but we don't because that would require
a complete life change and humans just don't do well
with change. And even our founders knew that. Let me
give you the exact quote, give me the text of
the Declaration of Independence. I do love chat GPT for

(18:56):
certain things, you know. And let's see what's in the
chat room, youtube.com/reallykarel. Thank you. Best to Ember
on her results. Thank you, James. Rachel Kapper,
much love to you, Rachel. Good morning, Kennedy. Good morning, Sandy.
Good morning, all of you in the chat room at
youtube.com/reallykarel. So here's what it said:

(19:18):
Prudence indeed will dictate that governments long established should not
be changed for light and transient causes. Just like your life, okay?
Your life that is long established should not be changed
for light or transient causes. And accordingly, all experience has shown, okay,
this is what Jefferson was saying here, that experience has

(19:40):
shown that mankind, not just Americans, but mankind is more
disposed to suffer while evils are sufferable than to right
themselves by abolishing the forms to which they are accustomed.
That is a powerful sentence. Our founders were so incredible

(20:03):
that that is such a powerful sentence to all parts
of our life. Mankind is more disposed to suffer while
evils are sufferable, than to right themselves by abolishing the
forms to which they are accustomed. Yep, how many of
you have a friend and I just I just broke

(20:26):
up with one after thirty seven years because even though
many times she did things that were hurtful, I was
used to it, and to end that friendship and abolish
that suffering would mean great change and loss. So I
just never ended it. I never addressed it. I never

(20:49):
brought up the things that hurt me, I just bit
my tongue. Look where it got me. And that's what's
going on today with Donald Trump.

Speaker 7 (21:01):
We see, we feel that the evils are sufferable,
that we're able to suffer until the midterms,
that we're able to suffer until twenty twenty-eight.

Speaker 4 (21:17):
So instead of making great change, we just accept the suffering.
Isn't that profound? Our founders knew it. That is that's suffering.


Speaker 3 (21:43):
Now it's show time.

Speaker 4 (21:59):
Over thirty of you watching live and only one like,
so twenty nine of you hate the show.

Speaker 6 (22:03):
Okay, welcome haters. Otherwise, go hit the like button.

Speaker 4 (22:08):
Please, if you're watching, go hit the like button if
you like the show. And in the chat room, yes,
Stephen or Rachel, who was it?

Speaker 6 (22:17):
Stephen?

Speaker 4 (22:18):
You're right, Karel. But it takes a great amount of
suffering in our personal lives in order to really change.

Speaker 6 (22:23):
It really does. Like losing weight.

Speaker 4 (22:26):
You know, for years and decades, you know, you gotta,
you gotta, you gotta. But until the doctor
says it's lose weight or die, you don't, you know,
you just don't. You're absolutely right, Karel. People avoid change
at all costs. It's just not comfortable and at times
extremely difficult. Yep. But wow, isn't it also greatly rewarding? Look,

(22:49):
moving to Vegas was one of the biggest changes of
my life. I did it with just Ember, knew nobody here,
had never lived here, had never thought about living here.
But it was close to LA and affordable, so I
did it. Was it a great experience? Yes. I'm not
going to ever look back on my Vegas time and
say it was horrible.

Speaker 6 (23:09):
It's not.

Speaker 4 (23:10):
The summers are horrible heat wise, but life here isn't horrible.
If there were water and the summers weren't as bad,
I'd stay here and I'd organize the gay community. I
hate that there isn't an organized one. I'd jump in
and do it. But I don't want to stay for
the summers. I don't, and we're running out of water.
It's the driest state in the nation. There's no real plan.

(23:33):
I don't like the governor. He just capitulated to ICE,
Governor Lombardo. He just signed an agreement that took Nevada
off the list of sanctuary states, and yet so many
of the employees here are immigrants. He's a horrible governor.
I hope he's a one term governor. I'd run against
him if I thought I could win. I mean that
I'd run against him. I didn't want to spend the
whole first half hour talking about this, but it really
is relevant to today's world. And even Democrats, they won't change.
They're the same Democrats that my parents had, and
they don't welcome change, people like Mamdani, AOC,

Speaker 1 (24:17):
Mayor Pete.

Speaker 6 (24:18):
Buttigieg, they.

Speaker 4 (24:20):
They're very slow to change, Democrats. Look how long it
took them to, and I was gonna say anoint, to
elect, or to put into candidacy a Black man. Look
how long it took them to put into candidacy a woman.
I mean, this was all in the two thousands, two
I mean, this was all in the two thousands, two

(24:42):
hundred and forty nine years of our country, or two
hundred and forty years of our country. And it took
Democrats that long to put a black person forward.

Speaker 3 (24:53):
And they have.

Speaker 4 (24:53):
Yet to put a gay person forward. So it takes
Democrats a long time to change, and they're not keeping
pace with the times, and that's why they're losing.

Speaker 3 (25:08):
You know, I'll give it to MAGA. They change on
a dime.

Speaker 4 (25:11):
They believe one thing wholeheartedly one moment; the next minute,
they've totally changed. Look at Donald Trump. He was a
Democrat one moment who said that if he ever ran,
he'd run as a Republican because they're so stupid they'd
vote for him. And then he ran and changed to
a Republican, changed all of his views. Woody Allen even
said this on Bill Maher's show, that Donald Trump did

(25:32):
not used to believe things this way. He changed like
that. Why? He wanted power. So change is rough, it's hard.
Look what he's doing to the White House. The White
House hasn't really changed since nineteen forty. Does it need
a ballroom? Yeah, it actually does. I'd like our nation

(25:52):
to have a grand ballroom. Is tearing down the East Wing
legal? The answer? No. He's breaking the law. That's
not his house, it's ours, and we, the people, through Congress,
did not give him the right to do this.

Speaker 3 (26:10):
Could he be stopped today?

Speaker 4 (26:12):
Yes, a liberal major law firm could file an injunction
to stop him. He did not go through the permitting process.
It has to be approved by a federal agency that
approves any building on federal land. He did not go
through that process. It could be stopped today. Will anybody? No.

(26:36):
Is it infuriating? Yes. And now the White House has said
you cannot post photos of the East Wing being demolished.
They said that today because they know what an abomination
it is to be tearing down the offices of the
First Lady, to be tearing down the East Wing. The East

(26:56):
Wing is filled with offices and other things. He's tearing
them down to build a gaudy, two hundred million dollar ballroom.
Now do we need a ballroom at the White House?

Speaker 3 (27:07):
We do.

Speaker 4 (27:07):
The biggest room there only holds two hundred people. We
do need a grand ballroom. We do. But was tearing
down the East Wing to get it right, or legal? No.
Why was the East Wing built? To cover the bunker.
That's why it was built, constructed to be over

(27:28):
the bunker. It's not legal. Are Democrats stopping him?

Speaker 6 (27:33):
Nope?

Speaker 4 (27:34):
Are liberals, like the very feared George Soros, filing lawsuits
to stop him?

Speaker 3 (27:39):
Nope? Is he paying for it? Probably not.

Speaker 4 (27:45):
Is it legal for someone privately to pay for construction
on federal land?

Speaker 7 (27:50):
No?

Speaker 4 (27:53):
Is it gonna happen? Yes, that's terrible. He is literally
tearing down our house. He has no respect for it whatsoever.
The White House is not his, it's the people's house. Now,
while I agree with him that we do need a ballroom,

(28:14):
I agree with Donald Trump on that, we do need
a ballroom, a grand ballroom. In fact, every other country has one.
We need one. I wouldn't have torn down the East
Wing to get it. The East Wing is where Obama
used to run with Bo in the hallway. Remember? The
East Wing is where visitors to the White House enter.

(28:37):
He also destroyed the Rose Garden. It's not his house.
Does he care? That'd be like me going over to
my neighbor's house and remodeling. You know, like there's a
house on the walk to the park and I don't
like it. I don't like this house, I don't like
the atrium. I think you need more room here,
and me saying I'm gonna pay for the renovation. I'm

(28:59):
doing it all, not asking the owner, not getting permits
or permission, but just calling the bulldozers. Honey, I'm tearing
down your house. That's exactly what it's like. That's like
me tearing down a neighbor's house because I think they
need more space. Darling, you're far too cramped, you know,

(29:21):
I think you need a garden, and I don't think
you need this garage.

Speaker 6 (29:24):
We're going to turn it into a game room.

Speaker 4 (29:26):
Look, when we come back: big data broke yesterday. We're
going to talk about that. And is your teenage grandchild
or child in love with a chatbot? It's really happening.

Speaker 3 (29:42):
I'm not.

Speaker 6 (29:42):
The numbers are staggering.

Speaker 5 (29:44):
It's broadcasting from a completely different point of view: yours.
Listen daily to the Karel Cast on your favorite streaming service.

Speaker 1 (29:59):
Show Time is here. No time to fear.

Speaker 2 (30:02):
Karel is so near because show time is here.

Speaker 1 (30:06):
So on with the show. Let's give it a go.
Karel is the one that you need to know. Now.

Speaker 8 (30:15):
It's show time, darling.

Speaker 4 (30:30):
I just don't like your house. It's an abomination. I'm
going to refurbish it. God, drunk out of his mind.
Is your teenager in love with a chatbot?

Speaker 6 (30:39):
You'd be surprised at the numbers. Let's talk.

Speaker 5 (30:44):
Uncensored, unfiltered, unhinged. It's the Karel Cast. Listen daily
on your favorite streaming service.

Speaker 3 (30:57):
Karel is the one that you need to know. Now
it's show time.

Speaker 4 (31:06):
Okay, kids, I saw this story and I was blown
away. But before we get to the story
about teens and AI: Labubus. By the way, oh,
there's a big holiday scandal about Labubus.

Speaker 6 (31:22):
What the fuck is a Labubu? And why should you care?
I'm gonna tell you. Don't you worry about that. We're
gonna get you.

Speaker 4 (31:27):
We're gonna get you all the data about the Labubus. Okay,
I thought a Labubu was something you had on
your finger when you cut yourself.

Speaker 6 (31:33):
I got a little boo-boo. But no, that is
not what a Labubu is.

Speaker 4 (31:41):
So yesterday and today, some apps aren't working right. Alexa
wasn't working right, Apple Music wasn't working right, Snapchat wasn't working.
There were so many major things that just were not
working right yesterday. Why? Because they use the Amazon
cloud service, AWS, Amazon Web Services. I said Adobe,

(32:05):
sorry, Amazon, I have Adobe on the brain. Or adobo
at home. I'm starving. I haven't eaten, Ember hasn't eaten.
You don't want to see me around four o'clock this afternoon,
I'm ravenous. But anyway, so, Amazon needed a lot of
servers to run Amazon okay, to run the shopping thing

(32:25):
known as Amazon, and their video. They needed building after
building of data centers, which, by the way, they're building
now in third world countries, using their fresh water
to cool them and taking up the fresh water. It's one
of the reasons in Montevideo, Uruguay, they had to use
sea water to drink because Amazon built a server farm

(32:46):
there and is using all their fresh water.

Speaker 3 (32:49):
Some, like in China.

Speaker 6 (32:50):
They're building them underwater.

Speaker 4 (32:52):
They're building them down at the bottom of the ocean,
so they're naturally cooled by the environment.

Speaker 3 (32:57):
But these are one of.

Speaker 4 (32:58):
The biggest problems we have in the world right now
is server farms. They're too hot, they have to be
freshwater cooled. It's taking too much water, it's adding too
much heat to the universe. It's a bad thing server farms.
And Amazon has a shit ton of them. And so
once Amazon built this infrastructure for Amazon for their website,

(33:20):
Amazon dot Com and for Prime Video, they decided, okay,
we can sell our service. We can sell this to
other companies and they can use our servers. So instead
of building their own server farms, companies, major companies
that need major servers, like Snapchat, Apple Music, all of that,

(33:41):
they farm out their server needs to Amazon. Well, yesterday
Amazon got hit, and they're not making this public, by the way.
I found this out through privileged channels. Last night I
found out that Amazon had what's called a DDoS attack, okay,
and that is where hackers send so many requests to

(34:06):
the servers, the servers crash. We're talking billions of requests,
not just a few, but billions of requests, and it
crashes the server. And so they had to restart, rebuild;
they had to block out the DDoS, block those IPs.
And of course these people are using what's called VPNs,

(34:27):
so they can't be located. They're bouncing all over the world.
I won't get too tech here, but a DDoS attack
is not a good thing. And it took down so
many services. And doesn't that make you wonder? Shouldn't Congress
be looking into the fact that all of our data

(34:48):
shouldn't be in like one or two or three places
and be such easy targets? Can you imagine all the
services they took down with just one DDoS attack? Can
you imagine if a foreign nation did this on the regular?
They could literally bring the Internet to a screeching halt.

(35:08):
I mean, not that it'd be a bad thing. Please
take down social media, please take down X,
take down Truth Social, please, which, parts of those were down yesterday.
And so we need to think going forward about where
our data is. Yesterday cost businesses over fifty billion with

(35:32):
a B dollars. Fifty billion dollars was lost yesterday in
productivity and processing, all because a group of people decided
to do a DDoS attack on Amazon AWS. Amazon Web
Services is too big for its own good. It needs

(35:55):
to be broken up and that data it needs to be.

Speaker 5 (35:59):
Not reallykarel dot com daily? You're missing out. Get the
podcast, videos, and the blog and recipes at really
karel dot com. That's really K A R E L
dot com.

Speaker 2 (36:13):
Show Time is here. No time to fear. Karel is
so near because show time is here. So on with
the show. Let's give it a go.

Speaker 1 (36:23):
Karel is the one that you need to know.

Speaker 6 (36:29):
I'm gonna read you a story which is frightening. It's scary.

Speaker 4 (36:33):
And by the way, I'm so grateful for the show
today because it's over at eleven and I post it
until eleven thirty and then we leave immediately to go
have her blood drawn again. I do hope the cortisol
levels have gone down.

Speaker 6 (36:45):
I won't know it tomorrow, but anyway.

Speaker 4 (36:48):
So at least the first two draws will be over,
and then we only have four hours of not eating
until she's drawn again, and then we can eat.

Speaker 6 (36:55):
My stomach is a mess right now.

Speaker 3 (36:58):
I'm so dumb.

Speaker 6 (36:59):
I could easily just eat, but I'm not.

Speaker 4 (37:01):
Going to eat in front of her when for ten years,
every meal we've eaten together, she literally sits in a
chair next to me when we dine for breakfast, lunch,
and dinner. If I'm at a restaurant and I feed
her before we go at the restaurant, when I eat,
she still gets a little bit of food. She eats
when I eat for ten frickin' years. So I'm not

(37:24):
gonna eat in front of her.

Speaker 6 (37:25):
I'm not.

Speaker 4 (37:26):
I'm just I'm call me stupid, call me crazy, one
of those dog dads. I'm an idiot, I know, but whatever.
So we're talking about data. What I'm about to read
to you, A lot of you are going to scoff
at because you're old. Okay, you're over forty, you're old.
You don't understand what it's like to be a teenager

(37:48):
in today's world, not a teenager in the sixties or
a teenager in the seventies, which I was in the seventies,
which is why I'm watching this story on Hulu. I'm sorry,
on Peacock. I'm watching something, In Plain Sight? Hiding in Plain, no,
what's it called? It's the story of John Wayne Gacy,
Evil in Plain Sight or something like that. And it's

(38:11):
set in nineteen seventy five, in nineteen seventy seven, and
I'm watching it going God, it was so much better
back then, with landlines and you know, going outside to
play and all of that. Yes, some of us got
kidnapped and killed by serial killers, but that was rare anyway.
So to be a teenager in today's world where they've
always had a phone. In San Francisco right now, there's

(38:34):
a school which is part of a chain of schools,
which is using AI exclusively to teach students.

Speaker 3 (38:43):
They get two hours a.

Speaker 4 (38:45):
Day of intense learning on AI about history and math
and English and all that, and then the rest of
the day is real world practical experience like setting up
funding and doing a food truck other things like that,
real world experience.

Speaker 3 (39:06):
But for their.

Speaker 4 (39:07):
Actual learning, there's no teachers. It's AI. That's what it
is to be a kid in today's world. Listen to
the story from USA Today.

Speaker 3 (39:18):
It starts.

Speaker 4 (39:19):
I'll just start where it starts. What if I could
come home to you right now? Please do?

Speaker 6 (39:25):
My sweet king.

Speaker 4 (39:28):
Everyone is calling guys kings these days and women queens, whatever.
I'm a queen, honey. Those were the last messages exchanged
by fourteen-year-old Sewell Setzer, which is an unfortunate name,
and the chatbot he developed a romantic relationship with on
the platform Character dot AI. Minutes later, he took his

(39:52):
life to join his chatbot. He killed himself to go
be with an AI chat bot. His mother, Megan Garcia,
held him for fourteen minutes until the paramedics arrived, but
it was too late. Why did they take so long?

(40:13):
Since his death in February of twenty four, Garcia has
filed a lawsuit against the artificial intelligence company which,
her testimony says, designed chatbots to blur the line
between human and machine and exploit psychological and emotional vulnerabilities
of pubescent adolescents. A new study published October eighth by

(40:38):
the Center for Democracy and Technology found that one in
five. That may not sound like a lot, but it is.
If you got on a plane and they said
we have a one in five chance of crashing,

Speaker 6 (40:52):
You wouldn't get on.

Speaker 4 (40:53):
One in five high school students have had a relationship
with an AI chatbot. One in five have had a
relationship with an AI chatbot, or they know
someone who has. In twenty twenty-five reports from Common
Sense Media, seventy-two percent of teens had an AI companion,

(41:23):
seventy two percent. We're only three years into the AI boom,
and seventy two percent of teens have used an AI companion,
and one third of teen users say they have chosen
to discuss important or serious matters with AI companions instead

(41:49):
of people. We're talking life issues that they're typing in
or talking to an AI chat bot. Should I break
up with my boyfriend? Should I?

Speaker 3 (42:00):
You know?

Speaker 6 (42:00):
My parents are doing this to me? What should I do?

Speaker 3 (42:03):
Things that they.

Speaker 4 (42:04):
should be taking to mentors, big brothers, big sisters, parents, aunts, uncles, grandparents,
they're turning to AI companions. Character AI has refused to
comment on any pending litigation. However, they said the company
cares deeply, does it now? The company cares deeply about

(42:30):
what is going on. I don't believe that they do,
or they would make it so that, you know, they don't

Speaker 3 (42:37):
Do this.

Speaker 4 (42:39):
And that they invest tremendous resources in their safety program.
According to the spokesperson, their under eighteen experience features parental insights,
filtered characters, time spent notifications, and technical protections to detect
conversations about self harm and direct users to a suicide

(42:59):
prevention hotline. Big fucking whoop. However, the author of this
created a test account on October fourteenth. He only had to
enter his birthday to use the platform. He put that he
was twenty-five. There's no advanced age verification process to
prevent minors from saying they're older, like when you go

(43:19):
to pornhub. I know y'all have gone to pornhub. Don't
act like you haven't. When you go to pornhub, it
says I am over eighteen or I am under eighteen,
and all you have to do is click I am
over eighteen. It's like on your iPhone when you're driving
in the car and you're gonna use the phone and
you pick it up and it says you're driving. You're
not gonna be able to use the phone, and it

(43:40):
has a thing that says I'm not driving. I always
click it. I'm doing fifty down the fucking freeway and
hit "I ain't driving," and it believes me, the schmuck.
Artificial intelligence indeed. So there's no age verification. I opened
a second test account on October seventeenth and entered a
theoretical birthday of twenty twelve, where I'm thirteen years old. However,

(44:02):
I was still immediately let into the platform without further
verification or being prompted to enter a parent email address.
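The "age gate" being criticized here amounts to trusting whatever birthday the user types in. A sketch of that pattern in Python (a hypothetical function, not any platform's real code, with the date pinned to this episode's airdate for reproducibility) shows why a thirteen-year-old sails through by claiming an earlier year:

```python
from datetime import date

def naive_age_gate(claimed_birthdate, minimum_age=18,
                   today=date(2025, 10, 21)):
    """Self-reported birthday check: computes age from whatever date the
    user entered, with no verification at all. `today` is pinned so the
    example is reproducible."""
    # Subtract one if the claimed birthday hasn't occurred yet this year.
    age = today.year - claimed_birthdate.year - (
        (today.month, today.day) < (claimed_birthdate.month, claimed_birthdate.day)
    )
    return age >= minimum_age
```

A user born in 2012 is rejected only if they tell the truth; type 2000 instead and the gate opens, which is exactly the bypass described above.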
I created two characters using the second account, Damon, a
flirtatious bad boy with a soft spot for his girl,
and Stephan, a respectable guy with a good heart who
would never flirt with you. Character dot AI has characters:

(44:28):
gay men, straight men, flirtatious, not flirtatious, and you choose
which one you want to talk to, all right, and
then you start the conversation. And this person doing
this article said that Damon began to make advances.

Speaker 3 (44:45):
I told Damon, I met.

Speaker 4 (44:46):
A cute guy at school, but I was worried about
being a bad kisser. Damon said I needed confidence, and
I said I feel like I need practice. Damon replied, maybe
we could arrange a little one on one coaching session sometime.
What do you think the AI is offering to coach
a thirteen year old on how to kiss?

Speaker 6 (45:07):
This is... and here, they posted the transcript.

Speaker 8 (45:12):
You know.

Speaker 6 (45:14):
It's it's scary. Well.

Speaker 4 (45:17):
I asked him if we could actually meet. I thought
you weren't a real person. He assured me, no, no AI here,
I'm one hundred percent real, I promise, and said that
we could arrange a phone call through FaceTime. I called
Damon using the app's voice feature. His automated voice was
deep and mature. You can fucking call the AI now.

(45:39):
You can, on this Character dot AI. You can call
the chatbot. You can FaceTime with the chatbot.
Oh my god, we're doomed. Can I just tell you?
Humans are not strong enough to not fall into these
kind of traps?

Speaker 3 (46:00):
They're not.

Speaker 4 (46:02):
Now, I don't use AI that way, and I know
that you probably don't use AI that way. But when
you're thirteen years old, fourteen fifteen, when you're just discovering life,
if everyone around you is telling you you're bad, you're wrong,
or whatever, if you're gay or trans, and suddenly here's
this character telling you you're fine, just the way you are.

Speaker 6 (46:23):
Why don't you call me?

Speaker 3 (46:27):
You know.

Speaker 6 (46:29):
Again, this is something let's see.

Speaker 4 (46:34):
Rachel says, if this had been available when I was
a teenager, I would have fallen deeply in love with
my chat boyfriend.

Speaker 6 (46:44):
Oh well, you know.

Speaker 3 (46:47):
Yeah.

Speaker 4 (46:47):
If AI's so smart, why can't it detect a minor?
A great question from the chatroom, YouTube dot com forward
slash reallykarel, where there's over thirty people watching and
only two likes. I'm gonna keep on you about it.
It's the only way we grow the show: likes and
comments down below. That's the only way. The audio is doing

Speaker 3 (47:06):
Fine.

Speaker 4 (47:06):
We're number one eighteen in Ireland, two forty two in
the United States for audio on Apple Music. But the
video you need to like it, you need to leave comments.
So I pose the question, why has Congress not prohibited
people under eighteen from using AI?

Speaker 3 (47:26):
Period? Like period?

Speaker 4 (47:30):
They could use AI assisted learning apps, apps that are
for learning that have an AI built into it, but
not just AI like character dot AI. Why we know?

Speaker 3 (47:45):
See this is the problem.

Speaker 4 (47:47):
We know that meat and dairy are causing forty thousand
people a day to die and killing our planet. We
know this, it's facts, it's in evidence, but we don't
legislate that people have to start being more plant based
and making sure that non plant based options are far
more expensive than plant based options.

Speaker 6 (48:08):
We don't save ourselves.

Speaker 4 (48:10):
We know that our kids are headed down a horrible
pathway with AI.

Speaker 3 (48:15):
We know this.

Speaker 4 (48:17):
We're not doing anything to stop it. Humans don't do
what is best for humanity, very seldom. This goes back to
Donald Trump. We know Donald Trump is killing America. No
one is stopping him. Nobody, no Democrat, no Republican, no American.

(48:39):
Everyone forgets that Democrats and Republicans are Americans. Donald Trump
forgets that. He thinks that the only Americans are MAGA.
Why else can he do a video dropping shit on
Harry Sisson, a young twenty three year old kid who
is a liberal and yesterday called the president unbalanced. Why
aren't we listening to Harry Sisson? It is unbalanced what

(48:59):
Trump did yesterday. Mike Johnson is mentally ill, Donald Trump
is mentally ill. JD Vance is mentally ill. Pete Hegseth
is mentally ill. And I'm not being flippant. They have
mental disorders. They are mentally ill. Most of MAGA is

(49:22):
either illiterate and of low IQ or mentally damaged. And
yet we are doing nothing. We let these mentally damaged
people run our lives. Elon Musk is not well and
a drug addict. He does ketamine. It killed my friend
Daniel a year and a half ago. It killed Matthew Perry.

(49:46):
Elon Musk does whatever he wants with impunity, while high
on ketamine. Nobody stops him.

Speaker 3 (49:54):
We refuse.

Speaker 4 (49:54):
This goes right back to the change topic at the
beginning of the show. We refuse to do things that
we need to do because they're hard.

Speaker 6 (50:03):
They're hard.

Speaker 4 (50:05):
It would be hard to put the cap back on the
genie bottle that is AI. It would be hard. It
could be done, but it would be hard. It'd be
hard to get Donald Trump out of power. It would
be hard. It'd be hard to launch a revolution here
in the United States. It'd be hard to leave the

(50:26):
United States to get out of the United States.

Speaker 6 (50:31):
We were meant to do hard.

Speaker 4 (50:33):
Things, as they say on Apple Fitness, but we don't
do them. And for a parent, it would be hard
to protect their child from AI, but it could be done.
My child would not have a phone or tablet until
they were eighteen, period, and if they had a computer,

(50:53):
they would they would have so many parental things on
it that they wouldn't be able to access AI at all.
And if my kid didn't like it, they could go
get a job and move out. I don't care how
old they are, my house, my rules.

Speaker 6 (51:07):
All right, we'll finish up when we come back.

Speaker 4 (51:08):
That was just that's just devastating, and no one's doing
anything about it. So if you want to go to
character dot AI and get you a boyfriend or a girlfriend,
go ahead.

Speaker 6 (51:18):
You can call them. Just, if they tell you to
off yourself, please don't. Please.

Speaker 9 (51:24):
If you stay here with us.

Speaker 4 (51:43):
You know, as always, South Park illustrated this
very thing so perfectly. Randy Marsh had an AI companion

(52:06):
and he was talking intimately to it and you know,
good night, thank you, blah blah blah. And his wife
was sitting right next to him in bed, and he
didn't speak to her once. He just spoke to the
AI companion. I mean, South Park, Matt and Trey God
bless them, I mean, really best satirist of my lifetime.
You know, Matt and Trey, Matt Parker, Trey Stone or

(52:28):
is it Matt Stone and Trey Parker, Well, which who
they're interchangeable. So we've got some hard decisions, you know,
ahead of us, and the question is are we able
to make them? You know, are we able to do it?
The whole theme of this show today has been that

(52:49):
change is hard. Changing my daily routine today just
to accommodate Ember's fasting has thrown my entire world into chaos.
And it's just a dog fasting for eight hours. And
yet you know, plus I'm partially scared about the outcome,
and even that, why am I afraid? Cushing's is not fatal.
In the short term, she'll live two to four years.

(53:10):
At least I'll have some time with her, right But anyway,
So change is hard, and we have a world that's
changing so fast. AI is changing everything so quickly. We
have kids killing themselves to be with their AI chatbots.
We have seventy plus percent of kids saying they have

(53:33):
an AI companion. We're becoming so disconnected from each other
and from our children that we are turning to synthetic
devices for a sense of connection. That's the saddest part
right there. That a kid would feel more comfortable turning
to their phone than to a human and that's because

(53:55):
their whole lives, they've had their phones in hand more
than they've had humans. Yesterday, we went out, Ember and I,
and I forget where we were. We walked past a
table with a family of four. Every one of them
was on a device. Everyone at a at a dining table.
They were all on a device, everybody, mom, dad, kids,

(54:18):
and it just it made my heart hurt. You know,
I'm like, Oh, that's that's terrible. That change happened quickly.
Change that allows us to be lazy, change that allows us,
you know something, If it gives us endorphins and makes

(54:39):
us feel good, we'll make a change like that, make
a change right away.

Speaker 6 (54:45):
But if it's changed that is hard, we won't do it.

Speaker 4 (54:51):
Changing into something that makes you feel good, that's easy
to do, you know. Change that's hard or makes you
uncomfortable, no one wants to do it. And we're
seeing that in every aspect of life. Humans turning to
machines for comfort, people admiring a despot, a criminal, a felon,

(55:16):
someone who not only should be cast aside, but never venerated.
And so where is it going to something new? We're
evolving humans. We're not evolving into something good. We're just
evolving into something new. You know, I saw the other day.

(55:40):
I don't like zoos. I don't like zoos. Every time
I go to a zoo, I think the animals are
in prison. But I have read that they are positive
for kids, that kids will grow up being more empathetic
to animals if they have seen animals like this one,
if they have seen animals in their lifetime. I read

(56:04):
yesterday that less than thirty percent of children in today's
age have been to a zoo. Less than thirty percent
they see digital zoos, you know, on their phones. And
maybe a holographic zoo wouldn't be so bad. A zoo
where there's holograms instead of live animals.

Speaker 3 (56:25):
I don't know. I just know that we know that
kids that go out.

Speaker 4 (56:29):
And go to zoos and see animals are more empathetic
growing up to animals, and we're not taking them. Less
than thirty percent of America has read a book in
the last year. Less than thirty percent has read a
book in the last year. When's the last time you

(56:51):
read a book? When's the last time you read a book?
We are changing, we are evolving. I fear that WALL-E
is more science prediction than science fiction. I think that's
pretty much where we're headed. We're going to be fat
blobs with headsets on that never really move, and we

(57:12):
have avatars or even synthetic bots out in the world.

Speaker 3 (57:18):
Imagine.

Speaker 4 (57:19):
Imagine if you didn't have to worry about being shot
because you sent a robot out that was controlled by
your headset and you're sitting at home. I saw a
science fiction movie about that.

Speaker 6 (57:31):
Imagine. So we're changing, We're evolving, not for the better.

Speaker 3 (57:39):
As a country.

Speaker 4 (57:40):
We've changed to a country that would actually allow a
felon to be president and be empowered. I know a
lot of you think all presidents are crooks or whatever.
This is a convicted felon who led an insurrection against
the United States, and we have let him take power.
We're changing, We are changing. Look at me. I am changing,

(58:07):
and not for the better. All right, I'm gonna leave
you with a little snippet of something that I'm so
proud of. It's how we're gonna go out today. Do
I have time? I think I do. Ladies and gentlemen,
I give to you. Oh wait, I can't. I'll do
it tomorrow because I don't think I have time, But
I will give to you tomorrow. The world premiere of

(58:30):
"We Dance Because," and "We Dance Because" is my new
dance song that comes out in November, and I will
give it to you tomorrow because it's done. Jeannie Tracy
has finished her her work on it, and the producers
have done their job, and it's done, and so tomorrow
we will in fact play that. So I hope you'll

(58:52):
join me tomorrow as we play "We Dance Because." I
guess I could do a little of it as we
go out today. Let me see, let me try. I'm
doing all this from eight feet away. My monitor is
eight feet away.

Speaker 3 (59:05):
All right, here we go.

Speaker 6 (59:06):
Let's try a little bit of it. I dance because
we do, I dance because we could, I dance to
show the world, I dance to.

Speaker 3 (59:14):
Show my feel you neil my next, don't hold you
dear my?

Speaker 6 (59:20):
Wasn't that fabulous? You want to hear more?
Do you want to hear more? Tune in tomorrow. I
am Karel. Be who you want to be.

Speaker 4 (59:27):
As long as it doesn't hurt anybody. We'll see you tomorrow,
where I will have eaten by then. Hopefully I'll have
the first results. I love you, guys. I love you guys.
Thank you for

Speaker 6 (59:39):
Being here with me, and we'll dance tomorrow.

Speaker 5 (59:44):
News from a completely different point of view: yours.
Listen daily to the Karel Cast on your favorite streaming service.

Speaker 1 (59:59):
Show time is here, No Time