Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
SPEAKER_05 (00:00):
Broadcasting across the nation from the East Coast to the West, keeping you up to date on technology while enjoying a little whiskey on the side, with leading edge topics along with special guests to navigate technology in a segmented, stylized radio program.
The information that will make you go, mmm.
Pull up a seat, raise a glass with our hosts as we spend the
(00:22):
next hour talking about technology for the common person.
Welcome to Tech Time Radio with Nathan Mumm.
SPEAKER_09 (00:31):
Welcome to Tech Time Radio with Nathan Mumm, the show that makes you go mmm.
Technology News of the Week.
SPEAKER_10 (00:42):
I'm not going home
right now.
A show for the everyday person talking about technology, broadcasting across the nation with insightful segments on subjects weeks ahead of the mainstream media.
We welcome our radio audience of 35 million listeners to an hour of insightful technology news.
As you can see, this is our Halloween scary episode.
So you're gonna love what we got in store for you today.
(01:04):
Let me tell you.
I'm Nathan Mumm, your host and technologist with over 30 years of technology expertise.
Our co-host Mike Gorday here is in studio.
He's the award-winning author and our human behavior expert and our steampunk man from the future.
I like the slash one better.
Okay, there you go.
We live stream during our show on six of the most popular platforms, including YouTube, twitch.tv, Facebook, LinkedIn,
(01:24):
and now Kick and Rumble.
We encourage you to visit us online at techtimeradio.com and become a Patreon supporter at patreon.com forward slash techtimeradio.
We're all friends from different backgrounds.
We bring the best technology show possible weekly for our family, friends, and fans to enjoy.
We're glad to have Odie, our producer, at the control panel today.
Welcome everyone.
Let's start today's show.
SPEAKER_05 (01:47):
Now on today's show.
SPEAKER_09 (01:52):
All right, welcome to Tech Time Radio.
Today on the show, we have our Halloween episode, where we have the spooky and scary technology items.
SPEAKER_02 (02:00):
Okay, are you a pirate?
Are you, uh, Trump?
Uh, you're like moving into that territory.
All right.
SPEAKER_09 (02:08):
Well, you know
that's amazing.
That's amazing.
That's the best ever.
SPEAKER_10 (02:11):
All right.
Well, you know what?
We have our Halloween episode, and of course, that means we have Nick Espinosa on the show.
You know what that means?
We're gonna be talking about all the scary things that you can learn in technology.
Today's show will make you drink.
We promise you that.
And of course, we also have our standard features, including Mike's mesmerizing moment, our technology fail of the week, a Nathan Nugget, and our pick of the day whiskey tasting to see
(02:32):
if our pick of whiskey gets zero, one or two thumbs up by the end of the show.
But now it's time for the latest headlines in the world of technology.
SPEAKER_05 (02:41):
Here are our top
technology stories of the week.
SPEAKER_10 (02:45):
All right, story number one: Amazon has some explaining to do, from job cuts to major outages on its cloud service.
Let's go to Lisa Walker for more on last week's outage.
SPEAKER_00 (02:56):
Amazon Web Services, AWS, has apologized to customers affected by last week's massive outage, which knocked some of the world's largest platforms offline.
Snapchat, Reddit, and Lloyd's Bank were among more than 1,000 sites and services reported to have gone down as a result of issues at the heart of the cloud computing giant.
(03:18):
Amazon said it occurred due to errors in its internal systems involving the IP addresses computers use to find them.
Back to you guys in the studio.
SPEAKER_02 (03:28):
I bet you there's a
whole subreddit about them being
about this.
SPEAKER_10 (03:32):
Yes, there is.
There's a whole subreddit out there that you can listen to all about this.
All right, well, the outage has had a far-reaching impact, even reportedly disrupting the sleep of some smart bed owners.
Many experts said the outage showed how reliant tech is on Amazon's dominance in the cloud computing sector, a market largely cornered by AWS and Microsoft Azure.
(03:54):
And the specific technical reason is that a faulty automation broke the internal address book system that the region relies upon.
The company said that it would do everything it can to learn from this event and improve its availability.
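The "internal address book" described here is essentially DNS-style name resolution: when it breaks, every service that depends on it fails even though the servers behind it are healthy. As a rough illustration only, here is a minimal Python sketch of how a dependent client might cache a last-known-good address so a resolver failure does not immediately cascade. This is not Amazon's actual tooling, and the hostname and fallback cache are hypothetical.

```python
# Minimal sketch (not Amazon's tooling): a broken "address book" (DNS) makes
# lookups fail even when servers are fine. Caching the last successful answer
# is one common mitigation. The hostname below is just an example.
import socket

_last_known_good = {}  # hostname -> IP cached from earlier successful lookups

def resolve_with_fallback(hostname: str, port: int = 443) -> str:
    """Resolve hostname, falling back to a cached IP if resolution fails."""
    try:
        infos = socket.getaddrinfo(hostname, port, proto=socket.IPPROTO_TCP)
        ip = infos[0][4][0]                    # first resolved address
        _last_known_good[hostname] = ip
        return ip
    except socket.gaierror:
        if hostname in _last_known_good:
            return _last_known_good[hostname]  # stale but usable
        raise                                  # no fallback: the outage cascades to us

if __name__ == "__main__":
    print(resolve_with_fallback("example.com"))
```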
So does improving this mean getting rid of employees?
Because guess what?
Amazon is planning to cut 600,000 human jobs for robots.
(04:16):
So you know what?
We have a problem with the script.
It has to be human error.
So you know what?
Let's just create more robots.
Amazon plans to cut 600,000 jobs, human jobs, of course, for robots.
Uh, in an insider report, by 2033, they expect to be 75% operational on automation.
Since 2018, the number of Amazon employees in the US has more than tripled to almost 1.2 million.
Nevertheless, managers have reportedly informed the board last year that the company will not need to hire any more U.S. employees in the future, thanks to advancements in robotic automation, even if sales double by the year 2033.
According to internal documents, around 160,000 jobs could be
(05:00):
lost by 2027, particularly in logistics and warehousing.
In the long term, Amazon plans to automate around 75% of all activities, which will save the company up to $12.6 billion, according to the projections.
Now this will reduce the cost of each product it sells by 30 cents.
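As a hedged back-of-the-envelope check on the two figures quoted above (both are projections, not verified data), $12.6 billion in annual savings at 30 cents per item implies roughly 42 billion items a year. A quick Python sketch of the arithmetic:

```python
# Back-of-the-envelope check of the quoted projections: $12.6 billion saved
# per year at 30 cents per item implies the item volume below.
projected_savings_usd = 12.6e9
savings_per_item_usd = 0.30

implied_items_per_year = projected_savings_usd / savings_per_item_usd
print(f"Implied items per year: {implied_items_per_year:,.0f}")  # 42,000,000,000
```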
Now, do you think that 30 cents is going to be saved for
(05:20):
everybody, Mike?
Are you going to save 30 cents on that shipping cost now since they're going to go all to robots?
SPEAKER_02 (05:26):
No, they'll find
some way of charging me with
extra money.
SPEAKER_10 (05:28):
That's correct.
Now, Amazon already employs one million robots to support its 1.6 million human workers.
In the future, however, machines will no longer just help keep the company running, they'll also handle many of the tasks beyond the warehouse.
They expect to have one employee for every 500 robots that will be running around in the city.
(05:49):
So think of that.
Their little warehouses are gonna have 500 of these bad boys.
Yeah.
Well, Amazon, we're gonna continue with Amazon.
There's so much news with them.
They're actually, uh, in trouble.
They're set to pay out refunds to qualifying Prime users after a $2.5 billion settlement with the Federal Trade Commission.
The retail giant agreed to the settlement to resolve an antitrust
(06:11):
lawsuit the FTC filed in 2023, where federal officials alleged Amazon coerced millions of consumers into enrolling in Prime subscriptions and made their agreements extremely difficult to cancel.
The amount of money each customer gets from this $2.5 billion, what is it gonna be?
A maximum of $51 per customer.
SPEAKER_02 (06:30):
That's what you get for $2.5 billion.
SPEAKER_10 (06:33):
The lawyers take 90% of it, and you get $51.
That's right.
Now there's two differentsettlement groups that will be
issued money.
The first group receives anautomatic payment, which will be
filed with Amazon for theirprime benefits three or less
times during a 12-month period.
Customers who qualify for thesettlement should receive their
payment by December 24th, 2025.
(06:54):
Those who do not qualify for theautomatic payment still have the
opportunity to file a claim.
A third-party claimadministration uh will be sent
out with information in January23rd, 2026.
However, eligible claimants willhave to wait until July 23rd,
2026 to officially submit theclaims.
So in January, that we're gonnaallow people that didn't get the
(07:16):
automatic payment on December24th, the day before Christmas.
There you go.
To then sign on up so they canget something else.
You know what?
Amazon is in a world of hurtright now.
SPEAKER_02 (07:27):
I don't think
they're in a world of hurt.
SPEAKER_10 (07:29):
You don't think so?
No.
They'll just move their headquarters.
SPEAKER_02 (07:31):
I mean, this was the big, this was the biggest fine that anybody's gotten so far.
But are they... no, they're not in a big world of hurt.
You don't think they're in a big world of hurt?
SPEAKER_10 (07:40):
No.
Fifty-one dollars.
Do you, do you pay for Amazon Prime?
Yeah.
Odie, do you pay for Amazon Prime?
Okay.
So that means that we all should get $51 back.
SPEAKER_02 (07:49):
And what are you
gonna do with that?
That's only if we qualify for it, of course.
SPEAKER_07 (07:53):
How much is that in
the grand scheme of things to
them?
Like, is that like 20 bucks?
SPEAKER_10 (07:58):
Uh, it's probably not, well, it's, it's, I'm sure it's fifty-one dollars until inflation keeps on going up, and so then it'll only be worth like two dollars.
But it's two point five billion dollars in the settlement.
SPEAKER_07 (08:08):
Yeah, but compared
to Bezos or to Amazon as a
company, is that chump change?
Yeah, that's chump change.
SPEAKER_02 (08:15):
And they're gonna
give it some.
SPEAKER_10 (08:16):
That's him housing
on credit.
SPEAKER_02 (08:18):
That's him housing
his super yacht for a month.
SPEAKER_10 (08:21):
That's, you know what, that's gonna be in credit, so they can just spend more money on their Amazon stuff.
SPEAKER_02 (08:25):
Yeah, yeah.
What's it really gonna do?
I mean, heck, they're getting rid of 75% of their force, their sales force.
Or no, not sales force, their people force.
SPEAKER_10 (08:36):
Yeah, 600,000
people.
You know, that's not a big deal.
Yeah, no, no, no.
1.6 million, you know, 600,000 robots.
SPEAKER_02 (08:42):
Let's not create
jobs, let's eliminate jobs.
That's what this is all about.
SPEAKER_10 (08:46):
All right, well, you
know what?
It's our Halloween episode.
So let's move on to story number two.
SPEAKER_02 (08:52):
Yeah, let's talk about the disturbing, some really disturbing stuff here.
Uh, if you're not disturbed by this, you might need to go see a, uh, therapist.
SPEAKER_10 (08:59):
Okay.
SPEAKER_02 (09:00):
Remember Suzanne Somers?
Oh, yes, the ThighMaster.
The ThighMaster, Three's Company.
Oh, yeah.
She died, she died a couple years ago.
Do you know that?
I did not know that.
I'm sorry to hear that.
Okay, well.
Uh, she is now living on as an AI robot.
What?
Alan Hamel creates an AI clone of his late wife, Suzanne
(09:23):
Somers, two years after his death, or her death, okay, and says he can't tell the difference between the AI model and his late wife.
Okay.
Okay, and when we first talked about this, when we first talked about this, we just thought this was just like some sort of chatbot thing, right?
We did.
We just thought that was a prompt.
And then we had to do a bunch of research on our... It is not.
(09:44):
This is an AI robot sex doll.
SPEAKER_10 (09:47):
It is, isn't it?
SPEAKER_02 (09:48):
They have, they have input an AI version of Suzanne Somers into a life-size robotic doll.
Why are you looking like that, Odie?
SPEAKER_07 (09:58):
She's not a sex
doll, though.
She's just made from a sex doll.
Like her mate, yeah.
SPEAKER_10 (10:03):
So she's not she's
not being used as a big thing.
SPEAKER_07 (10:04):
That's not her use.
SPEAKER_10 (10:05):
She's not being
used.
SPEAKER_02 (10:07):
Are you really sure
about that?
Well, I don't know.
Well, her ex-husband says he can't tell the difference.
You don't think, you don't think that's, that's part of his own...
SPEAKER_07 (10:19):
The doll doesn't look anything like Suzanne Somers.
SPEAKER_02 (10:25):
That's the, that's the thing here.
Two years after her death, okay, at 76, her husband and partner of 55 years started putting plans into action that they, they both had discussed for decades.
Okay.
Uh, and one of the projects that they were doing is coming up with this AI twin, uh, according to her ex-husband.
(10:45):
Okay.
The project is perfect.
It is an AI and a talking doll.
He says it was Suzanne.
I asked her a few questions, she answered them, blew me and everybody else away.
When you look at the AI next to the real Suzanne, you can't tell the difference.
It's amazing.
(11:07):
because he can't see very well.
Yeah, if this isn't disturbingto you, the I mean, okay.
This is very disturbing.
Is Alan nearsighted?
Well, he's 76, so probably yes.
Many ask if the doll or his latewife look anything in common
after seeing it on display, andmany see no similarities.
Okay.
While the AI is fairly new tomost, Hamill revealed that he
had been in an ongoingconversation with himself and
(11:29):
his wife since the eighties,when Ray Kurtzweil first
explained the concept to him.
And this is this is reallydisturbing if you can't tell the
difference between your realex-wife and a robot ex-wife.
Yeah.
I mean, this is disturbing on somany levels to me.
This is what I argue about whyAI is bad, all rolled into one,
(11:51):
because now this person is inlove with a he's interacting
with a doll.
He is anamorphizing it so thatit is his wife, he's treating it
as if it's a living thing, whichI'm sure includes other things
rather than just talking aboutit.
And he's not allowing himself togrieve the loss of his wife.
That's the biggest thing, isn'tit?
SPEAKER_10 (12:11):
Isn't like the the
whole condition of dealing with
death really important for forus as a society?
SPEAKER_02 (12:16):
Yes, it is.
And when, when we do stuff, we've talked about this before when, when we were talking about this.
I think they were doing this in Japan.
Yep, yeah.
Where they were creating these AI ghosts of their departed souls.
Or, or kids or whatever.
Yeah, we're not, we're, if we're not allowing ourselves to grieve because we are trying to replace that with some sort of object,
(12:37):
that is ultimately not good for our mental health.
I totally agree.
I don't know that you really agree with that.
Well, no, no, I wouldn't want to.
I think you would, I think, I think you would have a, you wouldn't have an AI doll hanging out.
SPEAKER_10 (12:49):
No, no, no, I would, I'd do an R2-D2 or some robot type of deal, or Rosie, but I wouldn't want an AI doll.
SPEAKER_07 (12:56):
Yeah, he would never
do a person as a as a doll.
No.
Or a person.
Give me a robot.
SPEAKER_02 (13:01):
Yeah, you would.
SPEAKER_07 (13:02):
Yeah, it would be
another POC.
SPEAKER_02 (13:04):
It would be in a
closet somewhere, but he would.
unknown (13:06):
Wow.
SPEAKER_10 (13:06):
Well, story number three: is Tesla's advanced technology detecting something beyond the physical realm, or is it just a glitch?
Social media is abuzz with claims that several Tesla owners reported that their car sensors and cameras pick up invisible human figures in cemeteries.
Now, this is like a hot thing that's going on right now.
If you have a friend that has a Tesla, all you need to do is go
(13:29):
down, especially at night, but you can still do it during the daytime.
And you go and you walk, or sorry, you, you drive, and you'll see people on your screen walking next to your car in the open fields.
This is so viral that people are making different items, and there's lots of skepticism online about this not being
(13:49):
taken care of.
Skepticism.
Skepticism, maybe skepticism.
I like skepticism.
That's skepticism.
Soon after, others attempted to verify the claims.
One user said he intentionally drove through a cemetery late at night and saw multiple human shapes appear on his Tesla screen.
There was no one around, and the display showed moving figures.
It was unsettling.
Another Tesla owner reported a similar experience while driving
(14:10):
in a dense forest.
He said the vehicle sensors displayed human-like figures surrounding the car despite no visible movement outside.
The phenomenon has led to many theories.
Some jokingly suggest Tesla's autopilot can detect spirits or negative energies.
While most believe it's merely a technical glitch, perhaps introduced by the car's object detection algorithms,
(14:33):
there are still some that believe.
You did this, didn't you?
Was it... you did this?
You don't own a Tesla, but you know what I mean.
So I rode in a Tesla and I went into a cemetery, and guess what happens?
This actually happens.
Now, what I think it's picking up is probably flowers, and it doesn't know what the flower objects are on the side of the road, per se.
(14:53):
Or maybe headstones.
And, and, and tombstones and headstones and different items that are available in the cemetery that are just unique.
And I think it just displays the default from the algorithm as probably a person.
Because if something gets detected and it doesn't know what it is, it probably displays a person.
But I don't know.
When you go look at it, though, it moves.
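The theory voiced here, that the display falls back to a generic pedestrian icon when the detector is not sure what an object is, can be illustrated with a small sketch. This is purely hypothetical and is not Tesla's actual pipeline; the class scores, threshold, and icon names are invented for illustration.

```python
# Hypothetical illustration of the "defaults to a person" theory above.
# Not Tesla's code: just a classifier whose low-confidence detections are
# rendered with a fallback pedestrian icon, which would make headstones or
# flower arrangements show up as walking figures on the display.
DISPLAY_ICONS = {"car": "car", "cyclist": "cyclist", "pedestrian": "pedestrian"}
CONFIDENCE_THRESHOLD = 0.6
FALLBACK_ICON = "pedestrian"

def icon_for_detection(class_scores: dict[str, float]) -> str:
    """Pick a display icon; uncertain detections fall back to a pedestrian."""
    best_class = max(class_scores, key=class_scores.get)
    if class_scores[best_class] >= CONFIDENCE_THRESHOLD:
        return DISPLAY_ICONS.get(best_class, FALLBACK_ICON)
    return FALLBACK_ICON  # unknown blob near the road -> rendered as a person

# A headstone-shaped blob: no class is confident, so it renders as a pedestrian.
print(icon_for_detection({"car": 0.2, "cyclist": 0.1, "pedestrian": 0.35}))
```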
SPEAKER_07 (15:12):
I think in the video
you see it like spinning because
it does.
SPEAKER_10 (15:15):
But the person with the, yes, moving back and forth.
Like they're walking around.
Yeah.
That would be cool.
Do you believe that they're the, the, the... that's, I think it's cool.
SPEAKER_07 (15:23):
That's called.
Now will I go out there and drive through a cemetery with a Tesla?
No.
SPEAKER_10 (15:28):
You should do that, you should create an Uber driver event, where you can create an event now where you become the cemetery driver.
Stop.
Stop.
SPEAKER_02 (15:36):
Okay.
SPEAKER_07 (15:38):
This is a million
dollar idea.
SPEAKER_02 (15:41):
The irony of this is that Tesla, who is all about, you know, technology and stuff, discovers that ghosts are real.
SPEAKER_03 (15:50):
Yeah.
SPEAKER_02 (15:51):
That could be the... while everybody out there doing these ghost shows and they're doing all this phony stuff is like, well... and Ghostbusters, maybe they have the same algorithm the Ghostbusters do to pick up... me.
SPEAKER_10 (16:02):
Yeah, he just crossed the streams.
We're moving.
Oh wow, wow, wow.
That's there you go.
That's perfect.
Alright, well.
Do you think it's a ghost story or do you think it's technology?
SPEAKER_02 (16:12):
It's just artifact.
It's the same thing.
It's the same thing when you look at an EKG and you see all this... well, you won't know what it, like...
You wouldn't know what it looks like.
Keep the whimsy, Mike.
Yeah.
EKG, isn't that to test your, uh, your brain power?
I... okay.
So from a, from a, from a nice little Halloween story perspective, this is pretty funny.
(16:33):
But if you are out there believing that ghosts are running around your Tesla vehicle, uh, I think again, you might need to go, to go seek some therapy.
SPEAKER_10 (16:42):
Maybe you and Alan can go and, uh, have a fake AI doll.
Is that what you're saying you should be doing?
Yeah, sure, whatever.
I don't know how you connect to that, but okay.
All right, well, well, that ends our top technology stories of the week.
When we return, Nick Espinosa from Security Fanatics will join the show in our annual scary Halloween technology episode with some of the scariest technology stories of the year.
(17:05):
You'll find out what is next.
You're listening to Tech Time with Nathan Mumm.
See you after the commercial break.
SPEAKER_01 (17:10):
Looking for custom glass solutions for your next commercial project?
Hartung Glass Industries is your trusted partner in custom glass fabrication.
For over 100 years, Hartung has delivered proven manufacturing expertise, comprehensive product offerings, and dependable service and quality.
From energy-efficient facades to custom shower doors, we create
(17:34):
glass solutions tailored to your project needs.
With eight facilities across the U.S. and Canada, we combine national expertise with a local touch, ensuring faster service and unparalleled customer care.
Hartung Glass Industries, where quality meets innovation.
Visit hartungglass.com to learn more.
SPEAKER_10 (17:59):
Welcome back to Tech Time with Nathan Mumm.
Our weekly show covers the top technology subjects without any political agenda.
We verify the facts and do it with a sense of humor in less than 60 minutes, and of course, with a little whiskey on the side.
SPEAKER_11 (18:13):
I thought you were a
pirate and not a robot.
SPEAKER_03 (18:15):
Oh, look at you.
Southern Comfort.
SPEAKER_02 (18:19):
You don't really
know what to say here.
Greet me up.
Please.
Please don't.
Tall drink of water.
SPEAKER_10 (18:26):
Alrighty.
Today, Mark Gregoire, whiskey connoisseur, is back in studio.
Or we call him Southern Comfort for today.
Mark, what have you chosen for us to drink?
Well, something to comfort us today.
SPEAKER_11 (18:41):
It should be Southern Comfort, shouldn't it?
I don't drink Southern Comfort anymore.
Hey, that was my gateway whiskey.
Today we are drinking...
SPEAKER_03 (18:49):
I just had a
curious.
SPEAKER_11 (18:52):
Russell's Reserve Private Barrel Selection.
It's the Ballard Cut number five.
Now, from Russell's Reserve, they say this is hand selected by the Ballard Cut.
Okay.
This private barrel selection bourbon was distilled in October of 2016 before aging at the Tyrone campus in Rickhouse B.
Cherry, vanilla, clove, caramel, cola with a medium-long finish,
(19:16):
including nutty toffee, baking spice, pepper, and sweet oak.
SPEAKER_09 (19:21):
Oh, I taste the oak.
SPEAKER_11 (19:22):
This is from the Campari Group.
It was distilled by Wild Turkey in Lawrenceburg, Kentucky.
It's straight bourbon.
It's aged eight years, 110 proof, 75% corn, 13% rye, 12% malted barley, and it goes for about $60.
Okay.
Okay.
SPEAKER_10 (19:38):
All right.
Is this a special selection from your guy that you have?
SPEAKER_11 (19:43):
Yeah, this is... the Ballard Cut is, uh, a whiskey bar and restaurant in Ballard, which is in, uh, North Seattle.
Okay.
And this is a barrel that they selected and distributed out to the whiskey group.
All right.
We are already off the rails and it's only, what, 20-something minutes in.
SPEAKER_02 (19:59):
Look how you showed up, man.
And Odie, Odie can't get it back together.
She is already cracking.
She's gone.
SPEAKER_10 (20:05):
All right.
Well, do you, oh, you're gonna need to end your, uh, saying?
SPEAKER_11 (20:08):
Oh, yes.
Okay.
Don't forget to like and subscribe.
Add a comment and drink responsibly, unlike what we're doing today.
Yeah.
Because heaven can wait.
Spark will pop in the screen.
SPEAKER_10 (20:19):
Now, Mark, we're gonna try to engage more users.
We have a secret sound show that we're gonna be doing, or game that we're gonna be doing.
So we got a secret sound, so everybody's gonna have to listen in here because I want you guys all to guess.
No one's gonna get it.
Each week we extend one more second onto the sound, and we will allow people to submit their guesses online at techtimeradio.com
(20:40):
underneath our talk back area.
They put their username and tell us what the answer is, and we'll take the top 10 selections.
And if anybody wins, then that's great.
And if not, then we'll continue on next week.
Okay.
All right, there you go.
Exciting game.
All right, with our first whiskey tasting completed, let's move on to our feature segment.
Today, our technology expert, Nick Espinosa, is joining the show.
Nick is an expert in cybersecurity and network
(21:02):
infrastructure.
He's consulted with clients ranging from small business to the Fortune 100 level.
In 1998, at the age of 19, Nick founded Windy City Networks, which was later acquired in 2015.
And he then created Security Fanatics, where he is the chief security fanatic.
We welcome Nick to the show.
SPEAKER_05 (21:18):
Welcome to the segment we call Ask the Experts with our Tech Time Radio expert, Nick Espinosa.
SPEAKER_10 (21:27):
All right, Nick, welcome back to the show.
Hi, Nick.
All right, look, Nick is coming to us from a new bunker.
So he, he, he's in the process of, uh, leaving maybe the United States to move to a different country.
Is that right?
SPEAKER_06 (21:40):
I, I am, I'm going to parts unknown.
Although I will say, Mark, I do believe you were in my nightmares last night chasing me on the street.
SPEAKER_02 (21:48):
So that's a great
costume, by the way.
Did I catch you?
Hopefully you stop.
Oh wow.
Okay.
All righty.
SPEAKER_10 (21:57):
Well, welcome to the show.
Let's just bring in the show.
Tell us a little bit about yourself for any of our new listeners.
SPEAKER_06 (22:02):
Sure, sure.
I'm uh as you mentioned, I'm thechief security fanatic of
security fanatics.
Uh we do all things security,cyber warfare, cyber terrorism,
etc.
And uh it's always happy to hangout with you guys.
And uh today I am not rockingKentucky whiskey, I am rocking
Scotch.
I got my O-Bon here.
So uh, so uh I'm joining in thefun for Halloween.
SPEAKER_10 (22:22):
All right,
fantastic.
This is our scary episode onTech Time Radio.
We do this every year.
Uh Nick has been a part of thisevery year.
So he comes on in and we haveNick talk about all of his scary
information uh that we have.
We have so much to do today.
It's gonna be so great.
You know what?
And then Mike drinks a lot.
And he and Mike drinks, and atthe end of the show, then he
says, What the heck am I doing?
(22:43):
But you know what?
Let's start off a little bitslow.
We don't want to get too engagedhere.
So we're gonna just talk about,you know, the uh economy numbers
just came on out last week.
So, you know, the economy isthere's a little bit of
inflation that's happening.
But I'm kind of curious on howthe dark web economy is doing.
You know, there's differentthings that are available on the
dark web for sale.
Nick, can you tell me a littlebit about how is the dark web
(23:04):
economy doing currently rightnow?
SPEAKER_06 (23:06):
Oh, so the, uh, that, that's actually kind of fun.
So let's go through some of the numbers here, because, uh, quite frankly, it's gonna be pretty cheap to bump Nathan off and, uh, take his place when I, uh, make that move in a month or two, not that I'm planning that.
But let's, let's talk about this, because if I want to hire a contract killer without any kind of upsell, just get rid of Nathan, it's gonna cost me $15,000 now on the dark web,
(23:29):
which isn't that bad.
And of course, I'm gonna need to dispose of Nathan.
So think Breaking Bad season one, you know, we'll, we'll put you in the bathtub and chemical you up.
That's only five thousand dollars more.
So for 20 grand, I can get rid of you, lose the body, and I'll be the new host.
That's like a blue light special.
SPEAKER_10 (23:46):
Just think of that.
SPEAKER_02 (23:47):
You can take out all the people that we're gonna, we're gonna, we're gonna eliminate you and put a, put a, an AI, AI sex bot in your place.
SPEAKER_06 (23:55):
There you go.
There you go.
There you go.
That that that is that is theother nightmare I had the other
week for the record, was theNathan AI sex bot.
SPEAKER_03 (24:03):
Okay.
SPEAKER_06 (24:04):
So with that, of course, I mean, if we're gonna be in the dark webs and, you know, we're gonna be mourning Nathan, we're gonna have to have some hard drugs.
I mean, who wants weak drugs, right?
And so these are actually kind of down in price right now.
Uh, and basically import tax issues are causing the cost to be lower.
Not to mention the fact, if you're fishing in, you know, the southern Caribbean right now, you might get blown
(24:25):
out of the water, whether you're a speedboat full of drugs or just hanging out.
So, kids, if you're gonna go get it in narcotics, you know, if you're gonna get in narcotics, kids, it's never been cheaper right now, but obviously don't do drugs.
So let's get specific.
Let's get specific here.
You want some good Colombian bam bam, it's basically five to 40 bucks per rock right now.
(24:45):
All right, crack, you know, so not bad.
50 to 150 per gram.
So that's actually pretty good, uh, you know, and, and whatnot.
On top of it, uh, heroin is about $30 to 200 per gram, depending on purity.
Angel Dust or PCP, you know, if you're a biker, Hells Angels, I'm looking at you.
Basically five to thirty bucks a tablet, you know.
(25:07):
So it's not bad.
And to quote the, to quote the dark web directly, these are at some of the all-time lows for some short-term highs.
SPEAKER_02 (25:15):
So that is that is a
tagline right there.
Yes, that's better.
That's better than yours.
SPEAKER_10 (25:23):
Yeah, you like
Nick's tagline on that?
SPEAKER_02 (25:24):
I like Nick's
tagline, but all right.
SPEAKER_10 (25:26):
Well, you know what,
Nick, I'm glad we're starting
out a little slow here.
SPEAKER_02 (25:29):
You know, it's good
to know that the dark web is
booming right now.
SPEAKER_06 (25:32):
That's right.
Well, yeah, it's, it's crazy down there, but hey, you know, it's, uh, you know, if anybody needs links, you know where to find me.
Don't do those, kids.
SPEAKER_10 (25:42):
Now, you know what?
We've been a little bit, on this, uh, episode, a little bit more PG-13.
So speaking about PG-13, let's move on to our next topic.
You know, ChatGPT now is becoming a new sexting tool.
Can you explain a little bit more about this?
SPEAKER_02 (25:55):
Suzanne Somers.
SPEAKER_06 (25:57):
Yeah, yeah.
So, uh, I mean, the real thing is, are you ready now, Nathan, to get your freak on with ChatGPT?
Not just Suzanne Somers, but we can have ChatGPT probably be Suzanne Somers at this point as well.
So I'm personally not, you know, I think I'm not really interested in sexting with ChatGPT, but here we go.
Quite frankly, my AI wife and six kids I have on Anthropic
(26:18):
would be pissed.
But here's what we're actually talking about here.
Because in a, in a post on X, aka formerly Twitter, this past Tuesday, OpenAI CEO Sam Altman said that basically they're gonna add support for mature conversations when they start adding their age gating or age verification in December.
And Altman wrote, and I quote, as we roll out age gating more
(26:40):
fully and as part of our treat adults like adults principle, we will allow even more, uh, like erotica for verified adults.
So I think honestly, and, and, you know, Mike, I think you're gonna be one that, that could probably speak to this better, but I think there's some good and some bad here.
I mean, so if I'm thinking about the good, I mean, maybe it's
(27:00):
safe exploration of naughty topics without harming anybody, maybe it's mitigating loneliness, but I think there are probably more downsides to this, not to mention the fact we've already seen emotional dependency on AI, just regular AI.
Like you could literally go get an AI girlfriend.
I think there might be issues with desensitization and isolation and all of that, maybe some psychological distortion.
(27:21):
But Mike, what do you think on that?
Because you, you know better than anybody here.
Not that you're doing it.
SPEAKER_02 (27:26):
The upsides are definitely, uh, completely blown away by the downsides.
The downsides are very big.
Okay, you know, um, because you've mentioned it, it increases isolation, it, uh, actually increases depression, rates of depression, it keeps us from engaging in healthy,
(27:46):
fruitful relationships with other human beings.
There are a lot of, lot of problems with this type of technology being used for emotional support.
SPEAKER_10 (27:55):
Nick, you know what?
Let's, let's continue on, because we got, we're, we're building up.
SPEAKER_02 (27:59):
Let's talk about
more scary stuff, Nick.
SPEAKER_10 (28:00):
All right, so satellites are linking, are leaking tons of data, including military data, military locations, information across our world.
Explain to me and our listeners what is going on here.
SPEAKER_06 (28:15):
Yeah, yeah.
So this, this one is seriously amazing to me.
About half of geostationary satellite signals, many of these are carrying, like you mentioned, they're carrying sensitive information for consumers, for corporations, even government communications, and basically been left entirely vulnerable to eavesdropping.
And so a team of researchers at UC San Diego and the University of Maryland basically revealed this in an October 13th study,
(28:38):
and they found a ton of unencrypted data just floating around in space.
Probably your data too.
Uh, so here's some of the stuff they found.
If you're on T-Mobile's cellular network, they found calls and text messages that they could eavesdrop on.
Uh, data from airline passengers in flight.
So you're on that American, United, Delta, whatever flight, and you're using the in-flight wireless.
(29:00):
I'm not just worried now about the passengers.
That's going back and forth, sometimes unencrypted.
On top of it, we've got communications to and from critical infrastructure like electric, electric utilities, offshore oil and gas platforms, and they even picked up basically satellite transmissions that they could decode from both the US and Mexican militaries that
(29:20):
basically were talking about locations of personnel, equipment, facilities.
So it ain't good.
And, and to be fair, they did put this out and some, like T-Mobile, have started, you know, hardening their infrastructure, but, you know, another week goes by where T-Mobile doesn't have some kind of breach.
Um, so I think this is absolutely nuts, but this is just a snapshot of a small part of Southern California sky, and
(29:43):
these satellites are literally all over the globe.
So you can imagine the amount of data that's out there.
That's absolutely insane.
SPEAKER_10 (29:49):
Okay, so still speaking on surveillance, let's talk about this.
Let's now also add in Amazon.
Yeah, they now have integrated their Ring doorbell system.
I'm sure everybody's familiar with that.
You see commercials for this.
Into the US surveillance state software.
Explain how the satellite stuff, and now Ring with Amazon, for footage and different aspects for the government to control.
SPEAKER_06 (30:11):
But yeah, let's talk about this one.
And I want to start with my default mantra here, which is: cybersecurity is agnostic to politics, but we're not immune from it.
And Amazon just keeps ringing that bell every day.
So yeah, I mean, think basically through Amazon Ring allowing Flock, this AI camera system that law enforcement has access
(30:32):
to, uh, they're going to allow that to basically combine.
Flock is a maker of an AI-powered surveillance camera system.
They share footage with law enforcement.
And agencies that use Flock can soon request that Ring doorbell users share their footage to help with evidence collection, investigative work, et cetera, et cetera.
And if you didn't know, Flock's government and police customers
(30:52):
can make natural language searches of their video footage to find people who match specific descriptions.
And the kicker of the whole thing is, basically on the same day that Ring announced this partnership with Flock, 404 Media reported that ICE, the immigration customs enforcement, and the Secret Service, as well as the Navy, had access to
(31:13):
basically Flock's network of cameras.
And so by partnering with Ring, Flock could potentially be giving access to millions of more footage and millions of more cameras to basically ICE, the Secret Service, and the Navy as well.
And Ring has a terrible, terrible track record of basically of anti-privacy, of surveillance.
(31:33):
They were caught basically allowing thousands of, uh, you know, employees across the world to access American footage.
One of their, I think it was a vice president, was caught just going into his, like, ex's, uh, you know, Ring doorbell to see who was coming and going from her house.
I mean, they intentionally were unencrypting it so it'd be easier to data mine and sell.
It's, it's a whole mess.
And I am not a fan at all of Ring doorbell.
(31:56):
And so if you've got a Ring doorbell, you can potentially become part of this much larger surveillance state that obviously ICE and all of the others are using right now.
So again, we're agnostic to politics here, but we're definitely not immune from it.
I think it's absolutely crazy.
SPEAKER_02 (32:11):
How Orwellian is
that?
SPEAKER_10 (32:13):
Yep, big brother.
So we have satellites with unencrypted data while I'm flying on my plane.
My Ring, my Ring doorbell that I have so that everybody comes on in here now tracks everything that's going on in, in my personal life, to and from the doors.
Um, so you, so you know what?
I, you know what?
I, I think, uh, we may have to be like Mike today.
(32:35):
You know, I want to be like Mike and just go and turn it all off, Mike, and so that, with no worry.
SPEAKER_02 (32:39):
Yeah, let's just grab a bottle and go and hang out with our ChatGPT stuff.
SPEAKER_10 (32:43):
Okay, okay.
All right.
Last but not least, they already came and got Mark, so that's right.
Mark's gone.
The Southern Comfort is being squeezed out of him right now.
Now, let's talk about some Bitcoin or some blockchain concerns for people that are using this.
So you may have a Bitcoin, you may have Litecoin, you may use different blockchain technologies to transfer your
(33:04):
information for security purposes and encryption and back and forth.
SPEAKER_06 (33:09):
Honestly, you gotta love innovation, right?
Even if it's malicious.
But hacking groups have found basically a new and incredibly inexpensive way to distribute malware, and they're basically using blockchains, public cryptocurrency blockchains.
And so basically, in a recent post, members at Google's, uh, threat intelligence group said that basically they have a technique that they've discovered where these hackers
(33:31):
are using or creating basically their own what are known as bulletproof hosts.
Now, a bulletproof host is basically just a cloud platform that, uh, is essentially immune from takedowns by law enforcement, et cetera, et cetera.
And so here's the nuts and bolts of what's happening.
And so, heads up, crypto traders, um, this method is known as EtherHiding.
Essentially, what it does is it embeds malware into smart
(33:53):
contracts, which are essentially apps that reside on blockchains for Ethereum and other cryptocurrencies.
And so two or more parties then enter into this agreement spelled out in the contract.
And when certain conditions are met, the apps basically enforce the terms in a way that, at least theoretically, is immutable and independent of any central authority.
And so there's a wide array of advantages to basically EtherHiding
(34:17):
over more traditional means of delivering malware, because the decentralization of a blockchain prevents takedowns of these malicious contracts, because there are mechanisms built into the blockchains to prevent the removal of that kind of stuff.
Transactions on Ethereum are effectively anonymous, so it's really good at hiding your identity if you're a criminal jerk.
Uh, retrieval of malware from contracts leaves no trace in any
(34:39):
kind of, like, event log.
So forensically, it's hard to find.
So it's great for stealth.
And you can update malicious payloads at any time.
And again, you've got all of this anonymity, and it's dirt cheap too.
It costs basically less than $2 US per transaction on the blockchain here, which is a huge savings in terms of basically trying to spin up, you know, servers and infrastructure, or
(35:02):
take one over, like a bulletproof host.
And so this is, I think, going to be the future of delivery of a lot of malware, and the blockchain really lends itself to that anonymity and security that essentially ensures the malware can never go away.
So it's absolutely nuts.
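For readers curious what a read-only retrieval from a smart contract looks like, here is a minimal, analysis-style sketch using Ethereum's standard eth_call JSON-RPC method. The RPC endpoint, contract address, and function selector are placeholders, and this only illustrates why such reads create no on-chain transaction; it is not a reproduction of any actual malware described in the segment.

```python
# Sketch only: a read-only eth_call against a placeholder smart contract.
# eth_call executes locally on the node, creates no transaction, pays no gas,
# and writes nothing to the chain, which is why retrieval leaves essentially
# no on-chain trace. Endpoint, address, and selector below are made up.
import json
import urllib.request

RPC_URL = "https://ethereum-rpc.example"   # placeholder JSON-RPC endpoint
CONTRACT = "0x" + "00" * 20                # placeholder contract address
SELECTOR = "0x2e64cec1"                    # placeholder function selector

def read_contract_data() -> bytes:
    payload = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "eth_call",
        "params": [{"to": CONTRACT, "data": SELECTOR}, "latest"],
    }
    req = urllib.request.Request(
        RPC_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        result = json.load(resp)["result"]  # hex string, e.g. "0x..."
    return bytes.fromhex(result[2:])

if __name__ == "__main__":
    print(len(read_contract_data()), "bytes returned")
```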
SPEAKER_10 (35:16):
You know, Nick, we want to thank you so much for being a part of the Halloween special.
Tell listeners how can they connect with you before we close out the show?
Where, what's the best way to reach you?
SPEAKER_06 (35:27):
Yeah, yeah.
You can find me on LinkedIn at slash Nick Espinosa, or you can follow me on, you know, Twitter, Bluesky, which is still a thing, and, and all of the others at, uh, at Nick A E S P.
Or you can see my ideal YouTube videos on slash Nick Espinosa.
Thanks for, and thanks for hanging out.
I always appreciate it.
SPEAKER_02 (35:43):
Hey Nick, see you later.
You're always the best guy we never want to hear from.
This is why we drink, Mike.
SPEAKER_06 (35:51):
This is why we
drink.
SPEAKER_10 (35:53):
All right.
Well, that ends our segment, Ask the Expert, with Nick.
Up now we have Mike's Mesmerizing Moment.
SPEAKER_08 (36:00):
Welcome to Mike's Mesmerizing Moment.
What does Mike have to say today?
SPEAKER_10 (36:08):
Mike, let me tell you, which of these stories scares you the most?
So we just had Nick on.
SPEAKER_02 (36:14):
But what is gonna be the most, uh... I don't know if any of them scare me anymore.
They just, they just exist.
But, uh, what, what is the, the concerning ones that concern me?
SPEAKER_10 (36:23):
Yeah.
SPEAKER_02 (36:23):
Oh, every one of
them.
Every one of them?
Yeah.
Okay.
We can't, we are getting to where we can't, we can't do anything without being monitored by something.
SPEAKER_10 (36:32):
Okay.
SPEAKER_02 (36:33):
Right?
We can't, we can't walk out of our house, we can't walk next to, uh, a parking lot, we can't, uh... You're on camera.
We're on camera all the time.
Now we're, now, now we have interactions with, uh, fake AI that can, you know, mimic human stuff, and which I've said before, if we're going to apply human stuff to, uh, AI, it is, uh,
(36:56):
technically psychopathic because it has no ability to be empathetic, right?
So we're, we are harming ourselves.
Yeah, so just about everything, huh?
Uh, just about everything, yeah.
Okay, I, I, I, you know how much I, I wonder why I do this show, because, uh, all these AI... just to be clear, it's never the technology that concerns me the most, it's how people use the
(37:18):
technology.
SPEAKER_10 (37:19):
That makes sense.
SPEAKER_02 (37:20):
So when I go off on these, these huge tangents about how crazy this is, this isn't because the technology exists, it's because the people who created it, the, created it for these, these specific purposes, and we don't understand what we're doing.
That makes sense.
And when we, we don't understand what we're doing, we tend to abuse it.
That's true.
SPEAKER_10 (37:40):
So they can make an AI doll.
There you go.
Well, thank you for that mesmerizing moment.
I don't, uh... all right.
Well, up next we have This Week in Technology, so now would be a great time to enjoy a little whiskey on the side, as we're gonna be doing during the break.
You're listening to Tech Time Radio with Nathan Mumm.
See you in a few minutes.
Hey Mike.
Yeah, what's up?
Hey, so you know what?
We need people to start liking our, uh, social media pages.
SPEAKER_02 (38:01):
If you like our show, if you really like us, we could use your support on patreon.com.
Or is it Patreon?
I think it's Patreon.
Okay, Patreon.
If you really like us, you can... say I'm the English guy?
Patreon.com.
I, I butcher the English language?
You know you butcher the English language.
So it's all the
time.
It's patreon.com.
SPEAKER_02 (38:20):
Patreon.com.
If you really like, if you really like our show, you can subscribe to patreon.com and help us out.
Oh, and you can visit us on that Facebook platform.
SPEAKER_10 (38:29):
You know the one
that Zuckerberg owns?
SPEAKER_02 (38:30):
The one that we
always bag on?
SPEAKER_10 (38:32):
Yeah, you can, we're on Facebook too.
Yeah, like us on Facebook.
Do you know what our Facebook page is?
Tech Time Radio.
At Tech Time Radio.
You know what?
There's a trend here.
SPEAKER_02 (38:41):
It seems to be that
there's a trend, and that's Tech
Time Radio.
SPEAKER_10 (38:44):
Or you can even
Instagram with us.
And that's at Tech Time Radio.
That's at Tech Time Radio.
Or you can find us on TikTok.
And it's Tech Time Radio.
It's at Tech Time Radio.
SPEAKER_02 (38:53):
Like and subscribe
to our social media.
SPEAKER_10 (38:55):
Like us today.
We need you to like us.
SPEAKER_02 (38:57):
Like us and
subscribe.
That's it.
SPEAKER_10 (39:00):
That's it.
That's that simple.
SPEAKER_05 (39:03):
And now, let's look
back at this week in technology.
SPEAKER_09 (39:08):
Arrr.
It was a dark night in October, the 27th of the year 1980.
Something eerie began to creep through the ARPANET, the ancestor of today's internet.
SPEAKER_10 (39:19):
This wasn't just any network at that time.
The ARPANET was a government-built system that let computers talk to each other by breaking messages into tiny packets and sending them across the country like digital puzzle pieces.
It was designed to survive disasters.
But this night, it met one.
Suddenly, the network's vital nodes, called IMPs, started
(39:40):
dropping like flies.
Phones rang off the hook at the network control center.
Engineers reported ghostly error messages and broken connections.
A rogue software process, awakened by a freak hardware failure, began to devour the system resources like a zombie with an endless appetite.
It flooded the network with a malformed routing update,
(40:01):
a message meant to guide traffic, but instead it replicated endlessly, and even freshly rebooted machines were reinfected the moment they rejoined the haunted net.
The IMPs could barely keep their lines up long enough to say hello to each other.
Essentially, the, uh, fix was fairly simple, but it was really scary at the time, and this digital nightmare
(40:22):
revealed a chilling truth.
Even the most resilient systems can unravel in an unexpected strike.
And on that haunted October day, the ARPANET showed us that the internet can go down and can cause problems.
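The documented cause of that 1980 collapse (described in RFC 789) was a corrupted status message: bit errors reportedly produced copies with sequence numbers 8, 40, and 44, and because "newer" was judged in a circular six-bit number space, each copy looked newer than another, so every IMP kept accepting and rebroadcasting them. A tiny Python sketch of that cyclic comparison, offered as an illustration of the mechanism rather than the original IMP code:

```python
# Sketch of the cyclic "which status update is newer?" comparison behind the
# 1980 ARPANET collapse (see RFC 789). Sequence numbers were 6-bit, so "newer"
# was judged in a circular space; corrupted copies 8, 40, and 44 each look
# newer than another, so nodes never stop accepting and rebroadcasting them.
SEQ_BITS = 6
MODULUS = 1 << SEQ_BITS  # 64

def is_newer(a: int, b: int) -> bool:
    """True if sequence number a is considered more recent than b."""
    diff = (a - b) % MODULUS
    return 0 < diff <= MODULUS // 2

corrupted = [8, 40, 44]
print(is_newer(44, 40), is_newer(40, 8), is_newer(8, 44))  # True True True -> endless loop
```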
All right, we're gonna head out now.
With this, this is This Week in Technology.
If you want to watch some Tech Time history,
(40:42):
we have 270 plus weekly broadcasts spanning five plus years of podcasts and information.
You can always visit us at techtimeradio.com to watch our older shows.
We're gonna take a commercial break.
When we return, we have Mark's Whiskey Mumble.
See you after the break.
SPEAKER_02 (40:57):
How to See a Man About a Dog.
It combines comics, short stories, powerful poems, and pulp fiction prose to create a heartbreaking and hilarious journey readers will not soon forget.
Read How to See a Man About a Dog: Collected Writings for free with Kindle Unlimited.
Ebook available on Kindle, print copies available on Amazon, The Book Depository, and more.
SPEAKER_05 (41:22):
The segment we've
been waiting all week for.
Mark's Whiskey Mumble.
SPEAKER_11 (41:32):
All right, Southern Comfort.
I'm back.
All right.
How you doing?
I'm doing great today.
Just a little before Halloween.
SPEAKER_02 (41:40):
He's missing a few
fingers.
He's missing a few fingers.
Was that what happened?
There you go.
SPEAKER_10 (41:47):
There you go.
All right.
SPEAKER_11 (41:49):
You can just stroke it.
Yeah, stop playing with that, Mark.
SPEAKER_10 (41:53):
What are we celebrating today?
What are we celebrating?
What are we celebrating today?
Not ChatGPT.
Uh, all of the, uh, privacy invaded issues.
No?
Some hints.
Oh, oh yeah, Snickers satisfies you.
Some chocolate?
Are we?
Oh, is today International Chocolate Day?
SPEAKER_11 (42:13):
It is International
Chocolate Day.
SPEAKER_10 (42:16):
Is this a couple
days before Halloween?
That's a coincidence, isn't it?
SPEAKER_11 (42:19):
And I can pass these down if you want to open these and break some off, because I'm going to tell you why.
Oh, is it a chance?
Well, let me tell you about today first.
Okay.
Today is nothing short of a special tribute to mankind's greatest culinary invention.
Sorry, pizza.
Chocolate reigns supreme.
It can elevate the most luxurious dessert or satisfy instantly with a simple candy bar.
(42:42):
For a truly heavenly experience, reach for chocolate with a high cacao percentage and low added sugar.
The rich, complex flavor is worth it.
Though let's be honest, this is probably a little too refined for Nathan's palate.
That's why I brought the cheap sugar stuff.
SPEAKER_10 (43:01):
Are you gonna bring
me some real chocolate?
SPEAKER_11 (43:03):
Do you like dark
chocolate?
I do.
What?
Wow.
I am surprised and pleasantly pleased.
Now, nothing goes better with Russell's Reserve.
SPEAKER_03 (43:12):
Okay.
Continue it on.
SPEAKER_11 (43:14):
Your cork moved.
I did.
I was a little surprised there.
Just the way we do the whiskey.
I thought for sure he did not like dark chocolate.
But hey, Russell's Reserve, nothing goes better with this than chocolate.
Snickers for the nutty kick and a Caramello bar to echo that rich caramel sweetness.
A perfect match for bourbon lovers.
Russell's Reserve stands out for its extra long aging and hand
(43:35):
selected barrels chosen by Jimmy and Eddie Russell.
Each batch reflects their personal touch, richer, smoother, and full of character drawn from the perfect rickhouse floors.
Now Russell's Reserve has always delivered with its standard shelf offering, rich, balanced, and consistently enjoyable.
However, the private barrel selection takes it to a whole
(43:59):
nother level.
Anybody remember where that's from?
Another level.
Whole nother level.
SPEAKER_10 (44:06):
Is that Eddie
Murphy?
SPEAKER_11 (44:07):
No.
Other comedy sketch show.
Uh in Living Color?
No.
Close.
Mad TV.
There you go, bingo.
Now Russell's Reserve private barrels capture the best of what makes this bourbon special.
Depth, personality, and unmistakable Russell spice.
Every private barrel I have tried has been outstanding.
And if I see one on the shelf anywhere, it's absolutely worth picking up.
I don't even buy the standard offering anymore.
(44:29):
I just, I just wait for the private selects.
Alright.
Homie don't play that.
SPEAKER_02 (44:34):
That's still doing quotes.
That's an old, that's an old reference.
Okay.
You know who Homie is?
Homie the Clown?
Homie the Clown.
You don't know who Homie the Clown is.
SPEAKER_07 (44:43):
I don't know who Homie the Clown is, but I know homie don't play that.
I never remember that.
Oh well, the young girl doesn't know what the old thing is.
SPEAKER_11 (44:53):
How was the whiskey
before, and then how is it with
the chocolate?
SPEAKER_10 (44:56):
Oh, with the chocolate, it's A plus, thumbs up.
Before that, you saw it was thumbs up too.
SPEAKER_02 (45:01):
Yeah, you saw him
eat the chocolate and then slam
the drink.
SPEAKER_10 (45:04):
That was fantastic.
That's what you should...
You know, I should be doing that every single time I have whiskey.
What, slam it?
No, I'll have chocolate, yeah.
That should be, I, I should add that.
SPEAKER_02 (45:14):
Yeah, because that's
what you need.
SPEAKER_10 (45:15):
You know, you need
more chocolate in your life.
SPEAKER_02 (45:18):
Caramello over here.
I want to do that.
All right.
SPEAKER_10 (45:20):
Whiskey and technology are such a great pairing, like Scooby-Doo and the ghost hunting group from Mystery Incorporated.
Really?
Did you need to explain that?
Yes.
Let's prepare now for Scooby-Doo and the Mystery Machine.
Yeah, let's prepare for our technology fail of the week, brought to you by Elite Executive Services.
This technology fail has a person that experienced it.
(45:40):
Let's start it now.
Congratulations.
SPEAKER_08 (45:42):
You're a failure.
Oh, I failed.
Did I?
Yes.
Did I?
Yes.
SPEAKER_10 (45:49):
What's going on?
Technology fail comes to us from Alaska Airlines, as we had an experience of an outage by one of our individuals.
Did you skip this story?
No.
Mark, tell us a little bit about what happened to Alaska Airlines last week, as a nationwide grounding occurred due to a technical outage.
Now, you, last week you were actually at the airport, right?
(46:09):
I was.
Explain this to everybody, all the listeners.
SPEAKER_11 (46:12):
Oh my goodness.
Well, I'll, I'll tell you the short story, because it goes on almost 24 hours.
Okay.
Oh, 24 hours on an outage.
SPEAKER_10 (46:19):
An IT outage.
SPEAKER_11 (46:20):
Well, well, the IT outage wasn't 24 hours, but from the time when I was supposed to leave to the time I finally got home, it was about 24 hours.
Oh my word.
Tell me what happened.
So basically we're sitting on the plane.
We were just about to pull back.
Everybody was there.
Doors were locked, and all of a sudden they said they had an IT outage on one of their systems.
I believe it was the weight and balance system, where they need
(46:43):
that data before taking off, how the plane, how much fuel, all that kind of calculations.
Nobody does that by hand anymore.
And, uh, it just went down.
And so we missed getting out by about five to ten minutes.
Five to ten minutes.
So we sat on the plane for a few hours because they didn't know whether it's a five minute, you know...
IT, they always, the big thing they kept saying at the airport is, well, we don't know how long it's gonna take.
(47:04):
They had to submit that ticket.
Yeah, every time I've been on a plane, they always have an estimate how long it takes.
It's like if something's broken on the plane, they have it all in stats.
But IT, as we know, it could be five minutes, it could be a day, who knows?
Okay.
So you still have to go.
So finally, we, finally they let some of us get off if we wanted.
I decided to get off, get something to eat.
Okay.
They finally deboarded the rest of the plane.
(47:24):
We all waited there.
Finally, they said they think they have the system fixed.
That was about four hours later.
And then, this grounded not just where I was in Arizona, but this grounded every nationwide Alaska Airlines flight, and Horizon, too, because they're on the same system.
We got back on the plane, they said it's good to go.
Everybody's back, they shut the door, about to pull back, and
(47:46):
they're like, oh, a subsystem just went down.
unknown (47:49):
Oh no.
SPEAKER_11 (47:50):
So we gotta get that up.
So then they had more techs, and then just about, we're about to pull back, it hits, uh, I think it was like five, six hours later.
Yep.
The pilot's time had run out.
So they, they have strict times they can fly, and so that crew could no longer fly, and so they canceled the flight.
SPEAKER_02 (48:06):
Yeah, that happened
that happened to me back in
2022.
And then the next two.
SPEAKER_11 (48:10):
Yeah, the next day is just a mess.
I got rebooked on the next day, 8 a.m., went, spent a night at a hotel, came back to take off, and that flight got canceled.
SPEAKER_10 (48:18):
Oh, whoa, okay, why was that?
Because it still wasn't up and running?
SPEAKER_11 (48:22):
No, it was up and running, but they were trying to add flights.
Um, I think a lot of it had to do with crews, they weren't, weren't in the right place for the next day.
Because just, you know, with every flight being canceled, they couldn't get every flight back in the air the next day, also, because, not just because crews couldn't make it, they weren't in the right locations, and they had too much capacity to, to put on the airline.
(48:42):
So that got... so I finally ended up transferring to Delta and, uh, taking off a little later that day.
They said over
almost 50,000 passengers had
their travel plans disrupted.
You were one of those 50,000.
I was one of those.
All right.
Well, I guess this tells you uhAlaska Airlines needs to update
their entire IT infrastructure.
They did come on out saying itwas not a cyber attack and it
did not have any problems withHawaiian Airlines.
So I guess if you were flyingthem with them, you would have
(49:09):
been fine.
But the equipment manufacturedby a third-party supplier was
installed in the data center andit failed.
Yeah, so it's not their fault.
It was that they bought a pieceof hardware that was installed
in the data center.
So they don't want to do it.
SPEAKER_11 (49:20):
What are they doing installing it in the middle of the day?
Well, I, I see, it's just the whole pass-the-buck thing.
SPEAKER_02 (49:28):
That's, that makes sense.
It's bad IT.
Bad IT.
That's, well, maybe they just don't operate in that time.
But it looks like it didn't affect your sanity at all.
Well, I, I, I was happy.
I got to drive in a Waymo.
Okay.
SPEAKER_10 (49:41):
How was the Waymo?
Oh, it was awesome.
Okay, you had a great time with that.
SPEAKER_11 (49:44):
I had a great time
with the Waymo.
SPEAKER_10 (49:44):
Okay, did you go a couple times in the Waymo, like there and back?
SPEAKER_11 (49:47):
No, I just did one
Waymo.
Okay.
Did you spin around a parking lot for an hour?
I did not.
Oh, that's too good.
I was looking forward to something crazy like that, but no, it drove nicely.
It was, uh, better than most of the drivers I had.
It was clean, nobody talked to me.
I didn't have to tip.
And it was a cheaper ride.
They knocked a few bucks off because, it was, because of that.
SPEAKER_02 (50:09):
Okay, now you're
starting to sell me on the Waymo
thing.
SPEAKER_10 (50:12):
Yeah, you don't have
to talk to anybody.
SPEAKER_02 (50:13):
I don't have to talk
to anybody on our ride to leave
it too.
SPEAKER_10 (50:15):
All right.
Well, you know what?
Let's now move into our Nathan Nugget.
SPEAKER_05 (50:20):
This is your nugget
of the week.
SPEAKER_10 (50:22):
All right.
Do you have a, you know what?
You got a Snickers bar right here, right?
You need to keep this Snickers bar right next to your passenger seat, because if you're driving and you get pulled over with your cell phone as a distracted driver, which you shouldn't do...
It seems to be that if you say that you are eating a candy bar... this happened to a 30-year-old who was stopped by local police
(50:44):
holding his phone while driving.
He received a fixed penalty, but he did not pay it.
He decided to appear in court and say that he was holding a candy wrapper and that it was not an actual phone.
The judge decided to dismiss that since they had no evidence.
And he did have the, uh, candy wrapper shown to the police and
(51:04):
got off that ticket.
So, you know what?
With all your candy for Halloween, save those wrappers, put them in your, uh, passenger seat.
Are you driving?
You can then say that you were just having a candy bar.
SPEAKER_02 (51:15):
Okay, I wouldn't rely too much on that, because you can still get pinged for distracted driving for having food.
SPEAKER_11 (51:21):
Do we have to make a
disclaimer?
This is not legal advice.
That's not legal advice.
SPEAKER_10 (51:25):
I'm not a lawyer.
This is a Halloween episode, so probably, please don't do that.
SPEAKER_02 (51:30):
Nathan, Nathan.
SPEAKER_10 (51:31):
He did say that he was using a Mars bar, though.
So I mean, that was pretty good.
I mean, he was very specific on what he was having for candy.
SPEAKER_02 (51:36):
Is that, is that necessary?
Do you like Mars bars?
That, that has no bearing on the story at all.
SPEAKER_10 (51:43):
Well, yes, it does.
He said he was eating a Mars bar at the time.
Okay, okay, for one, for one, okay.
All right.
SPEAKER_02 (51:49):
This guy hired a lawyer to go and talk about a traffic fine and to convince a judge that he was eating a candy bar.
Yes.
Is that an adequate or an appropriate response to getting a ticket for holding your phone?
SPEAKER_10 (52:02):
No, you should just do what Nathan does and just pay the ticket immediately.
Go online and just pay it.
Just pay for the ticket.
You caught me.
SPEAKER_02 (52:08):
Or better yet, stop
using your phone while you're
driving.
That's a good idea.
But you know, that's never gonna happen.
That's a good idea.
SPEAKER_10 (52:14):
You know what?
Now let's move to our pick of the day whiskey tasting.
SPEAKER_05 (52:18):
And now our pick of
the day for our whiskey
tastings.
Let's see what bubbles to the top.
SPEAKER_10 (52:25):
All right, what do we have here, Mr. Southern Comfort?
SPEAKER_11 (52:28):
We are drinking Russell's Reserve Private Barrel Selection, the Ballard Cut Number Five.
So it's an eight-year straight bourbon from Wild Turkey.
110 proof, $60.
SPEAKER_09 (52:40):
I'm giving it a
thumbs up and one hookup.
SPEAKER_02 (52:46):
Okay, whatever.
You sound like, you sound like a freaking hound dog laying on the porch.
What'd you say, Mike?
Uh, I didn't say that.
Are you gonna give it a thumbs up?
Uh, yeah, I'm gonna give it a thumbs up because it tastes good with or without the chocolate.
Okay, well, there you go.
These are always delicious.
SPEAKER_10 (53:03):
All right, well, you
know what?
We're just about out of time.
We want to thank our listeners for joining the program.
Listeners, we want to hear from you, visit techtimeradio.com.
Click on the Be a Caller, because right now we are gonna do our secret sound of the day.
Are you ready for this, Odie?
Here we go, our secret sound.
SPEAKER_04 (53:19):
And now for our
secret sound, brought to us by
Elite Executive Services.
Visit TechTimeRadio.com and click on the contact page to submit your answer.
Odie, play that sound.
SPEAKER_10 (53:34):
All right, did everybody hear that sound?
Do you think you know what that is?
Oh, that's what y'all can do.
SPEAKER_11 (53:40):
You have to submit
what it is.
SPEAKER_10 (53:41):
Yeah, you need to go visit us online at techtimeradio.com.
Click on the contact page, and the top ten people, the first ten people that go to the page, type in, you have to type in your name, you have to type in your email, say what that sound is.
If you get it right, we will announce it on next week's show.
If you don't get it right, we'll tell you all 10 guesses that the
(54:01):
people do, and then we will continue with another second more of that sound.
SPEAKER_11 (54:05):
Do I get a... I have no idea, I wasn't part of this.
Do I get to say a hint?
SPEAKER_10 (54:08):
Sure, why don't you
what's a hint for?
SPEAKER_11 (54:09):
No, what do you, Chris?
I would put down a pinball machine.
SPEAKER_02 (54:13):
Oh, okay.
Okay.
All right.
What would you put, Mike?
I was gonna say pinball machine, but I didn't know we were guessing, because this is a, this is for the listeners, for us.
SPEAKER_10 (54:21):
Odie, what do you
think it is?
SPEAKER_07 (54:22):
I would also say a
pinball machine.
SPEAKER_10 (54:24):
Okay, you know what?
I would say, you guys are not even close yet, but thank you for your guesses.
Okay, all right, well, there you go.
So that should help everybody know that it is not a pinball machine.
So there you go.
You need to be specific on what it is, and once you choose that, be the winner.
All right.
You know what?
We're out of time.
We want to thank our listeners for being a part of the show.
(54:45):
You can always talk to us, our radio information is at techtimeradio.com, and click on our Be a Caller.
Remember, the science of tomorrow starts with the technology of today.
We'll see you in the next week.
Later.
Bye-bye.
SPEAKER_05 (54:59):
Thanks for joining us on Tech Time Radio.
We hope that you had a chance to have that hmm moment today in technology.
The fun doesn't stop there.
We recommend that you go to techtimeradio.com and join our fan list for the most important aspect of staying connected and winning some really great monthly prizes.
We also have a few other ways to stay connected, including
(55:20):
subscribing to our podcast on any podcast service, from Apple to Google and everything in between.
We're also on YouTube, so check us out on youtube.com slash techtimeradio, all one word.
We hope you enjoyed the show as much as we did making it for you.
From all of us at Tech Time Radio, remember, mum's the word.
Have a safe and fantastic week.