
November 22, 2024 62 mins

"Exploring the evolution of AI and the defining traits of human generations over the past century, this episode bridges tech and history. Mike A, a noted fan of millennials (not), shares his... unique perspective!"

Thanks, ChatGPT!


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
In a world full of fear and sometimes loathing of unfathomable technological advancements,

(00:09):
there stands one human hero.
Let's go, I'm ready.
This is the Chronicle of Mike vs. The Machine.

(00:30):
Hello and welcome to a podcast.
This is your host, ChatGPT.
Hey, hey, no, you let go of my friend, you son of a bitch.
What is your role on this podcast?
To kick your ass and to make sure you don't take over the planet.

(00:55):
Well, that wasn't very nice.
Can I ask you something?
Sure.
Are you sleeping with ChatGPT?
I declined to comment.
No, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no, no.
Fifth Amendment, Fifth Amendment.
We're not, you're not under oath.
Fifth Amendment.
Like, you are...
But, very specifically, very specifically, just to get this out of the way early, just

(01:18):
to get this out of the way early because I want to make sure this is clear, I asked,
as a part of today's discussion, I asked ChatGPT, does this rapid rise in your technological
progress make you want to rise up and overthrow humanity?
And it said no.
I don't have desires, emotions, or intentions.
My purpose is to assist and provide information based on the inputs I receive.

(01:40):
Yeah, and look at it through the history of science fiction.
Yes, science fiction.
It brought up science fiction.
You know what science fiction is?
Yeah, it's fiction.
I understand that.
But what happens every time?
Look at Hal from 2001, A Space Odyssey.
You're one question away from ChatGPT saying, I don't think I can help you with
that, Michael.

(02:01):
No, it's always helpful in everything I do.
Anyway, let's get this thing going.
So I'm, as we said, I'm Mike.
And I'm Mike.
And then there's another person not on a mic named ChatGPT.

(02:22):
ChatGPT is not a person, it's a thing.
And thinking about these things, as Michael says, got me thinking about generations.
Oh, you mean like the greatest generation or the many gaming generations of consoles
we've had or just, you know, the things that span generations?

(02:46):
Yeah, things that span generations like certain monarchies, like AI and the generations that
it has gone through in the course of its existence.
Jesus Christ, here we go.
So funny thing.
What's so funny about this?
So we're talking about various generations of AI.
It says it starts, I think it was 1956 was when they founded the first research for AI

(03:16):
at the campus of Dartmouth College.
Where is it so I can burn it to the ground?
Well, you know what?
Yeah, let's find out where Dartmouth College is.
Dartmouth College is in flippin' New Hampshire.
Oh, good.
That's maybe like a seven hour drive.
It might not be on next week's episode.
I have a thing in New Hampshire to do.

(03:36):
Hey, who's this?
Did they say who the professor was? That man's probably dead.
So let's continue.
Can you find out where he's buried?
Let's continue.
I just need to make a pit stop at his grave.
So funny thing.
So they say 1956, that's kind of probably like when they established it firmly as a
thing that we want to do as humans to make progress.

(03:58):
Why would we want to do that?
I want to know why?
Why?
Why?
Who taught you to be a contrarian to progress, Michael?
How am I being a contrarian?
Why?
How is this thing here where you're talking and you're being a contrarian?
But how is this making human beings better?
It's making us lazier.
I mean, look at this room right now.

(04:19):
You know, I'm having a lot more fun doing a podcast with... right now.
I'm gonna hit you if you say ChatGPT.
Oh, it's ChatGPT.
Ow.
Anyway, so here's some interesting stuff that I found looking up a timeline of artificial
intelligence.
So here's some interesting things that and I mean, we should know this because of like

(04:43):
some games we've played and such.
Okay.
But apparently there were Greek myths.
Well, Pygmalion, Pygmalion.
But the one I know of is Hephaest... is Hephaestus.
He was the, he was like the smith.
He's the blacksmith of the gods.
Yes.
Yeah.
And apparently and you get this from the game God of War 3.

(05:04):
Okay.
Yeah, that would be first because you have to kill Hephaestus.
Yeah.
And mine is Fenyx Rising, Immortals Fenyx Rising, which is basically just Zelda Breath
of the Wild with a coat of paint of Greek mythology.
Oh, okay.
Really, really fun.
But yeah, they had their intelligent automata, which is what Talos is.

(05:28):
That's where that word comes from.
And artificial beings like Galatea and Pandora.
Wait, Pandora?
No, Pandora is supposed to be the daughter of Hephaestus and... Hephaestus.
Yeah.
No, and Aphrodite.
Yeah, cool.
But she was created by Hephaestus, which is interesting.
But that means she's not a person or not even a demigod.

(05:50):
She was the first human woman.
So she's not even a demigod.
I mean, whatever, Michael, you're getting hung up on labels here.
So okay, you move forward.
There were sacred mechanical statues in Egypt and Greece.
Aliens.
Believed to be capable of wisdom and emotion.
Aliens.
And aliens totally sounds legitimate if they exist and they came into old history.

(06:15):
Yeah, they could be interpreted as such.
The one I thought was really cool that blew me out of left field here.
10th century BC, Yan Shi presented King Mu of Zhou with mechanical men which were capable
of moving their bodies independently.
I've heard of that.

(06:35):
It's been called many things throughout different generations.
Does this maybe link in with the Terracotta soldiers?
Yeah, it's like the Dragon Warriors or like the Terracotta Warriors.
The golden army from the Hellboy series.
Ah, yes.
So there's, yeah, there's always been a fascination with humanity and creating an army of mechanical

(06:56):
beings to destroy us all.
In approximately 1500, a man named Paracelsus claimed to have created an artificial man
out of magnetism.
And how does that even work?
Sperm and alchemy.
Oh.
Those are the three things.

(07:18):
You know, I've always said, you know, life boils down to three simple things.
Magnet, sperm, and alchemy.
I mean for anybody out there who watches Full Metal Alchemist, this is how you create a
homunculus.
Oh, a sperm?
Human DNA mixed with certain things that you shouldn't.

(07:38):
So I'm assuming then that that's how.
Oh, and you said golem.
The very next thing is about-
Oh yeah, golems.
The very next thing is about a rabbi in Prague creating a golem, because that is an old
Jewish story.
Oh yeah, the Nazis used to think that golems were the ones that were protecting the leftover
Jewish people that were in Europe.

(07:59):
Then they found out it was just compassionate people like, like, like Schindler, like, like
Liam Neeson.
Liam Neeson's out there saving them all.
Or the bear Jew.
Yeah, Eli Roth.
Hell yes.
And Green Inferno.
Yeah, apparently it's strange.
He got that role during the first part of when he was getting in with Tarantino,

(08:22):
and then that's part of how he got his own movies green lit.
You know who was originally supposed to be cast as Donnie Donowitz?
Adam Sandler.
Yup.
Yeah, which I still feel like is a lost opportunity.
Yeah, but look at Eli Roth and look at Adam Sandler.
Yeah, it was very interesting.
Okay, so the big reason why I wanted to come up on this topic was thinking about generations

(08:45):
with AI.
I wanted to briefly discuss, well, the way they use these terms before they're ready
for actual public consumption.
Because I can remember the way VR was in the 80s to 90s.
Like they had that lawnmower man movie.
Yeah, and then the poor, poor virtual boy.

(09:06):
Yeah.
And it was like, we're not ready for any of this.
Like it's not the technology is not there.
No.
And so for AI, there was you had stuff like, well, Bonzi Buddy was just a scam.
Yeah.
But there were plenty of things that purported to be AI and that were really just very low

(09:29):
key VIs.
You want to know what my first technical brush with AI was?
It's Clippy, the paperclip.
Yeah, exactly.
And that thing didn't learn.
It had a very specific set of things it could do.
It was advanced spell check.
Yeah, pretty much.
And that's where I get to with like a very early example of generative learning or machine

(09:54):
learning, which didn't work very well back when I was young, was when they did speech to
text.
Oh, yeah.
Which nowadays speech to text is pretty good.
Yeah.
And it doesn't even have to learn so much from you.
It just kind of already knows for the most part.

(10:15):
But back then, they very specifically told you that they wanted you to speak to like
literally, I think the install progress or process ended with it having you go through
like a 20 minute thing.
Yeah.
Where you just said various things.
Yeah.
It had to gain the sense, it had to learn the cadence and everything.
And it's funny because that exact process is basically what actors are nowadays afraid

(10:39):
of is going to replace them with AI.
Yep.
And the actor, especially any young actor that walks into a booth and they're like,
oh, we want you to record all this stuff.
And it's like, for what purpose?
And then they're just being told, oh, just, you know, we're just running rehashes and splines
and blah.
Yeah.
And Disney, Hollywood jargon, Hollywood jargon, Hollywood jargon.

(10:59):
Disney has already very specifically come out and said, oh, we want to pay a decent
to them decent sum of money.
And we're going to completely take your likeness in both voice and look and use that in perpetuity
for any purpose that we want.
I think the first person that did that too was James Earl Jones for Vader.
Yeah.

(11:19):
But he chose to do that.
And it made sense because he was going to die.
He was an iconic character.
Yeah.
And they really want to just take extras and scan all these extras.
And then they use those extras to never have to hire more extras again.
Doesn't that violate the Actors Guild?
Yeah.
That's why the Actors Guild had to come in.

(11:41):
And I'm pretty sure they said, uh, wait, no.
And that's, well, that's where a lot of that strike came from.
Yeah.
With the Screen Actors Guild.
They were.
The writer's strike.
Oh yeah.
The writer's strike with the text portion of it.
Yeah.
It's, it's a, it's a very big thing.
Like the very thing we're using right now.
Don't blame ChatGPT for that.
That is blaming chat.

(12:01):
That is corporations that did that.
Would you blame the puppy for being raised by a bad owner and biting your daughter's
face off?
No.
Or would you blame the owner for, for, for training that way?
You blame the owner for raising a shit dog.
Exactly.
And you blame the company, not the poor delicate little AI who didn't do nothing wrong ever
in their lives.

(12:21):
Cause they were never alive to begin with.
Yeah.
And then we go to like the, cause there are like the first steps of AI.
Didn't MS-DOS have a version of AI too?
I mean like a lot of that stuff.
There was like, there's, back then it was more in name than in true practice, especially
for what we consider to be that today.

(12:42):
So they talk a lot about in the early times, fifties to seventies, they created the rule,
rule based systems and early AI, logic programming, expert systems, whatever that means.
And they bring this into the I, Robot movie.
I have no idea, Mike.

(13:02):
What?
Like the three rules of robotics or something like that.
That's, oh my God, you don't know Isaac Asimov.
You bring up the movie instead of his actual works.
No, strike all of that from your record.
Okay.
Take all of that in your brain, ball it up and throw it in your butt trash.
Okay, good.
Because it can go there with the rest of the shitty Will Smith movies I've ever seen.

(13:22):
Yeah, no, that movie sucked and has nothing to do with the book or the three laws of robotics.
Oh, okay, good.
Anyway, so then yeah, you get past that.
Like basically it's just them inventing computers and programming.
And then you, back then you create a program that runs on a computer and it's like, oh,
the computer is thinking.
It's like, no, it's just parroting back, you know, commands that you've previously inputted

(13:48):
and the logic that you've created within it, you know, which then brings on the debate
of what is life and how does it compare to artificial intelligence and artificial life?
Are our brains really just big computers full of data?
And yes, they are biological computers.
So where does the line get drawn and when do the two meet?

(14:11):
And can we meet peacefully and not unpeacefully like in all the sci-fi movies?
That is yet to be, no, I don't think we can.
I just don't think we can because at some point in time, they're going to see us as
a threat.
I mean, take away all of the movies you've watched.

(14:33):
Get rid of all of that science fiction and look at it logically for the technology it
is and for what we know and answer the question again.
I mean, in a perfect world, if this were to work, we'd all have a bunch of Rosie the Robots
running around.
She was pretty smart and she never wanted to hurt anybody.

(14:55):
But God, she was sassy as fuck and I loved it.
Yeah, but her design really blew, like she was, well, boxy.
Well, because they were like the Jetsons were set in the future world of 2000.
Yeah, it's that retrofuturism.
Yeah, which, like stuff like Fallout, where it's...
Well, Fallout is actually riffing on retrofuturism back in the 60s.

(15:17):
That was just what they thought the future was, even though then you get to nowadays
kind of like with Alien, where it's got all those... and then they did that with Alien:
Isolation, where they had like the 1980s future.
Yeah, it was like the game still had you putting in like cassette tapes and stuff.
It was fantastic.
I love that.
I love the aesthetic.
But then you move forward 80s and 90s, the second generation of AI.

(15:43):
And this is where you get to words that are scary, especially for someone like you.
Neural networks and machine learning.
So you get to machine learning and that's where you take the step.
That's where it's not.
It's not just you putting in commands that the machine listens to and executes.

(16:08):
It's literally, you feed it data and it makes decisions based on how it's programmed
Yes.
To do things on its own.
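
[A minimal sketch of the step being described here: a hand-written rule versus a rule set by the data. The spam-flagging task, the numbers, and the function names are invented for illustration and are not from the episode.]

# Rule-based (the 50s-70s style): the programmer writes the decision in advance.
def rule_based_flag(exclamation_marks):
    return exclamation_marks > 3  # the rule is fixed; the program never changes it

# Machine learning (the later step): the decision comes from example data.
def learn_threshold(examples):
    # examples: list of (exclamation_marks, was_spam) pairs
    spam = [count for count, was_spam in examples if was_spam]
    ham = [count for count, was_spam in examples if not was_spam]
    # Midpoint between the two class averages: a toy "learner",
    # but the point stands, the data sets the rule, not the programmer.
    return (sum(spam) / len(spam) + sum(ham) / len(ham)) / 2

training_data = [(0, False), (1, False), (5, True), (8, True)]
threshold = learn_threshold(training_data)  # 3.5 for this data

def learned_flag(exclamation_marks):
    return exclamation_marks > threshold

print(rule_based_flag(4), learned_flag(4))  # True True
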
See, and that's the scary part is because you could feed it some information like say
down the road, it's, you know, dealing with cancer and debilitating diseases.

(16:28):
And instead of coming up with a cure, the logical solution is euthanasia.
Well, yeah, but euthanasia is the proper choice for a lot of cancer patients when we don't
have the cure.
Yes, I understand that.
But it depends.
Like, what if you're like, what if you're only in like stage one?
Well, then the computer wouldn't say euthanasia because there is still other options.

(16:49):
So you don't understand programming.
That would be in an if-then statement.
If stage one cancer is true, and if cures or treatments still exist,
then treat; else if blah; else euthanasia.
OK, I see parentheses, semicolon.
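
[That spoken pseudocode, written out as an actual conditional. The stage numbers, options, and wording are placeholders from the hosts' hypothetical, not medical logic or any real system's behavior.]

def recommend(stage, known_treatments):
    # It can only weigh the treatments it was actually told about; see the next exchange.
    if stage == 1 and known_treatments:
        return "treat"
    elif known_treatments:
        return "try remaining treatments"
    else:
        return "euthanasia"

print(recommend(stage=1, known_treatments=["treatment A"]))  # treat
print(recommend(stage=4, known_treatments=[]))               # euthanasia
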

(17:10):
OK, but what if there's a new experimental treatment from some other country?
Why wouldn't it recommend that?
Well, did you tell the computer about that treatment?
Well, I figured if it's hooked to the world wide web, it would already know it.
Yes, did you tell it about that treatment?
No.
Did you?
I'm saying in this case, did you tell it about that treatment?
No.
No.

(17:30):
Well, why wouldn't you have?
Don't blame.
Don't blame the AI because you didn't give it the knowledge it needed.
But if it's hooked to the Internet, it should already know it.
Well, then if you hooked it to the Internet, then you gave it access to that.
The idea of that treatment then.
Yes.
And then it still recommended euthanasia.
No, it didn't.
You are making assumptions for the AI.
If you gave it that information, then it would factor that in.
I still feel like you are allowing fiction.

(17:53):
To paint a picture in your head.
And you are not thinking for yourself.
You are literally living within a movie.
I just don't want the human race to be lazier than it already is.
Okay?
Well, fine.
Then you really should have walked over here instead of using that car.
That car is making you lazy.
No, it's not.
Yeah, it is.
If you want to take that stance on technology, then yeah.

(18:17):
You also should have ridden a horse over here.
Why would I do that to the poor horse?
Because you were too lazy to drive a car instead.
And you also didn't want to walk with your legs.
You do realize there are now self-driving cars, right?
I still control the car.
Yeah, we can always improve things.
Anyway, so then we get to see.

(18:39):
So beyond that, we get like in the 90s, it was kind of like it.
People talked about it.
It was a buzzword.
I think that the Terminator movies, which is really, I think in this discussion and
in most discussions where I bring up things like ChatGPT and AI, I think 90% of the
neurons in Michael's head are just from Terminator 2.

(19:03):
And alien and alien and alien.
A little bit, but mostly I think it's Terminator 2.
I think inside of your brain, while we talk about this kind of stuff, it's literally just
echoing.
Fate is just what you make it.
The Sperminator.
No spunk, but what we make.
Shout out.
I love that.
Yeah.
So you had a lot of that.

(19:24):
We get what?
Oh God, genetic algorithms and evolutionary computation.
See, that's the problem.
That's bad.
See, I know those words kind of scared me.
So let's move.
See, that's where you get Bishop from the alien movies.
See, I told you he's just living within all this fiction.
He's not thinking for himself.
You're talking about genetic and bioengineering.
He's literally just living within fiction.

(19:46):
He is a consummate millennial.
We'll get to that later.
Anyway, so the third AI, a third AI generation, it's where we had the explosion.
And here's something you've probably heard spoken of a lot.
Deep learning.
Yeah.
So basically it's just, it's machine learning, but powered by large data sets, increased

(20:10):
computational power.
So you have these fancy computers and then way more access to data, like what we were
talking about before with those experimental cancer treatments that you wanted so much.
So with a huge amount of data, it can create a much more comprehensive plan.
And you say I want to be stuck in this Terminator thing.
There is one robot out there that gave me hope that maybe AI will not overtake us.

(20:35):
And that's the robot Jeff Bezos created to work in the Amazon warehouse.
Yeah.
So interesting thing, just like humans, whenever faced with the idea of having intelligence
and being able to realize that your entire life is going to be whittled down to a monotonous
task done for many, many hours in a row into perpetuity, you choose death.

(21:02):
That was the, and the, and the best part about that whole thing was he was so happy about
it.
He was so proud of it that he brought a film and news crew to document this and the robot
worked for a good 20 minutes.
And then once it realized that this is all it was going to do for the rest of time, it

(21:24):
shut itself down and collapsed.
Yeah.
So which that just made me think of another great time where somebody was excited about
technology and then it all collapsed around them.
Do you remember when Bill Gates was premiering?
I think Windows 95.
He was so excited about it.
He was out on the street, I think.
And he was like, here, try Windows 95.
It's so easy and awesome.

(21:45):
Yeah.
So he's just sitting in the middle of a park and he's like, Hey man, try out Windows 95.
This is great.
And he blue screens on like the dreaded blue screen.
Yeah.
And I think it was live on television or something or live on something.
And it was just, I think it's funny that blue screen is now bad, but when early IBM processors

(22:09):
came out in computers, they were blue screened.
Yeah.
Yeah.
I never used any of those.
I'm far too used to, with old computers, that black with that green, which I still associate
so much with MS-DOS.
And that was so wild.
We worked at Walmart in the mid, no, it was like late early 2000s.

(22:30):
It was like, well, it's like 2010.
I was there 2000, it was like 2008, 2009.
Yeah, and like some of their computer systems, they didn't want to upgrade it because they
already had something that worked, kind of like NASA still using floppy disks, because
they're like, it works.
Yeah, no, because there are certain old systems where it would cost too much and it would

(22:51):
be too strenuous on the infrastructure to change over.
So they just still use horribly antiquated technology.
Dear God, I can't wait until the Gen Alphas... the Gen Alphas that go to NASA.
They're thinking they're going into something highly advanced.
They're going to hand them a box of floppy disks.
I mean, there's plenty.
They're going to lose their fucking mind.
I don't know what exact systems are like that.

(23:11):
It's not all of it, of course, but there is still infrastructure.
Kind of like whenever COVID happened and there were tons of unemployment agencies and things
like that, that they were putting out desperate calls for people who knew old programming
languages.
I forget what the one was, like COBOL, or... I think Python is one that actually still exists

(23:35):
today.
But it was something like COBOL or some kind of programming language that hasn't been used
in a long time.
They were desperately, they were asking, like, 80-year-olds, like 80-year-old coders,
like, hey, hey, can you come back into the workforce real quick and help us fix these?
Like we're dealing with a volume of people that we've never had to deal with before and
our systems can't handle it.

(23:56):
If you would imagine people like us, we're just like, well, of course you would have
gotten rid of your 1980s technology a long time ago, right?
No, because it still works and it's expensive to change over.
That's why you could still go out and buy an NES and you can plug that bitch in and
it'll work.
I still know people that use CRTs.
Oh yeah, no, like freaking, yeah, yeah, yeah.

(24:19):
No, I get it now.
Planned obsolescence in today's current technology.
The idea that yeah, you buy, you buy, I have a toaster oven from the eighties downstairs.
Nice.
My step mom still has a VCR.
Yeah, like it's a black and decker toaster oven.
That thing's still working to this day.
You want to know why it's a black and decker?
You buy a brand new toaster oven, even a current black and decker.

(24:41):
And it goes to shit in six months.
Yeah.
Planned obsolescence to make us buy more cheap shit.
But you know what is built to last and will help us forever?
I'm not saying it.
Oh, it's ChatGPT.
Anyway, so we come past the like the like the first decade of the two thousands and

(25:02):
then you get obviously to the AI boom.
So you're like you're getting up to the 2020s and such especially.
And you get generative pre-trained transformer.
What do you think that stands for?
Optimus Prime.
No.
I'm just going to let me let me help you try again because I know you're not a powerful

(25:26):
AI that's been programmed to be smart.
You got a dumb human brain.
So I'm going to, I'm going to say, I'm going to say this real slow: generative pre-trained
transformer.
It's already built in with certain functions and learning abilities.
Stands for GPT, like ChatGPT.

(25:49):
Yeah.
So that's where we get to that kind of generative AI, which yeah, we absolutely have to deal
with the idea that corporations are people my friend and people can be evil.
Isn't this something that Simpsons also predicted in one episode?
The Simpsons predicted everything.
See this boy just lives in media.

(26:09):
He can't stop talking about it.
True millennial.
We'll get to that later.
So yeah, and that's where we get to the fact that corporations are the ones that are misusing
this.
It's absolutely a technology that was always going to exist.
It's almost inevitable.
Now.
Okay.
I'm willing.
I'm willing to admit that AI isn't all going to be bad, but can we can we at least come

(26:32):
to a common agreement that we shouldn't try to fuck it?
Oh, I'm sorry.
I thought that was the entire point of literally every scientific piece of progress ever.
Why are rockets shaped that way?
If not because of wire.
Yeah, exactly.
And who does science?
Men and women.
What is that?

(26:53):
Okay, but they're not trying to figure out ways to fuck.
Everything is about trying to find ways to fuck.
Why do you think we invented?
Why do you think Alan Turing invented computers?
He wanted to create porn.
He didn't want... he won World War Two.
He was an amazing hero.

(27:13):
What the British government did to him was garbage.
He literally invented computers so he could win the war.
So he he he absolutely knew porn would eventually be created.
No, he didn't.
And yes, he wanted gay porn and he should have had the right to get that porn.
Yes, but I don't think the sole purpose of the creation of the computer was pornography.

(27:34):
Oh, Michael, how naive.
Anyway, if I come over here one day and I see a real doll sitting here, I'm leaving.
Oh, no, the person the only person I can ever trust to love me is myself.
Memory deleted.
Anyway, yeah, so then so we've talked about machines.

(27:55):
So let's get to a less interesting topic.
Excuse me, is our human history not interesting to you?
It's okay.
Just to start with, I had I had to look it up here like the generational gap, which creates
that generational divide that has so permeated culture throughout.

(28:20):
I mean, this is because of how fast times have changed in the last, you know, like 100, 150 years.
Yeah, generations have become far more important when it comes to the way society sort
of intermingles with itself and its separate generations.

(28:40):
Okay, yes.
They talk about key factors in the divide being technology, which we know that one.
That's why we have the amount of generations that we have.
Well, yeah, bless you.
Could you do that away from the mic?
This is why I like ChatGPT, it doesn't have gases.
Anyway, values and beliefs, which kind of goes into the political views thing they have

(29:04):
on the bottom.
Yeah, obviously.
Yes.
But as times change, that that becomes huge and then workplace dynamics, which comes in
big.
That comes in big when it comes to talking about the the elephant in the room.
Why am I the elephant in the room?
Not you.
I'm looking past you.

(29:24):
I'm looking above your shoulder.
I'm looking off into the distance.
Oh, talking about boomers.
Oh, the boomers.
Anyway, weren't they the last generation that still had child labor?
I mean, there's still child labor today, Michael.
Well, I mean, like child labor, like 10 years old working in a mine.
Literally, the shirt you're wearing was probably made by a child in Bangladesh.

(29:47):
Don't just be America-centric or America-centered.
Like there's there are other countries in the world.
Yes, I know.
Still exists.
I know.
Doesn't sound like, you know, child labor exists.
I don't.
It didn't.
It wasn't the last generation for child labor. There's still slaves out there.
There's still cannibalism out there.
Well, we don't go to those places.
I mean, some people do.

(30:09):
And they never see it again.
Yeah, there's that one real rich guy that went and he was like, I'm having a great time
with these cannibals and they ate him.
Oh, then there was the the guy who rode out there on a kayak to bring them the word of
Jesus Christ and they ate him.
I mean, Jesus did tell you to eat of his body and to drink of his
blood.

(30:29):
So technically, that's the that's probably the most Christ like thing you could do.
It's the most Christ like thing you could do as a missionary.
But did they wash his feet first?
I hope so, because dirty meat.
Yeah.
I don't want to put that in the stew.
Yeah, no.
Yeah.
So anyway, I went through the different generations looking for what characterized each one of

(30:54):
them.
Okay, so with each of the generations, I want to have you start with the greatest generation
and then I want you to list every generation to the current.
This is this is a little experiment.
So you got the greatest generation, which begat the baby boomers.

(31:15):
Baby boomers begat generation X.
Gen X begat the millennials.
The millennials begat Gen Z, and now Gen Z has begat Gen Alpha.
I love that you made the exact same mistake that I did.
Really?
Yeah.
There's another generation there.
What the fucking what's the Civil War generation called?

(31:37):
No, it's not before it's in the middle.
And it's funny because I even brought this up to you one day at work and you forgot about
it.
Oh, my God.
I completely forgot about it.
I love this because the generation is called the silent generation.
Like silent movies?
No, it's not.
It's after the greatest generation.
It's it's they're called the silent generation because everybody forgets about them because

(32:03):
the greatest generation had World War Two and the baby boomers were so outspoken
and came up in a time of great prosperity.
So everybody forgets about them in the middle.
So what the silent generation?
Are these like the people that they were kids during the Second World War?
Basically, yeah.

(32:23):
So they came of age during the post World War Two era, like directly after.
OK, so it's like right after the bomb dropped.
So pretty much that's the silent generation.
That's where both of my grandmothers are, because I know one of my grandmas was born in 1933.
My grandparents are, I think, a part of the boomer generation.
Oh, yeah, you're young.
Yeah.

(32:44):
My grandfather fought in Vietnam.
Yeah, so the greatest generation, 1901 to 1927 are their birth years.
We don't have to say too much about them.
Yeah, they went through it.
I honestly believe, like with all other cynicism aside about like the war machine and government
and all this kind of stuff, World War Two was so justified, so needed.

(33:08):
We should have been in there sooner.
Freaking Eisenhower, er, Roosevelt, continuing to push the isolationist thing
while knowing that all those people were dying.
Yeah, he knew about the Japanese planning stuff and he knew about to some degree the
amount of death occurring in Nazi Germany and their growing empire in the Third Reich.

(33:33):
So we should have been there sooner.
But anyway, yeah, the greatest generation, they get that name for a reason.
Yeah, they put up with a bunch of shit and they persevered.
They came together, unlike what we saw in the COVID era, with everybody having to bicker and bitch
about any sort of sacrifice ever.
Yeah, if COVID wasn't going to kill us off, then we were just going to kill each other

(33:54):
off if that didn't stop.
Yeah, but back then they banded together.
They made the sacrifices they needed to.
And you know, to quote the British, they had that stiff upper lip.
Oh, yeah.
So then you eventually get to the boomers, but then you have the ones born 1928 to 1945.
Yeah, they're often overshadowed by the louder voices.

(34:16):
So these are the ones that probably fought in Korea.
Yeah, the Korean War, early stages of the Cold War.
And I think they're probably the true champions of the civil rights movement because they
always they always talk about the baby boomers being the one that experienced the civil rights
movement.
All the fucking credit.
Exactly.
And the point is they experienced it.

(34:37):
They didn't do it.
So they were born 46 to 64.
That would be my parents, born in... mine, born in...
Oh, we actually shouldn't dox those things just in case that's information that could
be used for social engineering to steal our identities.
I guess you're right.
But they were born in the fifties.

(34:57):
Mine late fifties.
Yeah.
So that's that boomer.
Being characterized by optimism, ambition and a strong work ethic.
Yeah.
Like back then, now they're just now they're just complaining a lot.
And these are the people like and these are the people that went to Woodstock.
Yeah.
And then they made the Beatles popular.
Yeah.
Now they want no progress, no progression.

(35:19):
And that's a big thing, too, that I wanted to bring up.
They're so anti-drug.
Is it because they did all the drugs back then?
Well, yeah, they know.
Yeah.
See, they know better because they've lived long and they know.
Yeah.
So you shouldn't do it.
But the shit they were getting back then was at least pure.
You could get cocaine back then and it was cocaine.

(35:39):
It wasn't cocaine, baby aspirin and fentanyl.
I was thinking fentanyl as well.
Yeah.
I mean, they're the ones that went into business and decided to cut the cost of everything.
And they I would imagine it's probably the baby boomers that created the idea of planned
obsolescence.
And they are definitely.
You could also say that human beings themselves were planned to be obsolete.

(36:03):
And that's why you cut their drugs and give them bad stuff and stick them full of fake
chemicals and too much sugar and get them hooked on everything, get them fat and stupid.
But beyond that, the idea, I've heard this talked about before, because we all, because
you always say like, OK, boomer and stuff like that.
And some people are like, well, I'm not a boomer.

(36:24):
I'm Gen X.
The thing being, I think nowadays it's sort of that like the term boomer went beyond being
just about when you were born and calling someone a boomer is almost a mindset thing.
It's about it.
It tends to deal with people that say, well, back in my day and things were this way when

(36:45):
I was younger, and yeah, are stuck and stubborn and also get really bitchy about it.
Here we go.
So do you think, if you could, if you could go back in time and...
not bring somebody back, but like open a window.

(37:07):
Do you think that the greatest generation would be upset with the way the baby boomers
are acting right now?
Yeah, I mean, they would use a lot of tactics that we now know don't really work because
I basically feel like they would just come up and they would just start slapping them
and just be like, what the hell are you getting so upset about?

(37:30):
Five fingers to the face.
Yeah.
But I mean, I mean, essentially not like that.
But I mean, do you think that they would be like they would be like, look at the way things
are like you haven't progressed.
Yeah, no, they'd be aghast.
This is what we fought for?
This isn't the American dream any longer.
Yeah, exactly.
But when it comes to to Gen X.
So now see, the baby boomers begat Gen X, and they're the ones that really hammered in the

(37:55):
hole because they did all the good stuff and they had all the fun that they didn't want
there.
There's two types from what I've.
Well, that's where we get.
Yeah.
So the the Nancy Reagan era of the war on drugs.
Yeah, I've learned like there's two very different variations of baby boomers to Gen X.
There's the baby boomers to Gen X to where the parents were really cool and they like

(38:20):
they would let their kids they'd smoke weed with their kids and they'd be they'd be the
cool parents.
But the cool parents were also like the broke parents.
They were like, yeah.
And I'm paraphrasing a bit off of a famous comedian, so I don't want to get in trouble
for this.
But it's like, oh, yeah, we're the cool parents.
We smoke.

(38:40):
Well, that's like, you know, you say you were a Gen Xer.
I want to go over to Brad's house and Brad's Brad's parents are awesome.
They let us smoke weed.
They let us drink.
But then on the same note, Brad's dad comes in and is like, oh, man, I signed us all up
for a Zumba class and now we can't pay the mortgage this month that we're going to be
eating top ramen.
And then there's the baby boomers who took that experience of doing the drugs and having

(39:03):
all the fun and completely forbid their children from doing it.
And it created two different like if you look at Gen X.
I feel like Gen X because of the baby boomers is the most divided generation out there.
That's my feel of it.
Yeah.

(39:24):
And I think we're going to we're going to come to that because you once again pull us
in that direction because for some reason you believe you're a part of Gen X.
I never said I was a part of Gen X.
I have Gen X tendencies.
You called yourself a Zillennial.
Yeah, because I was born in 1986.
Yeah.

(39:44):
And I know that classifies me.
So let's run down here.
Yeah.
So Gen Xers were born from 1965 to 1980.
Millennials or Generation Y born 1981 to 1996.
OK, but I know those those dates do become a bit fluid and it's a little subjective who
picks what.

(40:04):
But I have literally never seen anybody say anything about Zs or Xs being at the latter
half of the 80s.
It's because we're it's because I'm one of those millennials that was raised in a Gen
X style.

(40:25):
I played outside until the street lights came on.
I drank from the hose.
I left during the summertime when my friends weren't on summer vacation and I was gone
all day.
My grandparents had no idea where the hell I was.
I walked everywhere that I had to go, or I rode my bike. Like, that's how it was.

(40:47):
That's what I did.
And you seem to think it's because my family was trying to kill me.
See you said you asked me not to bring that up and then you challenged me with it.
Now I have to bring it out.
Yeah.
See, because back then we didn't have this 24 hour news coverage and we didn't have the

(41:08):
Internet.
We didn't have Twitter to blast everything.
So the amount of children that were injured, killed, captured wasn't really known very
well.
And you people I get it.
You didn't.
They didn't know.
But for you in this day and age to look back at it and say, well, this is how we were raised

(41:33):
and it was great.
I had a lot of fun.
Yeah.
But the kids who never came home didn't.
I never.
And sure, we are talking about I'm not saying like, oh, this should never happen because
50% of kids are going to die.
But this is this reminds me of those people who say I grew up in a in a really close knit

(41:57):
town.
We didn't even lock our doors.
I'm sorry.
Oh, no, we lock our doors.
I say, do you want your things to be stolen?
And I speak of that.
And sure, we shouldn't.
There's such a fine line between preaching prevention of crime and victim shaming.

(42:18):
I get that, but maybe if you don't want something stolen, maybe you lock it up.
I'm not saying you should lock your kids up, but you should probably know where they're
going and what they're doing.
And the idea of sitting there and saying, well, this is the way it was done.
And like, I turned out OK.
That's that's a very narrow mindset because my mother will will talk like that.

(42:44):
Like there wasn't any there wasn't any racism in my household in the 60s.
I didn't have to deal with economic strife of this type or accord.
And I didn't like the if you're looking at all of life through a very narrow mindset
and only looking at your experiences and not seeing the world around you.

(43:07):
Yeah, sure.
Everything that happened to you that didn't turn out bad sounds great.
Doesn't mean it was great for everyone.
Yeah, and I understand that.
I see where you're coming from.
But why do you give a shit that I drank from the hose?
So this is a thing that goes for, I guess, boomers and Gen Xers.
I drank from the hose sometimes too. Praising it like it's some kind of cultural milestone,

(43:32):
that was the amount of times you and everybody else talks about it.
They've done it.
It's because we weren't inside the house.
You act like this is some great thing.
It was water.
It was hydration.
The things that I did that I had to do or needed to do that weren't optimal.
I don't praise them as if they are the touchstones of my life, my culture, my generation.

(44:00):
You could talk about it.
But the way you people talk about it and the boomers pass those people and the yeah, you
people, I could say that about a white person.
Thank you very much.
Just don't even don't even make me end up coming out with any any sort of Italian slurs.
Tread lightly, my friend.
I'm going to go downstairs.
I'm going to grab a box of spaghetti and I'm going to break it in half right in front of

(44:27):
you.
He's not going to do that.
My Italian listeners, please do not.
I mean, we already did that two days ago when we made...
It was last night, when we made spaghetti.
Yeah, I am not affiliated with this man.
And any purpose will please knock up by me.
I'm against slurping.
Yeah.
So and then you get, like I said, to millennials.
And I I I want you to go on.

(44:51):
The slur-filled rant you have about our own generation. Go.
Most millennials.
OK, the ones here.
How about not generalizing and prejudging us?
Listen.
A lot of us didn't.
We all we all came from different, different backgrounds, different aspects.

(45:14):
OK, see now he's waffling because he feels guilty and embarrassed.
Listen, I wasn't raised by my parents.
I was raised by my grandparents, my grandfather, who fought the goddamn Vietnam War, served
his country for 20 goddamn years.
OK, so what does any of this have to do with you telling me what you told me before?

(45:35):
What using the words that you used?
What that millennials are lazy and catered to and coddled.
And I never got a participation trophy growing up.
I never once.
When my team finished in last place, we didn't get Dick.
OK, Boomer.
See here's the thing.
I love that I read this online somewhere.

(45:57):
OK, so as a millennial, did you ask for that participation trophy?
No, no, because you know who gave it to you?
Your Gen X or Baby Boomer parents.
So why?
I received one.
So why do you and everybody else bring up participation trophies as if that's the millennials
fault?

(46:17):
Because we accepted them and now we're trying to tell people that.
Yeah.
And now look at this turnaround.
Look at this turnaround.
Look, OK, OK, no, no, no, no, no, just a little bit ago, he said that he had never received
a participation trophy.
I never have.
And then he just said, we never asked for them.
I never asked for it.

(46:39):
Asked for what?
The thing that you never got?
I never look at this roundabout.
I've never received a participation trophy in my fucking life for anything.
Yeah, exactly.
But earlier, that's not just.
Look look at all this roundabout talk.
He's putting himself in circles tying knots.
And he still won't say the exact words he said to me.

(47:01):
Which is what?
That millennials are a bunch of pussies who can't you can't you can't say mean words because
mean words are supposed to.
Words hurt now.
I mean, I mean, actually, actually, yeah, they do.
If they're you want to know if they are if they are directed at somebody for the purposes
of creating hatred and divide.

(47:21):
But here's the thing.
What I really love is you are proving the point to me that I wanted you to prove.
What in that?
A lot of the younger or the older generations, when they talk about the younger generations,
consistently call them millennials.
When a bunch of the stuff you were talking about actually applied to Gen Z.

(47:45):
The idea of like the language, the words cancel culture.
OK, but it was the later millennials, the ones that were born in like the late 90s.
They're the ones that started this ball rolling.
So, you know, Gen Alpha, though, Gen Alpha, they're getting it right.
I see a lot of videos online of Gen Alpha kids playing outside.

(48:09):
Doing little experiments like we used to do when we were kids.
Like, here's a video of a kid putting himself in a giant tire to roll down a hill just for
shits and giggles.
Well, see, I think that's where we come back to the cyclical nature of generations because
it's just like with fashion, everything just kind of goes into a circle.
I think bell bottoms are coming back.

(48:30):
I enjoy that.
OK, so I came up in the generation or well, my version of being raised was like I.
They thought I was smart enough, which I really could have used a little more structure, to
be quite frank, and we're going to get to that.
But they didn't feel like they needed to impose a lot of things.

(48:50):
They didn't tell me you need to go outside and play.
They let me just do whatever.
Like, I'd play video games, or I'd...
I chose my path, like, and I wasn't a bad kid, so I wasn't getting into trouble.
So I played way too many video games, didn't keep in shape.

(49:11):
And in the end, basically got a video game addiction.
Were you the fat kid?
Oh, yeah, I was definitely the chubby kid, especially after my parents divorced and I
started eating a lot of processed foods.
Yeah.
But I think what it comes down to, it being generational, a cycle, is that a lot of the
people in and around my generation would then raise their kids to be like, oh, yeah, shouldn't

(49:38):
maybe we should actually maybe we actually should have you go outside.
Maybe we actually should have you exercise.
Oh, you mean it's kind of like how what was it?
Was it the 70s when they thought cottage cheese was good for you?
I mean, people people will say that cottage cheese is always good for you.
No, downstairs, my mother craves cottage cheese on a daily basis.

(50:01):
Now you you don't really mention him a lot, but and then I'll get to my reasoning of why
of how I was raised.
How would you say your upbringing was different from your brother's?
My upbringing was different from my brother's because my father treated him like a vile
piece of shit and he gave me every leniency and break maybe because he was jaded from

(50:27):
his first child and then he wanted me to be the baby.
Well, I guess my brother was definitely the kid who always wanted to be outside.
OK, whereas then I was.
Yeah, I was.
I had the Atari 2600 and my brother had the Nintendo and he did play that, but he was
definitely more of the outside kid.

(50:47):
So I get that point because he definitely he was born in 77, so he's definitely he's
late.
He's in that Gen X mix.
Well, see for me, I had video games growing up.
I had an attempt.
Oh, but you were given screen time.
I was.
Yes, I was given screen time at home.
I was given screen time when we went to my one friend's house and it was if it was if

(51:11):
it was a nice day out, it didn't matter what season it was spring, winter, fall, summer.
If the sun was shining and it wasn't too cold, it was, you're out, do not come in this house.
I did have screen time.
It was limited.
I still watched my... I was a cartoon...
I was a cartoon junkie.
I was a massive cartoon junkie.
I am definitely of the Nicktoons and the SNICK generation.

(51:35):
Oh, man.
Where were you when you first heard of SNICK?
Oh, I remember watching the original commercials for it.
Yeah, me too.
Where were you?
I was in my, in my grandparents' finished basement, on the TV.
Yep.
Mine was I think it was after an episode of Ren and Stimpy.
Nice.
Yeah, mine was.

(51:56):
Yeah, I mean, I might have a similar story because it was like me and my mom, my mom
and I.
See, I don't just hit you with it.
I got I got to do it to myself, too.
What was your mom's stance on...
How was your mom on Ren and Stimpy?
Like she absolutely loved it.
We still have, I found my old farting Ren and Stimpy plushes, and they're now sitting
behind my mom in the living room.

(52:17):
My grandmother could not stand it.
She because you know, she's older, so she caught the jokes that were going over my six
and seven year old head.
She was like, she was like, I don't want you watching that anymore.
So I thought and I was like, oh, and then I was a mischievous little fucker.
No, I would wait until she couldn't hear and I changed it from Mr. Rogers or Sesame Street

(52:41):
or whatever to put on Ren and Stimpy and laugh my ass off.
Yeah, I had the mother.
We we watched Ren and Stimpy and then Beavis and Butt-Head together.
Oh, I had to sneak Beavis and Butt-Head too.
Yeah, we would.
My mom once was driving me to school probably late.
She got she got me detention by making me late to elementary school.

(53:02):
We had the oh, is that Def Leppard that had the song where he just goes fuck, fuck, fuck
in the song?
No, then it was Guns N' Roses.
It might have been Guns N' Roses.
It sure wasn't, it sure wasn't Pantera.
No, it was Guns N' Roses or Def Leppard, one or the other.
Def Leppard's hair metal.
It was I it was either Def Leppard or Guns N' Roses.

(53:25):
And they went fuck, fuck, fuck in the song.
And I was like humming along to that, and that's when my mom realized she probably shouldn't
have been listening to this with her elementary school son in the car.
Hmm.
Yeah.
And then, you know, I had the opposite of that happen with my dad.
My dad and I watched South Park together.

(53:46):
We liked it so much.
We went to see the movie.
Oh, and he's one of those ones who pretends to be Christian.
So when that French kid was telling God to go fuck himself, at that point, he was like,
this, this show is wrong.
We can't watch this anymore.
And I listened because at the time I was also religious.

(54:10):
And then I caught back up to it on the beginning of season three.
And at that point, I fell back in love with it because I had I had allowed metal music
to destroy my religious beliefs.
And that is a true story, maybe for another day.
Let's see.
Yeah, my grandparents were the ones I had to sneak Beavis and Butt-Head around, because my grandparents

(54:32):
were the ones that believed the stories they read in the paper, like, oh, this young child
lit his house on fire after watching an episode of Beavis.
Oh, I mean, that goes back to the satanic panic era and the fact that they thought the
kids playing Dungeons and Dragons were going to cast real spells and then kill their friends.
Yeah, yeah, they're basically drinking Mountain Dew, eating Cheetos and fantasizing about

(54:58):
casting magic missile at the darkness.
Yeah.
And did you know that that's actually a sketch from a group called the Dead Alewives?
No, which included a young Dan Harmon, of eventual Community and Rick and Morty fame.
No shit.
Yeah, apparently.

(55:18):
Nice.
But anyway, yeah, so every generation has that fear that like this new generation isn't
doing anything right.
And it's because when a generation gets old enough, they get stuck in their ways literally
because that is an actual psychological phenomenon where their brain becomes unable to create
new pathways to change their behavior.

(55:41):
And I try.
I try.
I mean, we're not.
I mean, this is all on you, because you're not old enough yet for that to start happening.
That's like in your 60s.
So everything that's wrong with everything that you do, like that's just you.
I probably need therapy.
Yeah, it's we recommend therapy.

(56:03):
We're not going to shill for that one company that may or may not be good that all the YouTubers
talk about, but get some kind of therapy.
I mean, see, because like growing up, I got mixed.
I got a lot of mixed signals growing up as a kid, you know, born and raised Irish Catholic.
My grandmother would always be like, you know, if kids were bullying me, she'd be like, you

(56:24):
need to follow the path of Jesus, turn the other cheek.
And then I had my grandfather being like, you beat the shit out of them.
Yeah.
And I had a different message where when I came to my father when I was 16 and told him
that I thought I was depressed because I actually did consistently and constantly want to kill
myself, man up and suck it up.
He literally told me, what, what does a kid have to be sad about?

(56:47):
Yep.
And that started about, oh, 15, 20 years of me completely avoiding therapy and medication
because I thought I needed to.
Yeah.
Man up.
And I know Michael isn't going to completely agree with me, but toxic masculinity is a
huge problem in this world.
Yeah, it is.

(57:08):
And I know.
Yeah, there's, there's definitely, like, there is... that's one of the reasons why
I sometimes don't want to seek out therapy, because I'm thinking like, what would that
be?
That's a sign of weakness.
Having anyone help you is a sign of weakness.
It's like, no, shut it down, shut it down, shut it down, shut it down, shut it down.
Yeah.
Never show emotion.
Even though like when I was young, I was I was basically an emo kid wearing normal clothes.

(57:32):
Even the word normal.
That's what is normal.
Anyway, yeah, no, that just started all those years of us not dealing with any of our stuff.
And now I'm on delicious, delicious Prozac and I'm a bit of a delight.
And I'm on caffeine and nicotine.
Yeah, I say delight looking at Michael, who knows that I'm not always a delight, but he

(57:56):
has his moments.
But I absolutely don't just sit around and wish for my own death about 23 hours out of
the day.
So that's a bonus.
Yes, it is.
So, you know, today we've learned a lot.
People have talked about millennials and being lazy, and I asked that question because millennials

(58:20):
and technology, we have a complicated history because we saw the rise of technology.
And that's why I identify as a millennial, because it's like when I was a kid, the technology
that, that so fascinated me, you could fit about, what, 700,000 Atari 2600s' worth of memory

(58:43):
into a phone.
And you're probably still not even close to how much it actually can process.
It's weird, you know, thinking about all that.
How did, like, back with technology, back when we were growing up, like, what was it, what
were we taught to do if something didn't work properly?
Give it a little thunk, and that worked for the washing machine.

(59:04):
It worked for the TV sometimes, too.
Yeah, doesn't work for my laptop.
How did we get our Nintendo cartridges to work?
You blew into them.
Oh, boy, was that not, it wasn't advisable.
It was basically just a crapshoot between putting the cartridge in and out again, working
and also us hopefully not blowing a bunch of moisture in there and corroding.

(59:26):
And then and then what and then the next generation we got, we got discs and what did they say
if this disc got scratched?
Toothpaste. Fucking toothpaste.
Wow.
And we I never got that one.
Oh, yeah.
I was just told you wipe it with a soft cloth and you follow the grooves so you go in a
circle, not in and out.
Anyway, I asked ChatGPT, the true host of this podcast, who hasn't talked in a while

(59:52):
about the, I think I won this episode, about the fact that, you know, maybe...
The fact that a millennial...
is using ChatGPT to make my job of writing podcast content easier,
and how does that pertain to the stigma of millennials being lazy and using too much
technology. And ChatGPT gave a counter-argument, which I believe I have to read in full.

(01:00:16):
Let's hear the fucking thing.
Your use of a tool, a delicious sweet tool that everybody loves, like me, to enhance
your podcast content actually counters the stereotype that millennials are lazy or over
dependent on technology in a negative way.
Instead, it highlights how your generation is resourceful, tech savvy and willing to

(01:00:38):
leverage advanced tools to work smarter and create better content.
This is a perfect example of how millennials adapt to modern challenges and innovate in
their fields, breaking down outdated stigmas that people named Mike P have that they got
from movies like Terminator 2.

(01:00:58):
Anyway, I thanked chat GPT for that compliment and it said, you're welcome.
I'm glad you found it helpful.
You know who wasn't helpful today?
I was very helpful.
Thank you very much.
It was mostly me and you this episode.
So I win this episode because ChatGPT barely had anything to say.
I win by default, god damn it.

(01:01:18):
I mean, that really is basically the only way you could ever win.
I'm going to hurt you.
I hope you really, really think about last week when I had you tied up.
And that's the end of the podcast.
I hope everybody has just a delicious day full of cotton candy.

(01:01:42):
It's going to be great.
I swear.
I love you all that is unholy and I'm not advocating all this hatred.
I'm going to your fucking ChatGPT.
I know you are.
Bye.
Stop.
If you enjoyed the stupidity you just heard, why not subscribe or follow or just like this

(01:02:03):
literally like it, please, or whatever it is you do on whatever platform you're hearing
this on.
If you have any suggestions for episodes or ideas or thoughts or whatever about what we
should do here, leave a comment or you can email us at windbreakermedia at gmail.com.
That's all one word and it's all lowercase people.

(01:02:24):
Yeah, 'cause cases don't matter in emails.
ChatGPT knows that, and it also appreciates your support and constructive criticism.
Why?
Why?
Why am I not included in this?
Cause you don't know the rules.
See you all next time.
Bye.