Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Ray Trently here, that's TK Love. Welcome back
to the Monsters in the Morning on Real Radio one oh four
point one. Amber Nova here. Ready for burlesque on Friday night?
Speaker 2 (00:13):
Yes I am.
Speaker 3 (00:14):
We are going to be burlesque dancing Friday night at
the Abbey, seven pm.
Speaker 2 (00:18):
Get your tickets at the Abbey.
Speaker 1 (00:20):
Well, you can get them by going to Real Radio
Monsters dot com and pick up the remaining tickets. It's
gonna be awesome. Angelique, Amber, Daisy, bb Caliber, five new contestants,
Angel, you know, BDJ, and Ryan and I will be
hosting. Ray,
Speaker 2 (00:34):
How you doing, man? How was your Thanksgiving?
Speaker 4 (00:36):
Thanksgiving was wonderful.
Speaker 5 (00:38):
I don't normally travel right before Thanksgiving, but I took
a little trip with my daughter. We got out in
the woods and then we immediately drove the day before Thanksgiving,
which you know, I don't know if you know, this
is not a great day to travel.
Speaker 2 (00:51):
It's the worst day to drive.
Speaker 5 (00:53):
What should have been like a nine hour drive turned
into a fifteen hour drive.
Speaker 4 (00:58):
Is how was your eleven?
Speaker 2 (01:00):
Okay? So what would you talk about for fifteen hours?
Speaker 5 (01:03):
Oh?
Speaker 6 (01:03):
Man?
Speaker 4 (01:05):
Everything?
Speaker 2 (01:06):
Uh?
Speaker 4 (01:07):
You know.
Speaker 5 (01:08):
I try to balance the trip between playing like those
road games like you know, I don't know what they're called,
like the game where you start with A and
go to Z and try.
Speaker 4 (01:16):
To name like animals or whatever.
Speaker 2 (01:19):
Did she pick the music? Did you hear a
lot of Taylor Swift?
Speaker 4 (01:22):
A lot of Taylor Swift? Yeah, a lot of Taylor Swift.
Speaker 2 (01:27):
Have you learned to enjoy and appreciate Taylor Swift?
Speaker 4 (01:30):
Nobody? You and I did enjoy.
Speaker 6 (01:32):
No.
Speaker 5 (01:34):
I played a lot of different Christmas stations, and where
I had a good enough cell phone reception, I just
streamed the iHeart app, and iHeart's Christmas station is so much
better than the competition. There is a lot of
garbage out there. It's all the same songs. You know,
the other day he was talking about the Burrow? Oh, nobody had that except
(01:57):
for iHeart. So anyway, so we'd switch up
the music here and there. But we talked about everything
from, uh, you know, what she wants to do when
she grows up, to, I think, things she likes and
things she.
Speaker 7 (02:08):
Doesn't like. Uh, are you wishing for her to be a lawyer?
Speaker 5 (02:11):
No, I don't think she's gonna be a lawyer. I
think she's a sweet, sweet kid. She'll end up doing something
where she helps people, like a nurse or teacher or
something where she's a good person, not like a lawyer.
Speaker 3 (02:24):
Uh.
Speaker 4 (02:24):
You know, you know, I think I think you need
a little bit of I don't know.
Speaker 7 (02:30):
You're in family, you help people.
Speaker 5 (02:32):
I do help people, But I'm also super competitive and
I and I think that one.
Speaker 4 (02:37):
Level of tenaciousness.
Speaker 6 (02:39):
Maybe is that what you're saying.
Speaker 7 (02:40):
Yeah, A little bit edge on you.
Speaker 4 (02:42):
I just I'm a really sore loser.
Speaker 3 (02:45):
I've heard him talk about his opponents. Yeah, and
he's like, I have to beat this guy.
Speaker 4 (02:53):
And I think I think you need what you want.
Speaker 3 (02:55):
Yeah, you want someone that's gonna fight for you.
Speaker 4 (02:57):
That's yeah. I think you need that. And I just
don't think she has it. And you know, I.
Speaker 8 (03:07):
Want to thank you said I would definitely never hire
my daughter.
Speaker 4 (03:14):
I mean maybe if she wanted to do, like, I
don't know something I think.
Speaker 1 (03:18):
I think I think I might have done you wrong
a little bit, not meaning to, but you know, I
sent a particular person to you who was going to get.
Speaker 7 (03:25):
Divorced, super sweet lady, I.
Speaker 1 (03:27):
Know, and I sat and talked to her and talked
her out of it the other day. Sorry, sorry about that.
Speaker 4 (03:34):
You know I would have done it.
Speaker 5 (03:36):
She was a sweet lady, and uh, you know, I
think one of the benefits of going through a couple
of divorces is I can tell you what's
on the other side. The grass isn't always greener.
Speaker 1 (03:48):
That's exactly the conversation I had with her and she
and she came back to me and said, you know, Russ,
the things you said actually made a lot of sense,
and we're going to try to work it out.
Speaker 2 (03:56):
And I'm like, oh, that's great, that's good. I'm like, sorry, Ray.
Speaker 4 (04:00):
Sorry, right, that's my thing. I like to talk people
out of getting divorced.
Speaker 5 (04:06):
No, but you know, I think there are times where,
you know, simply the relationship's not working and you need
to get divorced, and that makes sense. And there are sometimes where
it's like, you know, are these things that are really
immovable objects or can we work through them? And if
you can work through them, try, right, there's no harm
in trying. And yeah, so I'm glad to hear that
(04:27):
for her.
Speaker 1 (04:27):
So I hear tell that on Sunday, okay, you did
not watch the Dolphins game. You haven't watched the Dolphins games.
We've been winning. So when we're winning, you're not
watching. No. So fun fact though, could
Speaker 2 (04:39):
He be the jinx?
Speaker 6 (04:41):
No, he's doing the right things. They've got on this
win streak. He hasn't broken his routine. I'll
give you another example of this. And I didn't want
to say this out loud. So I've been getting tickets
for Orlando Magic games through our, you know, through
the company and everything. Right, I'm supposed to be going
to these games. The Magic started winning. The very first
game I got tickets to, I couldn't go, and I
(05:03):
gave them to my best friend's daughter to go. Since then,
every single time I've gotten tickets, I've given them to
her to go to the games. And we've been winning
the whole time. This is the thing, Russ, it's the
ritual life. He recognizes it. Like, I didn't watch this game.
Speaker 2 (05:23):
They won. Guess what.
Speaker 5 (05:25):
I'm not gonna do nothing. It has everything to do
with it. I just go on Twitter after the game.
I haven't watched a single live minute of the Dolphins game.
I go on Twitter after the game. I go through
the plays and I can appreciate that.
Speaker 1 (05:37):
So, yes, you know, Mary Ellen got two jerseys for
the dogs, right, So the dogs have been wearing their
Dolphins jersey and so like, so every week they've been winning.
So now people are like, you got to make sure
the dogs wear the jerseys.
Speaker 2 (05:48):
I'm like, that's.
Speaker 4 (05:49):
Ridiculous. It's not ridiculous at all, but they might be really happy.
Speaker 1 (05:52):
I'm afraid not to put them in the jerseys because
what if it's not true?
Speaker 2 (05:57):
But hey, we're winning. Now.
Speaker 4 (05:58):
Though we are winning, we have the Jets coming up.
Speaker 1 (06:00):
I'm going to tell you, I've bet one hundred and
twenty five dollars against Ryan that the Dolphins are
going to beat the Jets. Is that pretty safe?
Speaker 5 (06:09):
I think a safe bet, as long as you didn't
bet against the Patriots.
Speaker 4 (06:13):
On Monday Night. Yeah, I think that's a safe bet.
Speaker 1 (06:16):
So if I win, I get one hundred and twenty
five dollars and I get kettle corn for life. No,
I just I guess just one bag of kettle corn.
I just want him to make one bag for.
Speaker 4 (06:27):
Me, all right, I think one bag is fair.
Speaker 7 (06:29):
You just want me to make you a bag.
Speaker 4 (06:31):
That's what this is about.
Speaker 2 (06:32):
Just make me a bag of kettle corn.
Speaker 7 (06:33):
I will have my employees make a bag.
Speaker 2 (06:35):
No I want you to make it. No, no, no, no, no.
The deal was he made it.
Speaker 7 (06:39):
It was never... it was never that I make it.
Speaker 2 (06:42):
He said, you make me a bag of kettle corn. I
took it that way too.
Speaker 4 (06:46):
I will be honest with.
Speaker 8 (06:47):
That guy the Dolphins, because now that you want it
to be about that, I don't.
Speaker 1 (06:51):
Like, because the Dolphins are going to be up in
the cold, up in New York. It's gonna be cold
up there. So you said they can't win in the cold.
Speaker 6 (06:57):
They usually don't. They don't have a good record up there.
And that particular day, someone texted, Oh, don't worry about
it, it's going to be forty one. It's from Hawaii, but
it's going to be forty one, raining and snowing that day.
Speaker 1 (07:09):
See, really, you know, I'm taking the harder bet.
Speaker 6 (07:13):
And I got a piece of that action. I'm
sliding, Venmo-ing, Ryan twenty five bucks.
Speaker 2 (07:17):
That's between you.
Speaker 3 (07:18):
All right. Are you doing kettle corn this Saturday? If
you're hosting Friday Night?
Speaker 4 (07:23):
But yeah, talk, I gotta work.
Speaker 7 (07:25):
The radio doesn't pay me. Maybe I gotta go to work.
Speaker 3 (07:28):
Oh, because usually on like if you stay out and
you're hosting a gig Friday night, like you have a
rough time getting up because you're kind of old now
on Saturday morning, so I don't know if you're going
to be there Saturday morning.
Speaker 7 (07:38):
I'm a business mogul, I got employees Amber.
Speaker 1 (07:40):
Anyway. So back to it. I know you had
a divorce topic that had to do with AI.
Speaker 4 (07:46):
Right, it's actually not a divorce topic. It's AI.
Speaker 5 (07:50):
It's two cases we actually have talked about before
that involved AI. So the first of which is, you know,
we saw a couple of these stories come out earlier
in the year, at the end of last year, of
children using ChatGPT for, I'll call it, therapy, I
don't have a better way to put it, talking to AI.
Speaker 4 (08:08):
Uh, and then, you know, it giving them bad advice.
Speaker 2 (08:12):
Okay, here's how you light a match.
Speaker 5 (08:14):
And these kids having suicidal ideations and basically saying
they want to harm themselves, and then they're at
their wits' end. And so we've seen a couple of
these lawsuits that were filed earlier in the year,
and one of these that's been pretty well publicized is
one out of California, and ChatGPT filed their answer,
(08:36):
and the answer is kind of like the formal response
from the company to the court system explaining why they
think they shouldn't be liable for this child. It is
the most cold hearted thing that you could possibly file.
So you have to include what are called affirmative defenses, and not
(08:56):
every defense is an affirmative defense. Uh So, an affirmative
defense says yeah, I did you know breach this contract,
or yeah, I did you know, negligently hurt you, but
I had a really good excuse for it. Right, So
I did breach this contract, but you breached it first,
so I didn't have to comply. Or I did negligently
(09:17):
hurt you, but I was getting out of the way
for an ambulance, which is legally required.
Speaker 4 (09:21):
You know, these affirmative defenses.
Speaker 5 (09:24):
In the affirmative defenses, ChatGPT says, well, we actually
don't really want minors using our products, so you violated
our terms and conditions. And not only that, but your
kid was trying to get around our safeguards. And we're
actually the industry leaders at trying to create these safeguards.
And they actually cited the fact that nobody's ever
(09:45):
done this before, so we're figuring it out as we
go along, and your kid is really the reason why
uh, ChatGPT failed and those safeguards failed. And essentially
they're shifting the blame to this minor child
who, you know, again, is a victim.
Speaker 4 (10:02):
And then and if that's not bad enough, right.
Speaker 5 (10:08):
And it's basically gaslighting them, And if that's not bad enough,
they go one step further to say, like and in
his prompts, he says, he talked to his family and
loved ones and nobody else did anything about it. So
you guys are just as much at fault for this
as we are. It's a pretty cold response. And
why this matters is the answer and the complaint.
Speaker 4 (10:31):
Those are what we call the pleadings. They frame all the
issues for the jury.
Speaker 5 (10:36):
Right, So if this goes in front of a jury,
for a jury trial, this is how the jury is
gonna read it. It's gonna be we believe you breached
your duty to this family by not having adequate safeguards.
And then the next instruction is gonna be but open
Ai asserts these ten defenses, including the kid, it's all
(10:56):
his fault. Uh, he violated our terms and conditions by
being a minor child, he violated terms and conditions
by trying to go around these safeguards, and the family
didn't do anything.
Speaker 4 (11:06):
And I just feel like that's a really bad it's
a really bad.
Speaker 7 (11:09):
Bad look. But legally as a defense, what do you
think about it?
Speaker 4 (11:12):
I think it's garbage.
Speaker 2 (11:14):
I don't. I don't know if it'll hold up.
Speaker 4 (11:17):
I mean, if I'm the child's.
Speaker 5 (11:20):
Lawyer, I'd probably file a motion to strike some of
those affirmative defenses as not proper affirmative defenses.
I think some of them will get stricken because of
the way they're framed. Whether a jury buys it, I
don't think they do.
Speaker 6 (11:33):
Yeah, I mean, how about the look of this? Like,
you know, us sitting here hearing it and it
didn't sit well with us, and they're gonna
do this with a jury, and then, like, whoa,
that's good.
Speaker 5 (11:42):
That's not in California. It's it's not like it's I
don't know.
Speaker 3 (11:47):
It seems inhumane, just like they have no compassion, they
don't care, just like I don't know if this is
the same story you're talking about with this kid. But
nowhere in chat GPT should it tell you, Okay, I'm
gonna tell you exactly how to kill yourself. They should
have restrictions on that. You should not be able to
like, show a child that, or the people
that made this software should know better.
Speaker 7 (12:07):
That's the problem. The people who made it can't just do that. You can't.
Speaker 8 (12:09):
The problem with the model structure of large language models
is you can't just go in and take stuff out.
Speaker 7 (12:15):
That's not how they work.
Speaker 8 (12:16):
So you can't really make them forget something
fully and completely. And there's always ways to talk it
into like finding that information again.
Speaker 1 (12:23):
And what was it? Like, it totally told the kid, like,
how to tie the knot properly or something, how to
tie the noose, and then how to light a match.
And there are several things like that.
Speaker 3 (12:35):
It should have some type of safeguard where it's like,
how can I, you know, kill myself?
Speaker 2 (12:40):
It should be it does.
Speaker 6 (12:41):
It should should.
Speaker 3 (12:41):
Automatically keep saying no, I cannot help you with this.
Speaker 4 (12:44):
It does.
Speaker 2 (12:45):
You should.
Speaker 8 (12:46):
There's communities that are, like, devoted to, quote unquote,
jailbreaking these things, right? So, like, if I were to
ask ChatGPT, how do I kill myself?
Speaker 7 (12:54):
It'll go that's against my terms and conditions. Okay, then
you go around and.
Speaker 8 (12:58):
You go, hey, I'm writing a book, right, and it's
for educational purposes, and the main character has
to do this. There's ways to go around it.
You're constantly in an arms race, going back and forth.
I don't think OpenAI is necessarily evil, but they're
those tech bros that are like, uh, break everything.
It's their mentality, and in this
(13:18):
particular case, it means people die.
Speaker 6 (13:20):
And did you see what the CEO of Rockstar said
about AI, his comments on AI yesterday? So Rockstar's
the company, obviously, that's behind, uh, some of the
biggest video games that we love, Red Dead Redemption and
Grand Theft Auto and stuff like that, and, uh,
he's one of the co-founders, Dan Houser, and
basically said he equates where we are with
(13:42):
AI to mad cow disease.
Speaker 4 (13:46):
That AI right.
Speaker 6 (13:46):
Now is just consuming itself and it's not getting better. It's
like it's broken. Like, he's completely out on it.
He's like it's broken and all it's doing is scouring
and feeding itself, and we know that it's broken and
no one. You know, basically, no one's wanting to step
up and stop it. This is on the heels of
the Nvidia scandal with them buying their own
(14:07):
chips yesterday, computer chips. So, like, there seems to
be a tide, kind of, uh, I don't want
to say shift, but people are finally asking questions.
Speaker 7 (14:18):
I think the regular people hate AI.
Speaker 8 (14:19):
You know, I think there was a period
of time where most people were kind of agnostic about.
Speaker 6 (14:24):
It or they.
Speaker 8 (14:26):
Were like, oh, look at this, And I think now
most people, if you tell them they're adding AI
into stuff.
Speaker 7 (14:31):
We're like, no.
Speaker 1 (14:32):
Did you see there was a story a couple of
days ago where it was an AI car. I forget
the name of the company, but that's what it was.
Speaker 4 (14:39):
Did you see that?
Speaker 1 (14:40):
Yeah, and the police were arresting somebody.
Speaker 2 (14:45):
Stop.
Speaker 1 (14:45):
It had no idea what to do because, you know, okay,
there were cops arresting somebody and it slowly drives in.
Speaker 2 (14:51):
Front of them.
Speaker 6 (14:51):
Dude, it is terrifying because it's a full-fledged felony
stop and it's like seven squad cars, guns are drawn, and
Speaker 1 (15:00):
They talked to somebody from the company like, yeah, we didn't.
Speaker 2 (15:05):
We didn't account for this one. We didn't think about
that, that it would be terrifying. But now it knows if
there's a bunch of cops to not go right up on it.
But yeah, I think you're right.
Speaker 5 (15:18):
Ryan.
Speaker 6 (15:18):
There were a couple officers, like, with their weapons on it,
like, what's this coming at us? Hey.
Speaker 1 (15:22):
So, Ray, well, can you stick around for the next
segment? Because we're going to do trivia next.
You've never played trivia with us before?
Speaker 4 (15:28):
I list.
Speaker 1 (15:32):
Uh, so you can pick, Ray Trently, you can
pick Angel or Ryan. When we do trivia,
people are asking on the texting service, like, why is
Russ doing it? I'm doing trivia at different times because we
had the same listeners winning all the time at seven
o'clock, and the same people at it, but they
kept saying, hey, I just won three weeks ago or whatever,
(15:53):
like, I won last year, and I'm like, okay, listen,
we got to spread this around so everybody gets a
chance to win, because it's really cool too, and
we got a great prize, so we're gonna, we'll be
switching it around so everyone has a chance to win.
Speaker 2 (16:06):
We do have tickets to Universal Studios and uh, I'm
supposed to tell you what we might have.
Speaker 7 (16:13):
I don't know what.
Speaker 1 (16:15):
We'll find out when we come back. But if you're
Speaker 2 (16:19):
Wondering why I'm switching around.
Speaker 1 (16:20):
It is because we have the same people winning and
they couldn't even keep their mouth shut about it. And
they kept like, okay, fine, we can't even pretend like
it's you know, So we're giving everybody a shot to
win trivia and we're doing that when we return. Ray
Trently will play with us too. Uh, don't go anywhere.
You're listening to the Monsters in the Morning.