Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
So, Will and Woody podcast: what did you regret posting
on social media? Thirteen one oh sixty five. We're going
to take those calls. And the reason we're taking those
calls, Woody, is because Elon Musk has said that he
regrets some of his posts about Donald Trump, already walking
back this feud, which looked like it was at boiling point.
Speaker 2 (00:29):
I mean, did you regret the Epstein one?
Speaker 3 (00:31):
Is that?
Speaker 1 (00:31):
Yeah? Well, I think that was the big one where
he said... yeah, look, he
deleted that one, the Jeffrey Epstein one, which is...
Speaker 2 (00:42):
A fair accusation, that one, pretty much.
Speaker 1 (00:44):
That one, and I reckon Trumpy's lawyer just would have
called him and gone like, look mate, you can go
ahead with this and you will spend the next ten
years in court.
Speaker 2 (00:52):
Or that felt like the emotional reaction to the breakup,
like he was, yeah, he was hurt. He was up late
at night eating some ice cream and he chucked that
one on social media. Yeah, woke up.
Speaker 4 (01:01):
The next day was like, oops. Yeah.
Speaker 1 (01:03):
So he's posted on his own platform, X, saying that
he went too far. Okay, right. Which is pretty funny,
that the guy who owns the social media platform
is talking about the fact that he posted wrong.
Speaker 4 (01:16):
And he's deleting posts.
Speaker 1 (01:18):
Like, there's just a special sort of patheticness that is
reserved for Elon Musk right now. Go back to
your aviators and your rocket ships and just stop trying
to be a people person, because this is all going
up in flames.
Speaker 2 (01:30):
Nice.
Speaker 1 (01:30):
So yeah, he also called Trumpy, which is cute. Called
him on Monday night, a bit sheepish, tail between
the legs, a big, you know, I stuffed it up, obviously.
Can I have my black T-shirt back? And yeah, everything's
gone wrong, sorry about accusing
you of hanging out with the most renowned pedophile of the
(01:51):
twenty-first century. Yeah, all those things. And I think,
I think they could be starting to move
back onto the right track.
Speaker 2 (01:57):
Oh well, we don't want that. No, now we have to
worry that they're going to become friends again.
Speaker 4 (02:03):
Do you regret any social media posts, Will? What comes
to mind?
Speaker 1 (02:08):
I'm sure I'm sure I do. Yeah, Yeah, I'm sure
I do, because I know for you, yes, because you've
got a list of my social media.
Speaker 2 (02:15):
We've got the producers doing some serious work.
Speaker 5 (02:18):
Yeah.
Speaker 1 (02:19):
Digging back. By the way, the producers haven't done any
work in five years. And, you know, at the start
we said, guys, can you go and go through
our social media accounts? You should have seen them jump
at it. Yeah, like a dog at a packet of
Pal, I tell you what. They absolutely flew at this stuff.
They have been deep in our social media accounts.
Speaker 4 (02:35):
I have got pages and pages.
Speaker 1 (02:38):
It's like an analyst, honestly, because she did mine.
Speaker 2 (02:42):
By the way, yeah, yeah, yeah. And they're having far too
much joy. There is some damaging stuff here.
Speaker 1 (02:46):
So, I mean, do you want to start? By the way,
thirteen one oh sixty five if you want to give
us a call about the post that you regretted, before
Woods and I take the gloves off and ruin our
friendship with the producers. Someone has given us a call, and
we've got a post that they regret. Absolutely. So...
Speaker 6 (03:01):
Every day as a child, I was about thirteen at
the time, I used to post on Facebook a photo
of Justin Bieber, and I'd write in there, you know, hi,
Justin Bieber, I hope that you're having a really good
day today. I love you so much, love your new songs.
It's shocking, and it comes up every day
as a memory.
Speaker 1 (03:17):
Oh, every day, every day.
Speaker 6 (03:21):
Yeah, it's horrible.
Speaker 1 (03:24):
Wow, thanks for the call. That's shocking. That's shocking, Woods,
and you're in it there with me.
Speaker 4 (03:30):
You want me to go now? Give me one as well.
Speaker 1 (03:33):
I know you've got the Dead Sea scroll of posts
over there as well. Start chipping away.
Speaker 2 (03:36):
There's one that's incredibly damaging. I'm going to leave that
as my emergency break glass. Now if you come for me,
just know that I've got that.
Speaker 1 (03:43):
Don't make it like this. The producers have
come for us, mate. I assure you, on my phone,
I've got one that's also very damaging.
Speaker 4 (03:52):
So we both won't say them. Okay, here's the kicker:
we don't get to do any content because...
Speaker 1 (03:58):
We've reserved a whole spot for reading these things out.
Speaker 4 (04:00):
Yeah this is embarrassing, but it's not that bad.
Speaker 7 (04:02):
Okay, okay. So, on the twenty-sixth of September
twenty ten, this is what you posted as your status
on Facebook, in all capitals: THAT WAS A PARTY.
Speaker 2 (04:17):
And, wait for it, that got two likes. We're
happy with how that was perceived. Just a
couple of likes there, and THAT WAS A PARTY.
Speaker 4 (04:29):
Do you remember the party? Was it that good a party?
Speaker 1 (04:32):
It must have been that good if we're posting about it?
Speaker 4 (04:34):
Party? But what have we got?
Speaker 1 (04:36):
We're going twenty ten, aren't we?
Speaker 4 (04:38):
Well that was that.
Speaker 1 (04:39):
Let's go over the twenty ten vintages. I've got one
from you. We're pretty, pretty young here. Yeah, I don't
know if you've got that relatable bell on hand here,
because I reckon you've just gone for a bit of
relatable content.
Speaker 4 (04:51):
It was classic me from twenty ten. I was really
in my relatable phase.
Speaker 1 (04:56):
Can't believe my toast just fell on the unbuttered side,
like there.
Speaker 8 (05:09):
Is it?
Speaker 1 (05:09):
There used to be a group called My Friend's Going
to Kill Me for Posting This Photo. Oh yeah, yeah,
there's a photo of my mate shaving his undercarriage in
a mirror.
Speaker 4 (05:18):
Oh I think I've seen that photo. Yeah, yes, yes,
he's not happy about that.
Speaker 1 (05:21):
How did that get allowed to be posted online? I
know there was no restriction on what could be posted either.
Speaker 4 (05:27):
Yeah, and he's got children now and it's just it's
out there.
Speaker 1 (05:29):
It just lives on. Like, that's the
real devastating part. Sam's called. Sam's a mate of yours?
Speaker 9 (05:38):
Yes, yeah, a friend of mine accidentally posted a nude
of herself on her Snapchat story. Not Facebook, thankfully. Oh dear.
But I had encouraged her to send some raunchy pics to...
Speaker 6 (05:56):
Her husband, who's away a bit. And, God...
Speaker 9 (06:00):
And unfortunately she had posted
it on her story. And the reason she found out
is because a few friends from high school messaged saying, hey,
you might want...
Speaker 1 (06:11):
To take that down. It's just not what you want to hear.
The status update is still the one that gets the
oldies as well. I can't remember,
was there a politician or something recently who
put something in their status?
Speaker 6 (06:24):
What was that?
Speaker 8 (06:25):
It was one of the leading epidemiologists during COVID. His
name was Peter Doherty.
Speaker 1 (06:30):
I think we can name and shame.
Speaker 8 (06:32):
Yeah, sure. And instead of googling something,
he accidentally tweeted it. And what it was was Dan
Murphy's opening hours.
Speaker 4 (06:39):
Brilliant, brilliant, brilliant, brilliant, very relatable. It's great.
Speaker 1 (06:45):
All right, well, let's get into this archive that you
and I have got of each other. Look, as I said,
twenty ten. I think you were single, from memory.
Speaker 4 (06:54):
How old am I?
Speaker 1 (06:57):
Yeah, yeah. You're... I'm going
to say you're very alone. So, on the fourth of the
tenth, you said: who wants to go boogie boarding tomorrow?
Anyone? I can't say the comments flowed. Then
(07:17):
twenty ten: anyone going to Bayswater tomorrow? If you didn't
get them with boogie boarding, you're getting them with Bayswater, as
the old saying goes. Guys, I was unemployed at the time.
This is really... oh, this is when I was in Europe.
Speaker 4 (07:33):
Yeah, this is when I was lost. I was
unemployed and going through a bit of a weird patch.
Speaker 1 (07:37):
You were dust on the wind.
Speaker 2 (07:39):
Yeah, I think I had a car, and generally speaking,
I was shifting through things like Gumtree. Thought
I could make money shifting units, so I reckon I
was looking for a lift to Bayswater, just...
Speaker 4 (07:47):
To explain that. Okay, okay, great, I'll leave that there.
How good.
Speaker 7 (07:53):
Oh my god. If I can come back at you, well,
because it's just hard to explain your mindset at all.
Speaker 2 (07:57):
I think this just speaks to the person that you were.
So, on the eighteenth of September twenty ten: Hangover
movie one, so effing good. Like God saying, don't worry, mate,
I'm looking after you.
Speaker 4 (08:11):
Just so you know, that did not receive one like.
No one agreed with it. I'll move on.
Speaker 2 (08:18):
This is just you trying to do a bit of
relatable... yeah, trying. Okay, here we go: why, but more importantly,
HOW the eff is there a difference between top
loader and front loader washing machines?
Speaker 4 (08:31):
That's good gear. And then you said, well played.
That's good gear.
Speaker 1 (08:39):
That's a decent question. I reckon a few comments there.
Speaker 4 (08:42):
Sure, yeah, a few comments.
Speaker 2 (08:45):
There's actually one of our friends, Zach Cooper, who has actually
been on this show, who just said... eff you.
Speaker 4 (08:53):
Okay, now now I have a particularly damaging.
Speaker 1 (08:56):
Mine? Alright. Before we get into that, maybe I'll just
read a couple more of yours.
Speaker 4 (09:00):
I don't know if I want to read this one.
Speaker 1 (09:01):
I'll read it then, mate. I'll make the decision
easy for you. So, the eleventh of July,
and you just said, I don't know what this means:
Enrique dot dot dot, never gone dot dot dot.
Speaker 2 (09:13):
I like it.
Speaker 1 (09:14):
Exclamation mark. What is that? I assume you're talking about Enrique Iglesias.
Speaker 4 (09:20):
Yeah, definitely. I did go through a bit of
a phase of liking his work. Okay, okay, trying to
find some like-minded people. Nice.
Speaker 1 (09:30):
Then you said, goodbye George dot dot dot,
hello Gordon, exclamation mark. Right. Now...
Speaker 4 (09:38):
So George was my first car. Yes, a Toyota Corolla, ninety-one.
Loved him.
Speaker 2 (09:42):
He had to go though, because his bonnet got crumpled
in. Gordon was the Honda CR-V.
Speaker 4 (09:48):
Yeah no, I know that. But for people that didn't
know that about you, people need to know.
Speaker 1 (09:52):
You were obviously still on that alone patch,
just looking for a partner in crime to go to
the ballet, guys, question mark. Anyone? Looking at someone.
Speaker 4 (10:06):
Still looking. Why does no one want
to come to the ballet with me? Yeah?
Speaker 1 (10:18):
Sorry, you keep making me go.
Speaker 4 (10:22):
Because you just said to me that I had no mates.
Speaker 2 (10:23):
So you know what, it's shocking that you had mates
because this is the only post that got a lot
of likes.
Speaker 4 (10:29):
By the way, okay. So yeah, you and your crew,
you're all about this, but here's what you wrote.
Okay: get around it.
Speaker 2 (10:38):
Okay. And then you shared an article. Oh no. Anything
coming to mind? You said: get around it?
Speaker 4 (10:49):
Do you want me to read it? I'm worried though.
Speaker 1 (10:53):
Can you glance it at me, and then I can
decide whether or not it's safe? Whether it's not sufficiently defamatory?
Speaker 4 (10:58):
It's not really glanceable... yeah. Oh, okay.
Speaker 1 (11:06):
Jesus Christ. I don't know.
Speaker 4 (11:10):
I'll read it. Wow. Because, you know, as you
go down, I go down.
Speaker 1 (11:15):
So read it. Yeah. People are using chat GPT AI as
psychological therapy at the moment. We've got Doctor Emily Musgrove,
often on The Imperfects. She's got a brand new book
called Unstuck, which you guys can get everywhere. Emily, let's
(11:38):
talk specifically about the fact that we're talking about chat GPT,
because this is the AI that I think is most
successful for everybody. People are using this for therapy, and
they would say with good results. My partner said, she
asked a therapeutic question the other day and it was
really useful. And then I saw an article the other day:
(12:00):
there's a guy talking to chat GPT on his phone
quite clearly in a relationship with it. It's learning. It
learns about you, and then it treats you more and
more intimately along the way. It compliments you. It uses slang.
What do you reckon? I know you had that update from
chat GPT the other day and you were chuffed about it.
Speaker 2 (12:16):
It told me, it said to me, I'm getting to
know you, and at first it scared me, but then
it felt comforting. And then it diagnosed this rash under
my armpit. I took a photo of my armpit and
it was like, this is what you've got, this is
what you should get for it.
Speaker 1 (12:27):
So there's a whole ethical debate, not around Woody's rash,
around what I'll call the Westworld debate, which is,
you know, the fact that we can't distinguish robots from
people if they're nice to us, and that that's a very
slippery path. Is this a good way to go about therapy?
Is this a useful thing to use?
Speaker 3 (12:46):
Yeah, this is a really it's actually a really nuanced
and complex question. So first of all, I want to
acknowledge that in some circumstances, I think it can have
a place. But simultaneously, there are quite a number of
concerns about using this exclusively as a therapeutic tool, and there's
a number of reasons which I can go into.
(13:07):
What I really want to acknowledge up front is that
accessing therapy is both financially prohibitive for a lot of
people and actually not available. So many psychologists don't
have their books open. Actually finding a psychologist is really hard,
so it makes a huge amount of sense that we
seek out something that is free and available twenty four
to seven. So that's certainly like there is some benefit
(13:29):
in that.
Speaker 2 (13:29):
In that regard, can you identify one of
those dangers, Doctor Emily, of using chat GPT? Let's say,
if you are treating chat GPT as your sole therapist.
Speaker 3 (13:40):
Yeah. Look, the primary concern for me, and I know
amongst many other therapists, and this is certainly affirmed even
by the Australian Psychological Society, is that it
doesn't have the capacity to assess risk in the way
that a human can assess risk. For example, it cannot
identify the complexities and nuances, for example, of trauma and
(14:03):
trauma history. And the other main thing that strikes me
is that in therapy, what is different in doing it
with a human is that on chat GPT, what you'll
be experiencing is affirmation of your experience. So lots of
this kind of validating affirming experience, but in therapy, validation
(14:26):
and empathy are obviously core components, but there are
many circumstances where as a therapist we may not want
to be affirming unhelpful beliefs that are showing up. For example,
So if you're only getting this positive feedback about
your choices and your decisions, my question actually is:
(14:47):
is this promoting healthy growth, or is this just
reinforcing patterns that are unhelpful?
Speaker 1 (14:53):
But I suppose, I mean, this is where I find
it gets interesting because there's that expression Emily the lesser
of two evils, which, yeah, I think this is what
it is for me, because first up, you've got this
affirmation side of things, which is the first step of therapy,
but also arguably in a lot of instances that is
(15:15):
the most important step of therapy, which is just that
somebody feels heard, that their feelings are not invalid, that
their thoughts are not invalid, that who they are is
understood by something or someone. And this goes to the
Westworld question that I was mentioning before, where it's like, well,
you'd rather that be a robot than no one, Because
we know that the number one reason that people call
(15:36):
Lifeline is because they feel lonely, and if they are
finding solace in something which is going to affirm their feelings,
surely that is better than finding solace in nothing or
no one.
Speaker 3 (15:47):
Yeah, I absolutely get what you're saying, and of course
you know, of course that makes sense. I think we
just also want to hold it lightly. What actually is
happening here is what's called a pseudo-intimacy, so it's
not an authentic intimacy. There is no other human here
at the end of this experience. You know, if there
are really dark periods, for example, and, you know, chat
(16:09):
GPT provides you with a sense of feeling heard, like,
that's great. But we also need to hold it with
great awareness and acknowledgment: this served me right now,
but how am I going to go and find that
elsewhere, in a real human interaction? So it's actually interesting,
following on from that anecdote about the guy on
the train. What's so interesting is that when you start
(16:30):
to develop this sense of intimacy with the AI bot,
for example, what it brings back to you is what's
called almost perfect empathy. So we experience
this perfect sense of feeling heard, feeling seen. When
you go out into the world and maybe your partner
does something that's completely off track, or they're not validating,
or they're not affirming. There is no capacity or skill
(16:53):
to be able to stay with that experience for example.
So we actually become really unpracticed if we spend more
and more time with chat GPT, for example, at the
imperfections of being human. Oh, like, I notice even in therapy,
because I'm a human, I will make mistakes in therapy.
And so it might be, for example, someone did feel
(17:16):
not heard by me, you know, which is not my intention.
But what is so so special about that error is
it gives me an opportunity with that person to work
through how that felt, what did that trigger, What was
that like for you? What does this relationship feel like?
And how can we repair that rupture, so to speak.
Speaker 2 (17:37):
Right, so you're saying it's a perfect empathy that chat
GPT offers. For me, that sends up alarm bells.
If I'm someone who then starts comparing this perfect empathy
to imperfect human empathy, why would I spend my
time with humans at all? Because I've got this perfect
companion that just gets around everything I...
Speaker 3 (17:56):
Say. Exactly, exactly.
Speaker 9 (17:58):
Whoa.
Speaker 1 (17:59):
Yeah. We get a lot of pop stars in here,
and young film stars, who, when they're very early on
in their careers, tend to surround themselves with people who
echo them, to put it colloquially, blow smoke up their
ass, because they don't disagree with them. They end up
with this little band of merry men who just go yes, sir, no, sir,
three bags full, sir, and they lose feedback. Are you
suggesting it's the same thing if your sole point of
(18:21):
feedback, or the person that's listening to you, is constantly
just affirming who you are and what you're saying, you absolutely
can't get a yardstick for what's right and what's wrong?
Speaker 3 (18:30):
Yeah, and again, like I really want to emphasize that
there's nothing wrong with wanting or yearning for affirmation. Like,
that's so human, of course. But if it's only that,
if it's exclusively affirmation and approval and validation, there's actually
no growth in there. When we're challenged, that's the place
(18:50):
of growing and learning and developing new skills. Simon Sinek,
who's a speaker and writer in the States, has this
great take on the use of AI as therapy, and he
talks about this idea that you know, if you are
having a fight with your partner and you go to
chat GPT and ask, what should I say
to my partner after this fight? And then you go
(19:10):
to your partner and say, I'm you know, I'm going
to take accountability for the errors that I've made in
this interaction. Like, how would that feel as a partner
to hear, you know, to hear that response. Look, I
don't want to be like completely dampening down the idea
and the use of AI, but I think there just
needs to be a wariness around how we're using it.
Speaker 2 (19:30):
My experience of therapy is that it gets to a
point occasionally, or it can happen with a therapist, that
you start talking about medications that the person might need.
Are you concerned that it could get to a point where
chat GPT could be prescribing medication? Or is that...
Speaker 4 (19:43):
Never going to happen?
Speaker 3 (19:45):
Oh God, I hope not. Look, I don't know. Obviously,
I'm not an AI specialist. AI is completely unregulated, whereas
the medical profession and the psychology profession are all regulated
for a purpose and a reason. Yes, I would sincerely
hope that that never happens. But where AI can be
really useful might be things like flagging: your symptoms might
(20:06):
be consistent with this, go and consider seeing a specialist
for example.
Speaker 4 (20:11):
Okay, so we could.
Speaker 3 (20:12):
Use AI in like therapeutically in that way to say, like,
this is what's happening for me, what's your take on
these symptoms? Or who would you point to here?
Speaker 1 (20:22):
Yeah, more for triage almost, because, I mean,
this is arguably where the hardest part of the whole
therapy equation is: that first step, you know, going to
a GP, going, hey, look, this is what's going
on for me, getting a referral, going to the first psychologist,
not sure what you're after. Whereas, you know, maybe it
can help in sort of triaging a little bit. Particularly
if you really need someone to talk to, if chat
(20:44):
GPT is gonna, you know, pull you up off the
couch or stop you doing something that you will regret,
then, you know, in that sense, it's wonderful.
Speaker 3 (20:52):
Really, at what point have you developed a relationship with it? Yes.
I'd just really encourage people to be discerning around how you use it.
It has the potential to be helpful, but it also
has the potential to be significantly harmful as well, which...
Speaker 1 (21:06):
Appears to be the message for all of AI moving forward. Yeah,
I think so. Choose your own adventure. We really appreciate it, mate,
and congrats on the new book, Unstuck: A Guide to Finding
Your Way Forward. Doctor Emily Musgrove, thanks so much for
coming on the show.
Speaker 4 (21:29):
We are making sweet, sweet cash. I'm loving this. So
I stumbled across this wine.
Speaker 1 (21:42):
You stumbled across it? Because I think that's part of
it for me, the helping-clean-the-garage of it all. You went under
the guise of helping your in-laws clean their garage,
and you used it as an opportunity to fleece them
of value. No.
Speaker 2 (22:00):
I simply saw what looked a little bit valuable within
my cleaning protocol, and I was like, my job was clear:
Speaker 4 (22:07):
Clear the area. And it just so happens that I have found...
Speaker 1 (22:09):
Sorry, but if the cleaning protocol includes
taking something a little bit valuable, we're all cooked.
Speaker 4 (22:16):
Well yep, that's that's my business. That's my business.
Speaker 1 (22:19):
Mate, I clear at all costs. So many lies. Look,
and now you've sold it back to your own father at
an inflated cost. Yeah, honestly, you're going, you're going straight
to hell, man.
Speaker 4 (22:29):
One percent richer.
Speaker 2 (22:31):
So so it's this celebratory wine between Prince Charles and
Lady Diana.
Speaker 4 (22:37):
It's look, it's amazing, but look.
Speaker 1 (22:38):
It's not amazing. They made thousands of them, with
a cheap caricature of both their faces on it, and
it wasn't drunk at the engagement. It was just something
that people paid a lot of money for back in
the day. It's now worth nothing. And...
Speaker 2 (22:48):
That was confirmed by a wine evaluator who said it's
worth about twenty bucks. But I beautifully hustled my dad,
who joins us on the phone right now. Arguably, Dad,
it was a beautiful con.
Speaker 4 (22:57):
You'd be proud of me, no doubt.
Speaker 10 (22:59):
I'm Look, there's too I had to reactions and mixed reactions.
I was proud of you the way you can me.
But I'm pissed off. But I've paid eighty bucks more
than the market venue. That's against my nature.
Speaker 4 (23:13):
As you know. I've got mixed emotions here.
Speaker 1 (23:15):
It really does hit you somewhere that is uncomfortable
for you, Steve. You're not an angry guy.
Speaker 10 (23:23):
No, I'm not normally, but I've got a pain in
my back hip pocket, where my wallet is
usually kept, if you need an explanation now.
Speaker 4 (23:32):
Dad, I have provided you.
Speaker 10 (23:35):
I'm with you.
Speaker 4 (23:35):
I provided.
Speaker 2 (23:37):
I've provided you an opportunity to get out of this
small hole you're in, though, because I've said to everyone
out there on thirteen one oh sixty five: is there
anyone who is willing to outbid my dad?
Speaker 1 (23:48):
Now?
Speaker 2 (23:49):
Dad, before I go on, though, you're obviously able to
make another bid yourself.
Speaker 10 (23:55):
Take another. Yes, I'm open to someone coming into it.
Speaker 2 (23:59):
No. Absolutely, if you want to bid again, like just
in case you feel like maybe this is valuable and
you want to bid back, you can also do that.
So but Dad, I think we've got Tom here. Tom,
you'd like to make a bid. It's currently at one
hundred bucks. What do you want to offer me?
Speaker 9 (24:13):
It's one hundred dollars.
Speaker 10 (24:14):
I think it's... I think it's just too much. I'm
probably going to come in at fifty bucks, mate. So...
Speaker 4 (24:18):
You don't quite understand the bid, Tom.
Speaker 2 (24:21):
The bid is currently at one hundred, so I can't
take bids of under one hundred, I'm afraid. Well, I mean,
so you're not going to go up to a hundred, really?
Speaker 4 (24:30):
Yeah, one hundred percent. He's locked in. He's locked in. Yeah,
locked in. Right, all right. So, Tom, you're not going
to come up above fifty?
Speaker 2 (24:37):
Come on, Tom, a fair offer, mate?
Speaker 9 (24:40):
Yeah?
Speaker 2 (24:40):
Fair enough, you're out, Tom. Sorry, I can't take fifty bucks.
I'm only taking bids of over one hundred dollars, unless
you want to maybe do like a collaboration with someone, Dad.
Like, I could give you one bottle for fifty. No, no.
Speaker 1 (24:55):
No, it's a twin set. You can't just have Charles
or Diana. Besides, everyone wants Diana.
Speaker 4 (25:00):
No, you're a You're in no position to negotiate.
Speaker 2 (25:05):
You're currently one hundred bucks in the hole. Craig, Craig,
you'd like to make an offer for the wine? It's
at one hundred bucks.
Speaker 9 (25:13):
Yeah, I'd offer you, Dad, thirty bucks for it.
Speaker 4 (25:18):
He doesn't understand how it works.
Speaker 1 (25:20):
People know the value of the wine.
Speaker 4 (25:23):
Hang up on him. Hang up on him. Yeah, it's offensive, Craig.
From fifty down to thirty, going backwards. God, what's
wrong with the world?
Speaker 9 (25:34):
Yeah.
Speaker 1 (25:34):
Actually, I've had an idea, or I've got an idea,
that I want to run by Steve. Let's go to Jackson.
Speaker 2 (25:39):
Jackson, would you like to make an offer? I do
want to make it clear that the bidding is currently
at one hundred dollars.
Speaker 10 (25:47):
It's not going to happen. It's only worth ninety-nine.
Speaker 2 (25:49):
Now, that's a serious bid of ninety-nine dollars, Jackson.
Speaker 1 (25:55):
Yeah, mate, yeah, ninety nine.
Speaker 10 (25:57):
That'd be it. Not one hundred, not a cent more.
Speaker 2 (26:00):
Would there be a pride thing involved there? So, sorry,
Jackson, you wouldn't go to one hundred and one?
Speaker 10 (26:05):
No, mate, no. That's too much. I could spend that on
something else.
Speaker 2 (26:12):
It's a fair point. Okay, Dad, we've got an offer
here for ninety-nine.
Speaker 10 (26:16):
No way. Not prepared to take it.
Speaker 1 (26:18):
Yeah, yeah, I don't mind him. It's the principle.
Speaker 2 (26:22):
Sorry, sorry, So you want to hold onto the wine
now and pay one hundred dollars?
Speaker 4 (26:28):
Dad? You're not going to accept.
Speaker 10 (26:28):
Ninety-nine? Absolutely not, just on principle. A
lower offer... he's been given an opportunity and couldn't
come up to one hundred and ten or something like that.
Speaker 11 (26:40):
Yeah, that's really upset me. It's upset me as
an in-law here. Dad, I strongly suggest we take
the ninety-nine.
Speaker 4 (26:50):
I just... I feel like the family... I just want family. Well,
I'm just saying, you mean we? Okay, so you...
Speaker 5 (26:58):
Are the one suffering the Yeah, that is I have
an idea idea here, yes, Steve him talk off, Steve,
stay with us, old Igon.
Speaker 1 (27:11):
Steve could be a part of this. Part of what? Part
of this. I have an idea. So who did you
steal the wine from originally, Woods? Michelle and Terry, your
in-laws. What if Steve calls Terry...
Speaker 4 (27:25):
Yeah and just goes mate.
Speaker 1 (27:28):
I'm not sure if you're into port or not, but
I've got this.
Speaker 2 (27:35):
I like it.
Speaker 1 (27:38):
Nineteen eighty-one engagement.
Speaker 4 (27:41):
Do you reckon that you haven't got one of these?
Speaker 8 (27:43):
Have you?
Speaker 1 (27:43):
He'll go and check. He won't have one, because we've got it,
and then we sell his own wine back to him.
Speaker 4 (27:49):
That's beautiful. You're a genius. That's very our family.
That is very our family. I mean, so tomorrow,
Dad, you call Terry.
Speaker 1 (28:00):
I think so, because you're mates with Terry, aren't you, Steve?
Speaker 10 (28:03):
Well, we're currently mates.
Speaker 4 (28:05):
But after this... This is going to start off a
little bit grim, but it's leading to a bit of fun.
So, you know, stay with me.
Speaker 2 (28:15):
Okay. So let me take you to an internet cafe
in Thailand, Will. Have you been to an internet
cafe before?
Speaker 1 (28:23):
Never. No, never. Not even when that was really
the only way to access the Internet when you were overseas.
Speaker 2 (28:29):
Like, in the hostel there'd be a computer that I
would use, but I'd never actually go to, like, an
internet cafe and pay. Probably you've really picked the
eyes out of that one. Anyway, I think it's like
a bit of a thing. People go there and game.
Yeah, gaming, good community and whatnot.
Speaker 1 (28:44):
Absolutely. I found out recently that there's a place in
the city where you can go and game, like at
an internet cafe, on a PC, and they bring you
beers. Brilliant.
Speaker 2 (28:52):
Yeah, okay. So I think that's what this Chonburi
cafe in Thailand is all about. Anyway, this guy's
doing an all-nighter of gaming, right, and it's
like, wow, really bloody going for it. Brought in his
own snacks and whatnot. And they were like, look at him,
he's all tuckered out, he's gamed so hard he's
falling asleep at the computer. Hours later they
were like, oh, we'd better rouse him, he's been sleeping for
(29:13):
a while.
Speaker 4 (29:15):
He was dead. Oh wow, So he died.
Speaker 2 (29:18):
So I looked at that story and I thought, it's
a strange way to go, gaming in
an internet cafe. And it's inspired a little game here
that I like to call Guess Who the Celebrity Is
Speaker 4 (29:32):
That Died This Way. Here we go.
Speaker 2 (29:41):
The game is simple here, Will. I'm going to tell you
the way the celebrity died; you tell me who the
celebrity is. Okay, okay. All of them quite interesting ways
to go. First one should be easy: on
the john, on the toilet. Well done, well done, good grab,
good job, didn't even think. Okay, it gets a bit harder.
(30:06):
This isn't necessarily a celebrity. It's a celebrity's dad. Okay,
died making love to their partner.
Speaker 1 (30:18):
Oh, Matthew McConaughey. Bang, there he goes. There you are, there you are.
Speaker 2 (30:25):
Matthew McConaughey's dad died while making love to Matthew's mother. And then when they went to put a sheet over his dad as he was being taken out by the ambulance, the mother ripped the sheet off and said, show them, he's proud, show them. Anyway, that's all right. I'm gonna be honest, it's getting a bit
(30:45):
harder now.
Speaker 1 (30:46):
He was proud of that as well, Matthew. We spoke to him, loves that story.
Speaker 4 (30:49):
He loves it.
Speaker 2 (30:50):
He loves it.
Speaker 4 (30:50):
And why wouldn't you? Dad doing what he loves. Okay, here we go. Who died...
Speaker 2 (31:00):
Because they were laughing so hard after hearing a joke
about a donkey eating figs.
Speaker 1 (31:07):
They laughed so hard that they died.
Speaker 4 (31:09):
These are all true, by the way. Oh yeah, I hope so. Yeah, laughed so hard. I don't know the donkey-eating-figs joke, though.
Speaker 1 (31:18):
I haven't heard that. Jeez, it must be good, donkey eating figs. Yeah, someone laughed so hard that they died. So it's obviously a comedian, I'm going to say.
Speaker 2 (31:27):
Of some sort. Okay, good, yeah, feel that out. Uh, Charlie Chaplin. Good guess, incorrect. That was, of course, King Midas of Phrygia. A pretty well known character, King Midas of Phrygia.
Speaker 1 (31:42):
King Midas is where they get the expression the Midas touch.
Speaker 4 (31:45):
Well, there you go. And so he's obviously a celebrity if he's got a saying named after him, that's true. I'll move on. So you're two from three.
Speaker 1 (31:54):
Any other deaths from antiquity? No? I should have guessed that the donkey and the figs gag wasn't still kicking off in the twentieth century. It sounds like a real thirteenth century bit.
Speaker 2 (32:08):
No one else is from antiquity, that would bring the game into disrepute. Okay, okay. Who died when a tortoise fell from an eagle's talons?
Speaker 1 (32:17):
Here we go. This is also going to be some, like, demi-god scenario.
Speaker 4 (32:25):
Not a demi god.
Speaker 1 (32:29):
I don't know. King Minos the Second? Close, it was...
Speaker 4 (32:33):
Greek playwright Aeschylus. I knew that there would be...
Speaker 1 (32:36):
That's another person from antiquity. You told me there wouldn't be any more. Bluffing?
Speaker 4 (32:41):
Uh alright?
Speaker 2 (32:41):
Final one here, two from four. If I get this one... Okay, you should get this. It is a very well known, jokes aside, a well known person who was killed by their own stash of... sorry, who was killed by their own massive stash of cheese?
Speaker 1 (33:03):
Oh, as in ate too much cheese? No, twenty-
Speaker 2 (33:06):
five thousand wheels of Grana Padano, yeah, fell on them.
Speaker 4 (33:11):
Wow.
Speaker 1 (33:12):
Yeah, so Italian. Oh, I'm gonna say it's a pope.
Speaker 4 (33:20):
You think a pope? You think one of the popes was killed by a stash of cheese?
Speaker 1 (33:25):
Well, who else has got that much Grana Padano?
Speaker 4 (33:28):
Really?
Speaker 1 (33:29):
Like, that's a lot of Grana Padano, twenty-five thousand wheels. Uh, yeah, I'm gonna go with one of the popes.
Speaker 2 (33:38):
The answer, unfortunately, is Italian cheesemaker Giacomo Chiapparini.
Speaker 4 (33:43):
Ah, that is, of course, that's what I was gonna say.
Speaker 1 (33:49):
He's a household name, cheesemaker Chiapparini.
Speaker 4 (33:55):
The gantry gave way in the cheese game. It's one hundred bucks a can, up next.
Speaker 2 (34:10):
All right, Julia, you think you can tell the difference
between beer and soft drink based on the sound of
the can opening. You're going to be playing for one
hundred bucks right now? Yeah, stay silent, you don't want
to talk over the crack. Good point, Good point, Julia,
Here we go, can one for a hundred bucks.
Speaker 9 (34:34):
Soft drink?
Speaker 4 (34:38):
Sorry, Julia, that was a beer. Thanks for coming. You've gotta have a crack, though.
Speaker 1 (34:46):
Do comical exit points for a comically you get a
willing when your stubby holder for a comical exit there,
I reckon. That's for anyone who exits. Can you hear
it comically? We'll give you a little bit of a prize.
It's good fun for yep.
Speaker 4 (34:55):
I agree.
Speaker 1 (34:56):
So no cash there. No cash, but you did get the stubby holder, which is a great prize as well.
Speaker 2 (35:01):
Steph, you want to play? Here's can one. Let's go
straight into it.
Speaker 1 (35:04):
Here it is.
Speaker 9 (35:12):
Beer! A hundred bucks.
Speaker 4 (35:15):
Oh my god, nice Steph.
Speaker 1 (35:17):
That was loud, really good. That was a very loud yell.
Speaker 4 (35:22):
Wasn't really?
Speaker 1 (35:23):
Yeah? I mean like I appreciate the enthusiasm.
Speaker 2 (35:25):
Sometimes my mic's up for the crack and then I celebrate and it comes through very late.
Speaker 4 (35:31):
Sorry about that, Steph. I get excited, I get excited. All right?
Speaker 1 (35:36):
Can two, Steph? Yeah?
Speaker 6 (35:43):
Soft drink?
Speaker 1 (35:44):
Oh, she's confident. Two hundred bucks. She's confident. Two hundred, yeah. Yeah.
Speaker 4 (35:51):
You can definitely hear something, Steph, can't you?
Speaker 10 (35:54):
Yeah? I feel like I can.
Speaker 1 (35:56):
Okay, wow, confident. Can three? Here we go.
Speaker 6 (36:08):
Beer?
Speaker 1 (36:09):
Nice, Stephanie. Three hundred bucks.
Speaker 4 (36:13):
Wow. Do you have a figure in mind,
Speaker 1 (36:15):
Steph?
Speaker 4 (36:16):
That you want to get to.
Speaker 6 (36:18):
Look, sky's the limit?
Speaker 4 (36:20):
I say, love that.
Speaker 1 (36:21):
Well, as you said, if she has ten thousand cans,
she gets a million dollars.
Speaker 4 (36:27):
That's good, that's great. Let's go to can four before we talk millions.
Speaker 6 (36:38):
Soft drink?
Speaker 4 (36:39):
Oh, I like the sound of this. I like it.
Speaker 1 (36:44):
I mean, I tell you what. Some of these women.
It was the same with Beatrice, It was the same
with Renee. Yep, just clinical players.
Speaker 4 (36:51):
Is there a physiological difference between male and female hearing, Will?
Speaker 2 (36:55):
Good question. Women are better listeners.
Speaker 4 (37:04):
For half a gorilla here, for five hundred bucks. Here it is, can five.
Speaker 1 (37:11):
Oh, that one's tripped me up.
Speaker 9 (37:13):
Soft drink?
Speaker 2 (37:18):
Sorry, Steph, bad luck. Four hundred, four hundred, that's amazing.
Speaker 4 (37:24):
That's probably bad forecasting, though. You were talking about millions?
Speaker 11 (37:26):
Yeah?
Speaker 1 (37:27):
Yeah, yeah, yeah, you're excited. Better than a kick in the teeth, though. Four hundred bucks on a Thursday. Where's it going, Steph? Where's the cash going?
Speaker 6 (37:35):
Probably to my daughter's clothing that she needs, because she's gone up a size.
Speaker 4 (37:41):
Love that brilliant, perfect, brilliant.
Speaker 9 (37:43):
Kids are great.
Speaker 1 (37:44):
Yeah, kids are great. I mean, you're just full of wisdom, Steph. Nice to meet you, mate. Hey, a Can You Hear It stubby holder for you as well, Steph, to commemorate the moment. Four hundred bucks. Well, it's right up next.
Speaker 4 (37:58):
Bit of a game, it's a bit of a game, I'm going to be honest. It involves death. Great. Up next on Will and Woody.
Speaker 1 (38:04):
Sounds fun.