Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:06):
Welcome to Fully Grown Homos with Matt and Dave, a podcast about our adventures as fully grown homos navigating today's world, full of inquisitive friends' questions about gay life and the unexplored activities of a life lived as fully grown homos.
Speaker 2 (00:21):
We will discuss the
gay 101s, sex, sexuality and
topics we don't even know yet, as we want your input on what you want to hear. Nothing is off limits, so email us on fullygrownhomospodcast@gmail.com or message any of our socials at fullygrownhomospodcast.
Well, hello there, Dave.
(00:58):
Hello, Good afternoon, goodevening, good morning.
Speaker 1 (01:00):
Whatever it might be here, it's, yeah, whatever the fuck time it is. It's actually morning here. It is actually, right, it's a Sunday morning and we're recording away, and Chanel's decided she wanted snacks from under the pet food thing. She's been so quiet and so well behaved, and all of a sudden she's just decided that we've hit record. I think she's taking notes from the girls. Yeah, she's going
(01:21):
let's do it.
Speaker 2 (01:23):
You're recording too.
Speaker 1 (01:25):
Let's go girls, yeah, she's gonna, let's do it. Yeah, well, you're on, you're recording this. Just fuck you, let's do it, let's do it, let's play up.
Let's play up.
So if we sound a bit different, we're recording on a new machine. We're recording on the Rodecaster Pro 2 and I've tried editing in all our sounds. We've created some jingles, so this might sound a bit different
this week.
We're not sure how it's goingto go, but we'll soon find out.
(01:45):
We're trying so far.
So weekly wrap.
Let me just hang on one secondand let's do this.
Oh honey, guess who's back?
(02:13):
Dave and Matt with a disco smack, serving laughs.
They spill the tea.
The wrap of the week.
Speaker 2 (02:25):
With some attitude,
please.
They spill the tea, the rap ofthe week.
With some attitude, please.
I'm growing and feeling fine.
Speaker 1 (02:35):
Snap those fingers,
sip that wine, get your fix, no
need to nap.
It's Dave and Matt with the Weekly Rap. Yeah, that's right, it's Dave and Matt's weekly wrap.
So, Dave, how's your week been?
Speaker 2 (02:49):
Busy as usual.
House is progressing really well now, so outside got the rendering done this week. Yeah, looks spectacular, yeah, yeah. So the top half got done, as in cladding-wise, a couple of weeks ago, and now the bottom half is all rendered. So, and the side, which I'm pretty happy with. They did it in a day rather than two days, which is great.
(03:10):
Yeah, they sent probably about eight guys down rather than the four that were going to initially turn up, and they just smashed through. It was great, and I'm glad they did, because the weather on the second day, the Tuesday, was a little bit wet, a little bit rainy.
Speaker 1 (03:23):
Yep, and as long as
they don't charge you any more, you don't care if there's 20 guys.
No, exactly, that's correct.
Speaker 2 (03:28):
The more the better,
absolutely.
They're all Lebanese guys, older guys, so they work really well. I mean, I don't know how their backs are faring, because I don't think I would want to do that job.
No, it looked like hard work.
Speaker 1 (03:46):
Oh hard work.
I mean I did buckets of like cement and stuff.
Speaker 2 (03:48):
I did try and tempt Dave into taking some photos of the guys, some up-shorts and stuff like that. These guys weren't sort of like the clientele I would say I'd want to do that for. Yeah, but they were nice people, apart from them dirtying my walls with their hands and stuff, which I didn't realise until after they left, that they'd put handprints on the walls, and yeah. So I was a little bit pissed off with that, but then again
(04:09):
it goes with the territory.
Speaker 1 (04:10):
I suppose it does. So my week's been busy. You've been very busy. I haven't seen much of you at all. Not busy for anything at all, but just busy because I've just had lots of work on. Lots of work on? Lots of work.
Speaker 2 (04:22):
Well, you've had a
lot of things happening in your
workplace, haven't you?
Speaker 1 (04:24):
Definitely. It's been a tricky week, but still very rewarding, but tricky. But I've been tired.
Speaker 2 (04:32):
I've been so freaking
tired. Because of the travelling every day to and from the city?
Speaker 1 (04:37):
I'm used to the
travelling.
I know.
Speaker 2 (04:38):
But what I'm saying
is, what would be a normal hour journey sometimes turns into two hours, depending on what train you get, and et cetera, et cetera.
I just think.
Speaker 1 (04:48):
I think, I hate cold, I hate this winter shit. Shouldn't it be over already? And I know all of Sydney's complaining about the cold.
Speaker 2 (04:55):
All you have to do is
look at social media and find out that it's zero degrees and one degree. This is quite unusual, because the last couple of years we've had quite mild winters, and I think it's a bit of a shock for us again, because we have had this cold weather before. Yeah, and you know, we've been so lucky in the past couple of years where it's been mild and we haven't really had this temperature drop, but it is very, very cold. It really has been bitterly cold. It's been fucking freezing, and
(05:16):
it doesn't help with the prices of all the, uh, you know, the energy, um, things going up, which restricts you from putting your heaters on, because you know, you know you're going to end up with a massive bill, like I had last month. Yeah, yours? Mine went up like 460 per cent just by running the air conditioning unit for five hours a day, yeah. And suddenly it went from like 80 a month to 465 and I thought,
(05:37):
fuck that, I'm not doing that, I'd rather just fucking stay cold and wrap myself in a blanket. Yeah, and the other thing we did do, which was fun, really fun.
Speaker 1 (05:44):
we sat here and
created a whole heap of jingles
for different segments that we're going to use, some of them today, some of them in the future. Yeah, we sat here with the Rodecaster Pro 2 and my MacBook and created all these different jingles with the help of today's main topic, which is AI, and I'm just realising now that we didn't do one to say this is the main
(06:08):
event.
We need to do a main event topic.
Speaker 2 (06:10):
We need to do a main
event sting.
Dave, oh God, you know we're going to end up having the whole podcast being jingles.
Speaker 1 (06:16):
Yeah, that's cute.
Speaker 2 (06:18):
That's cute.
Speaker 1 (06:18):
So yeah, main event,
we need a main event jingle.
We'll put one here next time.
Speaker 2 (06:22):
The reason why we
came up with this topic today is
because over the last couple of weeks, we've been obviously, well, not couple of weeks… Over the last couple of weeks we've been really hammering, you know, the music, the music side of things, and we've come to the realisation that we love this so much that we're going to make it into a bigger event for ourselves.
it into a bigger event forourselves.
Speaker 1 (06:41):
Well see, I love AI
and I love I'll say most of what
it does.
Right, I mean, look, and we'll delve into the…. Advantages and disadvantages. Yeah, some advantages and disadvantages of the things that we find and things like that, and some things that I guess that the internet finds, and we've even asked AI itself what
(07:03):
are the disadvantages and advantages?
Speaker 2 (07:05):
of using AI, which
we'll run through in a bit.
Speaker 1 (07:08):
But, like, I love it and I've been utilising it, even just, when we do the podcast, I throw it through an AI filter just to clean it a bit so that it doesn't sound amateur-y, and I know we sound amateur-y because we are.
Speaker 2 (07:24):
I think it's become
ingrained into our daily lives
now just like a mobile phone.
Years and years and years ago, you know, we had mobile phones and suddenly everybody had them, and then it became your daily use and people can't live without them now, and I think AI is going that way as well. Yeah, I think we're so reliant on the, the actual technology that it's providing and issuing us, because, I mean, talking about what we were talking about today, everything on my Facebook now
(07:47):
is coming up with AI, AI, AI, AI stuff. Yeah, so again, your phone's listening to you.
Speaker 1 (07:52):
I mean you know my
phone's been listening to you
for a while now, yeah, yeah, exactly.
Speaker 2 (07:56):
So what I'm saying is, I mean, do you sort of, like, appreciate the AI or do you just think, fuck that, I can't deal with it? Well, get on board is what I would say, and look, from every context.
Speaker 1 (08:07):
Now, quite a while
ago now, one of my team, um, sent me an AI-generated resignation. Um, they're a phenomenal team member. They're sort of moving on to their own chosen career path. They were studying and they sent me this, um, this AI-generated message or resignation, but they'd forgotten to take
(08:28):
out, like, and write dear such and such. They'd just wrote dear, and in brackets was employer, right. Um, and they said, yeah, yeah, pretty much insert your name here, where your name gets inserted and stuff like that.
Um, so we had a bit of a chuckle. I said, best resignation letter I've ever received, because AI
(08:49):
writes really, really well.
And so we'd sent him a message, a separate message, and said, like, you know, great job, love that ChatGPT was able to write your resignation for you. And with a chuckle, thinking he would be horrified, and no, he laughed and he's gone, well, the least you could do is write me a heartfelt reply with
(09:11):
ChatGPT.
So I did, I got on ChatGPT and I wrote him a heartfelt reply, and then I actually left in there lots of love and I inserted ChatGPT at the bottom of it. So it was actually really quite funny and quite comical, but the whole, like, it was so smart.
Yeah, like I say, recently, I'll say a while ago, I picked
(09:35):
up my resume and I just literally threw it into ChatGPT and said, please recreate and make it look more professional and friendly and warm. Right, and it did that. And, holy Christ, I'm looking and I'm going, I'm going to hire this guy. I don't need him, but I'm going to hire him, because it turned out so good. What's the one that the teachers use?
Speaker 2 (09:58):
Every business has
their own type of thing, yeah, but there's one that the teachers use a lot, isn't there, for writing reports and stuff. I mean, because Cleo was telling us about it.
Speaker 1 (10:06):
Yeah, I can't
remember what it was.
Not sure, is it?
Speaker 2 (10:07):
telling port point?
I'm not sure.
Speaker 1 (10:09):
I can't remember
Because I know that the business
I work with has Gemini, or Google has Gemini, or something
like that.
Chrome has Gemini.
I don't know who it is.
Speaker 2 (10:26):
But that's used at work as such, right, because ChatGPT is like open source, and so where, where do we draw the line, though? In terms of, like, we don't… I know there's a thing, everything seems to be going that way now.
Speaker 1 (10:31):
Well, look, deepfakes. I'm not a fan of deepfakes. However, I don't want to create somebody else's persona. All right, I don't. I don't want to, say, pick up a Zac Efron and put him into a situation where he hasn't actually truly been, right. Like I'd love to put him in a situation, fantasy-wise. Put him in a situation like my bed with me or in the sling.
(10:52):
I think he'd top.
Speaker 2 (10:55):
No, I don't, you don't. You think he'd be a bottom? I think he'd be a verse bottom, I think. A verse bottom? Okay.
Speaker 1 (11:01):
Okay, maybe.
Well, let's put him into chatty.
No, I am against doing that with porn, like making porn out
of.
Speaker 2 (11:08):
No, like I can make
porn in my mind, let other
people do it and then we can just get off on that. Yeah, well, yeah, because then we're not corrupting ourselves in terms of, like, making it, we're just enjoying it.
Speaker 1 (11:16):
I'm also not that
advanced yet.
So I'm learning, and I'm learning really quickly.
Speaker 2 (11:21):
I know your advice
We've currently.
Speaker 1 (11:24):
What I am advanced at
is cock sucking and a few other
things and a few other things.
Speaker 2 (11:31):
I can take a dick
pretty well, it makes a good
lasagna.
Speaker 1 (11:33):
Yeah, yeah, it makes a good lasagna. Carrots in it. I don't make any lasagna, I don't fucking cook. Everyone knows that. So, like, what AI does really well from my perspective, right, and we've been playing with Suno, which is a song generator.
Speaker 2 (11:51):
Yep.
Speaker 1 (11:53):
We sometimes write all the lyrics, sometimes write some of the lyrics, sometimes get ChatGPT to write some lyrics based on our theme.
Speaker 2 (12:00):
Inputs yeah, yeah.
Speaker 1 (12:01):
And then we can edit
a little bit and then put it in
there.
So it's always got some human touch to it. So I don't think it's at the stage where it doesn't need any human interaction at all. Right, and it's getting there real quick, because some things
are just so good.
Speaker 2 (12:18):
But the good thing is
you can change a song just like
that, just by slightly changing your input, yeah, or changing a lyric or changing a title of a song, and AI will completely change, or Suno will definitely change that for you. Yeah, I've done a few takes on certain songs that I liked and I've gone back in there, reworked them, um, but you've
(12:40):
got your laptop version or desktop version, which gives you even more access, which I haven't got at the moment.
Speaker 1 (12:44):
Yeah, but it's, oh, for Suno. Yeah, yeah, yeah, I do, but, like, for AI, my biggest, and I want to put it in, I think that was going into my pet peeves, I don't know. Well, the good thing is… The thing that's bugging me at the moment, and my brain can't work out how to fix this, is that I pay for the version of ChatGPT. I only pay, like, for the smaller one.
Speaker 2 (13:07):
I'm not doing like
$300 a month or anything like that. I don't need all that.
Speaker 1 (13:10):
I'm not using it for
everything, right, I pay $30, but I can pretty much do unlimited stuff with it, right, then? But it won't let me log in to my laptop and to my phone. I'm just remembering, it's…
Speaker 2 (13:24):
Copilot is the other
one.
Oh, Copilot, that's the one.
Speaker 1 (13:27):
Sorry, that's the one
, yeah, sorry. No, that's okay. It still does my head in. So I'm going to have to ask ChatGPT how to do it, because it's just, like, it still does my head in, because you can ask it anything. So I've literally just asked it, what are the advantages of AI and disadvantages? Right, so the advantages it's come up with.
(13:49):
The first advantage is efficiency and automation, right? So it handles repetitive and dangerous tasks with high speed and accuracy. Right. Reduces human error in areas like manufacturing, logistics and data entry. Right. Oh, we see the few errors sometimes.
Yeah.
Speaker 2 (14:05):
Especially when it
generates a photograph and they've got, like, 12 fingers or three hands. You did one the other day, didn't you?
Speaker 1 (14:11):
So I did one the
other day, because we're
actually creating a band.
We are creating a band, yep,yep.
Speaker 2 (14:15):
And.
Speaker 1 (14:16):
FGH. FGH, right, is our new band, and our lead singer is MAi Blaze. And I've intentionally called her MAi because the capital letter is M and then it's the big A with a little i, so it's like AI. So it's like AI is in her actual name, right, because I thought it was really cool. And she's a sexy little fiery redhead that's just so cool.
(14:39):
Hence why her last name is Blaze, because she's fiery. She's got the three other band members? Yeah, she's got three other band members and I can't remember their names at the moment.
Speaker 2 (14:47):
She's a keyboardist
who has blue hair.
You've got the Asian.
Speaker 1 (14:50):
The Asian drummer
who's like.
She's a cool Asian chick who'sa drummer.
Speaker 2 (14:54):
And then you've got a
sexy daddy-looking guitarist with a beard, right, bass guitarist, normal guitarist, but we're still working on these, aren't we?
Speaker 1 (15:02):
Yeah, we're working
on the songs and all that kind
of stuff.
Speaker 2 (15:04):
So we've created some
images and stuff like that, but
we have got some songs actually already published, haven't we?
Speaker 1 (15:09):
Yeah, we have got
songs on iTunes and Spotify and
stuff like that.
That's currently under the Fully Grown Homos, if you type that in. But we're actually going to launch this band as FGH. There are going to be a lot of different genres of songs,
(15:30):
yeah, but fundamentally MAi will front most of them with the other lead guitarist, and I can't remember his name.
Speaker 2 (15:33):
I think it's Ethan or
something like that.
I can't remember what it was.
Speaker 1 (15:36):
I cannot remember
what it was.
Speaker 2 (15:39):
They do a lot of
collaborations.
Speaker 1 (15:40):
Yeah, but he's going
to.
His name is Ethan.
Yeah, Ethan, Ethan. Yeah, Ethan, Ethan. Yeah, that's it, and he's cool. You've got Ethan, Lena and Jamie, or JD. JD.
Speaker 2 (15:52):
Yeah, that's cool, he's a keyboard player. Yeah, he's a keyboard player.
Speaker 1 (15:55):
But yeah, they're
going to sort of Ethan and her
will front up.
Speaker 2 (16:00):
Different sort of
male vocals will require
different sort of feelings and all that kind of stuff, but yeah, but watch this space, because it's going to be interesting. Watch this space, it'll be lots of fun. Yeah, we've had some amazing songs we've made, or we created,
you know.
Speaker 1 (16:13):
I mean using the
technology, but look, MAi herself is just so cool, right. When we've created her, and we've sort of, I've created a version and then Dave's created a version, and we've gone, oh, I like that bit in there, that bit on there, and stuff like that. So we've sort of morphed her, but AI has been able to do this.
Speaker 2 (16:32):
Did you say morphed
or muffed?
Speaker 1 (16:33):
Morphed, not muffed. Yuck. But AI has actually helped us create this persona. Yeah, right.
Speaker 2 (16:46):
And it's really
steady and it's given this, really. And the good thing is, because I know what Matt's like as well. I mean, we, you know, I was texting him the other day and we were back and forth, back and forth, and I could see us going down this rabbit hole of, like, you know, we were just getting, so eventually, get sleep. Yeah, but it is fun, and if you've got like a common interest, what we can work towards is fun, because I don't have to be here with you doing it. No, no, no, correct, correct, correct.
Speaker 1 (17:06):
And it is fun, and
that's the other reason why I
want my ChatGPT to work, so that we can actually log in from the same ChatGPT.
Speaker 2 (17:16):
The only thing that
we've stalled on is, like, trying to make, like, small video clips to go with them.
Yeah, but we'll get there.
We'll get there.
Speaker 1 (17:25):
We're still learning
as we're going along.
So some of the other advantagesare 24-7 availability, right.
So obviously, ai doesn't takebreaks, doesn't take holidays,
it's not unionized.
It's great for customer serviceand things like that.
No HR to report to, no, right.
It's good at data analysis,analysis, analysis oh fuck,
that's a word.
Analysis right, it's good atdata analysis.
(17:46):
Analysis oh fuck, that's a wordAnalysis.
Sorry, see, I would pronouncethat properly.
And decision making.
So it can process massiveamounts and do it really quickly
as well.
Personalization, so it powersrecommendation engines such as
Netflix, spotify and Amazon, soit actually helps with what you
(18:09):
listen to.
Speaker 2 (18:10):
Oh yeah, it will
actually then bring up different
things.
Speaker 1 (18:12):
And, like you said
earlier, when you're listening
to, when you're watchingsomething or speaking about
something, even it all comes upon your phone and we all know
that that happens as well, yep,so I'm not going to go through
the whole list.
Speaker 2 (18:30):
But the disadvantages
, Matt, are you lose your right
to control, I suppose.
Speaker 1 (18:36):
Yeah, well, one of
them here says job displacement,
so it says automation threatensjobs in sectors like transport,
retail and manufacturing.
Speaker 2 (18:43):
Now.
Speaker 1 (18:44):
I don't know.
You're always going to need people,
Well, you need programmers aswell.
Well, you might not.
That's the thing, right,because AI is clever and it
learns, and it learns and itlearns and it learns and it's
self-generating and it'scontinually learning, so you may
(19:06):
not need programmers in thefuture.
Yeah, when it gets smart enoughto learn everything itself, it
might be just.
Yeah, you're the chief AI.
Speaker 2 (19:17):
and you're off.
Well, I honestly think in thenext 20 years' time.
You know the whole country, Ithink 10.
If you look back, we're in 2025.
Speaker 1 (19:26):
Yeah, 20 years ago we
were just getting mobile phones
.
What are we at 2025?
So when was it 1995?
Speaker 2 (19:37):
Well, the old, the
original ones Like the Nokia
3310s and all that kind of stuff.
The earliest ones were probablyabout the late 80s.
They came out.
They were like bricks.
Speaker 1 (19:43):
Yeah.
Speaker 2 (19:44):
I mean you're talking
like the late 80s.
But technology has progressedto the point where most people
had a phone by the late 90s, Iwould say you know, or
thereabouts yeah.
Speaker 1 (19:57):
So it's going to be
next level.
Some of the other disadvantagesare bias and discrimination, so
it can reflect and amplifysocial biases if trained on
flawed data.
So if you're telling it that something, or if everyone that's putting inputs in is basically saying that a certain thing is bad, like if I'm asking for people every single time,
(20:20):
right to generate pictures ofpeople and it's generating a
certain race or culture orsomething like that, and I'm
giving it a thumbs down andeveryone's giving it the same
thumbs down, it will then stopcreating that race.
Speaker 2 (20:32):
Well, talking of
which I mean, we've had this
situation with a friend of ours who's had his identity hacked. Oh, yeah, yeah, on Facebook, yeah. And again, I mean, people taking
other people's photos, utilizingthem as their own.
Catfishing people, as we know,happens all the time.
Yeah right, changing the bodyshapes so this is definitely one
of the negatives, the negative.
(20:52):
To me, the biggest negativewould be how far do they go with
that?
I think can they use it for,like, um, a criminal act?
Yeah, can they make it lookthat you are actually?
Speaker 1 (21:03):
doing something?
Fake ID.
Can they open bank accounts?
I'm sure it's happened.
Speaker 2 (21:07):
I'm sure people have
challenged these things where people have actually made an AI video and it shows that person who you think is that person, but they're nowhere near that, that scene of the crime or whatever. Oh, yeah, yeah, so you're saying, yeah, so they actually somehow embedded that person into a crime scene?
that person into a crime scene?
Speaker 1 (21:26):
yeah, absolutely, and
and again.
The possibilities are endless.
So this is, it is a scary thing.
Speaker 2 (21:33):
I mean I'm all for AI
, definitely.
I mean the enjoyment I've gotfrom the perspective I use it,
for I like the fun stuffdefinitely.
But the sinister side and thenegative side and the deep dark
web that's out there, thatpeople take your own personal
identities.
How do I get on the dark web?
I've tried, I don't know.
I think there's a linksomewhere.
Ask AI, he'll tell you. Oh, I can do that.
Speaker 1 (21:53):
You could.
Speaker 2 (21:54):
You'd probably end up
losing your credit cards and
everything else.
No, no, good luck.
Well, could you imagine finding things about yourself that you'd never done on there?
Oh, that'd be cool.
Or like murdering someone oh,you say never done.
We talked about this on the Mr.
Yeah, we did a couple of weeksago.
Yeah, I can confirm that Matt is not a murderer yet. You don't
(22:14):
know everything about me.
You don't know me.
He says that with a little grinon his face.
I've murdered some holes youhave indeed, I've punished quite
a few, yeah, you definitely.
Speaker 1 (22:27):
And a few tonsils,
yes, indeedy, so, yeah, so the
lack of human judgment, becauseit still doesn't have that human
empathy or that kind of stuffagain, yeah, it's learning but
you couldn't really put it in acourthouse situation.
Speaker 2 (22:43):
Well, you've only got
to see the, the, the artificial
robots now these days, the onesthat have, like, facial
expressions that are sorealistic that you know what
you're going to do here?
Speaker 1 (22:51):
no, what do you mean,
dave?
Out of the future robots, whatdo you that you know?
What are you going to do here?
No, what do you mean, dave?
I don't think it's a few robots, what do you mean?
Well, do you know what I mean?
I'm talking in the robot voicenow.
So that's literally just asound effect that Dave has that
I've sort of set up for Dave.
Now, he's actually, that was just a voice disguise. So if you want to come on our podcast but don't want to be known, we can actually, yeah, I can disguise you, yeah, at any time
(23:12):
, or if you want.
I'm a monster when I do thatwhich is kind of fun.
It's pretty cool I mean,there's some club, you know but
hey, you know you're playingaround now.
It's like fucking around withthose.
Speaker 2 (23:28):
Yeah, he's like a
little toy he's's like a kid in
a candy shop.
Speaker 1 (23:32):
Yeah.
Speaker 2 (23:32):
All the buttons are
pretty, aren't they?
The Rodecaster.
Speaker 1 (23:36):
Pro 2 is sexy. I love it. Right, and it integrates a lot of AI and stuff like that.
It's learning lots of stuff.
Speaker 2 (23:43):
I don't know where
society's going to be in the
next 10 years.
Like I said for me, how far doyou allow AI to run things to
you know?
Are you going to allow them torun a country?
Are you going to let them run a bank, I mean, are you?
I mean they probably do abetter job.
Speaker 1 (23:58):
Well, they probably
would this is the thing is when
you actually look at theefficiencies of it right now I,
I can if.
If now I'm not, I'm fairly techsavvy and computer savvy and
stuff like that, but I wouldn'tknow how to write certain
formulas and stuff like that.
Speaker 2 (24:15):
Well, you'd learn
that It'll teach you.
Speaker 1 (24:17):
I could take a long
time and go oh, I know this
person knows this.
I can pick up the phone or Ican send them an email.
I can do this, I can do this.
Or I can just ask ChatGPT. I can screenshot what I actually want to happen. I can say, can you work out how to do this, right?
(24:38):
And it'll give me the formula that I can just copy and paste into my Excel spreadsheet and it's done.
So we're talking seconds, right?
Seconds it's gonna literallytake, as opposed to hours.
Um, and it's just crazy that itwill actually do that for you,
like, like, it will do that foryou straight away and you sit
there and go like this is insane.
So people will lose their jobsright.
Speaker 2 (24:57):
Do you reckon that
teachers will lose their jobs in
the future?
Do you think kids will be allsitting at a phone or a computer
desk and then learningeverything they need to know,
because you could learn thewhole syllabus of a whole school
?
Speaker 1 (25:08):
I would like to think
not, because I think a lot of learning.
Now, I guess history has shownthat learning happens best when
you're actually customising it.
So our friend Cleo, who's likea phenomenal teacher, I'm sure
she just doesn't stand up thefront of the class and say this
(25:30):
is this, that's that, that'sthat, that's that.
I'm sure she doesn't.
I'm sure that she sits atstudent A and says hey, student
A, I think this is the best wayto approach you, student B, this
is the best way to approach you, so on and so forth.
And we need to get her onactually to discuss lots of
stuff.
But I think that would be quiteinteresting to find out.
(25:51):
But AI couldn't do that.
It couldn't.
I don't believe it could caterto everyone's learning style.
It could learn your learningstyle, I guess, and how you best
learn.
But I think, human element,you'd need to have teachers,
you'd need to have teachers.
There's certain roles you'dneed to have, yeah, so yeah, but
(26:15):
I think, yeah, I don't think itcould replace teachers.
Speaker 2 (26:17):
No, no, no, I look.
I mean I don't know.
I mean you're talking like 10,15 years probably after we've oh
we'll be dead, after we've leftthis past life and come back
again in the future.
Well, I've got no more left youthat's only if you choose not
to remember we're just makingreference to the last podcast.
But that was all good again.
We did all that by AI, didn't?
Speaker 1 (26:38):
we.
Speaker 2 (26:38):
We did a lot of that
via AI as well.
I think, to be honest with you,if you look at your daily
activities, I think you use AI alot without knowing you're
using it, yeah, because we'reasking questions or we're
searching for things.
I mean, there's a lot of fakenews that comes up, as we know.
Yeah, and Donald Trump's famousfor saying that Fake news, fake
news, fake news.
But again, how much of it isand how much of it isn't?
I mean, that's the deliberation.
Speaker 1 (27:03):
Do we follow that as
truth or do we follow it as,
like you said?
Trump. And I phased out, sorry. Dependence and loss of skills.
So if we really do becomedependent on it, we'll lose all
our skills.
Speaker 2 (27:15):
Well, I don't think
we will, we will.
Speaker 1 (27:17):
I don't think we will
Because if I can, I'm a lazy
fucker, right?
So if I know that I can, all ofa sudden don't need to know how
to do this calculation, forexample, right, and all I've got
to do is ask ChatGPT every time, right, I don't need to
retain that memory, yeah, butyou've got to understand that I
(27:38):
don't need to retain that memory, but you have to, yeah.
Speaker 2 (27:39):
But what I'm saying
is you've got to know that the
result that's given is real.
Yeah, so you still have to knowbasics.
I mean, it's like you asking amathematical equation you would
know what the approximate numberwould be.
Speaker 1 (27:52):
But if it looks so
different, someone else might
Well.
Speaker 2 (27:54):
if you said to me
what's 17 times 21 plus 16 times
21, or whatever I mean, in yourmind.
You could write it down and dowhatever, right, it would give
me the answer.
But, if it came up and it saysthree triangles, four bricks and
five things.
You know that's wrong.
Speaker 1 (28:07):
Well, I don't, but
somebody or some smarter person
would.
Well, that's what I'm saying.
An example, you know what Imean?
Speaker 2 (28:13):
Yeah, so you know,
you will still have your.
You know, I think, you stillutilise your brain power.
Speaker 1 (28:19):
Yeah, yeah,
definitely.
Physically you might not needto, but Definitely.
So AI, are we done with thatpart?
Speaker 2 (28:25):
I think it's fun.
Speaker 1 (28:26):
Yeah, I think we're
done.
We could talk about it all day.
Speaker 2 (28:28):
Yeah, we do love it,
but we did say we want to keep
things simple today.
Speaker 1 (28:31):
Keep it nice and
short and sharp today.
Yep, um, but don't forget, um, on the AI front, to check out our band FGH. Yeah, definitely, um, they'll be available on Spotify and iTunes and all those other streaming platforms under the Fully Grown Homos, um, podcast, or Fully, Fully Grown Homos Podcast, Fully Grown Homos on iTunes and all that kind of stuff, or you can
(28:53):
find it on Instagram, but it will be changed into FGH.
Speaker 2 (28:55):
FGH Well our podcast
won't.
And there's another fact aboutthat we only noticed, or I
noticed today FGH on thekeyboard is the three middle
letters and they go one afterthe other.
They're next to each other.
It's so bizarre.
Speaker 1 (29:06):
It's so strange, yeah
, so right now we're going to go on to our next segment, which is going to be called… Hang on, here we go. Dave and
Speaker 2 (29:19):
Matt are back again
With tales that'll twist your
brain Aliens dancing instilettos, grandma skydiving
with their ghettos.
Speaker 1 (29:26):
It's bizarre, it's
wild, it's queer delight.
You'll laugh so hard you'll peeall night.
So buckle up, get ready.
No worries, it's Bizarre Factsand Funny Stories.
Speaker 2 (29:35):
Woo Conadron fully
grown homos.
Speaker 1 (29:40):
That's right.
This is our Bizarre Facts andFunny Stories.
Yeah, so Bizarre Facts, Dave.
Speaker 2 (29:46):
I've got a Bizarre
Fact, which I spoke to you about
today, didn't I?
Yeah, and it's strange.
Speaker 1 (29:50):
It is very strange,
but again, it's interesting.
What's even stranger is how youcame across it and what entered
your brain.
Speaker 2 (29:56):
I don't know, maybe
it just came up my feeds and I
was like intrigued to have alook and I thought, okay, let's
go down this rabbit hole okay,but anyway.
So, yes, I went online and I looked on this, uh, website called Unbox Factory, and it's pretty cool. I mean, I'm just looking through different things and it's got here: researchers discovered that the proteins in camel tears contain protein, antibodies, antibodies, sorry, capable of neutralizing
(30:19):
multiple types of snake venom.
A single drop of these tears has shown the ability to block toxins from over two dozen types of venomous snakes, including cobras and vipers. Camel antibodies are smaller and more stable than human ones, making them ideal for use in extreme environments. This discovery could lead to more effective and universal
(30:41):
anti-venom treatments, especially in remote and undeveloped regions where snake bites and fatalities are common.
So there you go.
So, so, camel tears. Who thought about using camel tears?
Who tested it?
Well, I don't know, but whatI'm saying is I mean, how
fucking clever is that?
It's very clever, there's a lotof scientists out there that
choose whatever they might doand then just research into that
(31:04):
.
So they're fucking amazing people out there. They are. But why would you go to a camel and think, okay, I'm going to extract the camel protein from the tears?
Speaker 1 (31:11):
Well, I'm going to
extract the camel protein from
the tears, well, I guess.
And I'm going to use it forvenom?
Well, they test it.
And they test the antibodies,yeah, but what?
Speaker 2 (31:17):
I mean, is I mean
what correlation from?
A camel's tear to snake venom.
Did they go from Unlesssomebody's seen?
Do you reckon they asked AI?
I'm joking.
Speaker 1 (31:26):
No, maybe. Unless they've seen a camel getting bitten by a snake and it didn't die, maybe. Oh, it could very well be.
Speaker 2 (31:36):
I mean, that could
very well be the answer.
It probably is, and theythought well, what?
Part is stopping the camel fromdying.
Speaker 1 (31:45):
Yeah, and then they
discovered that cutting and
making it cry Like how do youmake a camel?
Speaker 2 (31:50):
cry.
Well, they didn't actually cry.
I mean, you've got dust andstuff.
They have got dust on theirheads.
They would have had tears.
What are you trying to pervertme now?
You?
Speaker 1 (31:58):
can sit there and you
go.
Oh, I know you could haveplayed the camel Hachiko, that
dog story, with Richard Gere.
Speaker 2 (32:06):
I cry every time you
could smack it in the balls.
It One hump less.
Yeah, you can take its hump.
Speaker 1 (32:14):
Yeah, I don't know, I
don't know how you would make a
camel.
Speaker 2 (32:17):
So have you got
anything on yours?
Speaker 1 (32:18):
I do so.
I had one.
That it's strange because Ididn't even think about it.
Speaker 2 (32:24):
It's a story or a
fact.
Speaker 1 (32:25):
It's a strange fact,
okay.
Speaker 2 (32:27):
Yeah, strange fact.
Speaker 1 (32:28):
Yeah, yeah, yeah yeah
, dogs can eat mandarins. Right. Well, right, well, it's not a strange… It is strange. Why? Because I didn't know it, right. So what did you do, did you? Ask AI? I asked Google, right, and Google said yes, in, in very small quantities, because of the high sugars and stuff like that, um, basically. But even better, even better, um, yeah, the
(32:55):
high sugars and stuff like that, basically, yeah, yeah, as I said, yeah. So what?
Speaker 2 (33:01):
it's just started
typing your narrative.
Oh, my phone just started forsome reason.
It just started listening to meand it started typing
everything in my phone only Davewould listen to me.
Speaker 1 (33:11):
I know so, but I have
another mind-blowing fact right
.
This one is strange as fuck Nowagain.
Strange to you, or strange toeverybody?
Strange to me.
Speaker 2 (33:23):
So it could be.
Speaker 1 (33:24):
Did you know that
your stomach gets a new lining
every three to four days to prevent it from digesting itself? No, there you go, there you go. I did not know that. So what made you look at that? I just typed it in again, ChatGPT. I wanted some bizarre and fascinating facts, right, and that's what it gave me. There you go, right. So I'm not going to profess that I got this information or
(33:46):
knew this information.
Speaker 2 (33:47):
Yeah but it's
interesting though.
Speaker 1 (33:48):
Everyone knows that
I'm not that smart.
I mean?
Speaker 2 (33:51):
would you ever think
that you or your stomach?
Speaker 1 (33:53):
No, so maybe.
So where does that lining comeout?
Speaker 2 (33:57):
Does it?
Shit out or I should imagine itgets dissolved, doesn't it?
I'm going to have to dig deeperinto this.
You need to get that and comeback to us on that one, I think.
Speaker 1 (34:04):
Yeah, because it's
really bizarre.
Three to four days you get astomach.
Speaker 2 (34:08):
it's a new lining
Well it's like your skin that
comes off, doesn't?
Speaker 1 (34:10):
it.
Speaker 2 (34:17):
So what happens to
your skin away, doesn't it comes
off like it does, yeah, yeah,so is it like that?
Speaker 1 (34:19):
where you're just
getting like a new layer
epidermis inside your stomach.
I don't know, I don't know.
Interesting facts who?
Speaker 2 (34:22):
discovered that,
though?
Who discovered that?
Oh well, ChatGPT did, that's for sure, honestly. Well, I've got
another one.
Another interesting story.
Here's another one from unboxfactory.
It's got.
China has made history by launching the world's first AI-operated hospital, staffed by 14 AI doctors and four AI nurses. The, um, the facility can automatically, uh, autonomously,
(34:43):
sorry.
Treat up to 3,000 patients a day. The AI agents use real-time data, medical imaging and patient history to make precise diagnoses and even prescribe
medication as well.
Wow, wow.
Powered by China's top medicalAI platform, the hospital
drastically reduces the patientwait time and enhances early
(35:04):
detection rates for commondiseases.
The system also is constantlylearning and making it smarter
each time it interacts withpeople, so, very much, like you
said, it's learning as it goesalong, and this is why this is
where I think it is useful, likeif you go to any of our trust
and I would I probably wouldmind you.
(35:25):
They've probably got moreknowledge than what we have like
the thing is that, like I said,it's constantly learning.
Speaker 1 (35:30):
It's drawing from
sources or would you let a robot
?
Operate on you.
Speaker 2 (35:34):
I'd let a robot probe
me you come back with two
enhanced double G breasts yes,indeed, that's all right and the
penis the size of your nose.
Speaker 1 (35:44):
Oh, it depends on how
big your nose is.
I guess Exactly.
Speaker 2 (35:47):
But look, pinocchio
would be all right.
Who Pinocchio Keeps crying?
I want to be Pinocchio.
Speaker 1 (35:52):
Hang on, wait there,
wait, wait, wait, wait, wait,
wait Say that again.
I want to be Pinocchio.
There you go, you're playingaround again, yeah.
But yeah, I think it's good Ifthat would decrease waiting
times in hospitals, right, andactually treat more people,
because early detection, earlytreatment is a benefit.
Speaker 2 (36:16):
You probably find
that a lot of these surgeries
we've done they won't need to be, you won't even need to be open
.
You can go in like a carservice and you go in, you line
up, you go in and you get sortedand you come back out and
you're fixed.
Yeah, For a while at least, youknow.
Speaker 1 (36:28):
Yeah, I guess the
technology and the way that
doctors operate now with viamachine anyway, because I know
if you're having like a littlesurgery or something like that,
you can actually basically thedoctors will go in with little
pinches and stuff like that.
So it's like, yeah, it's prettycool there.
(36:50):
That's very cool, Dave, that's a really cool fact, that one. So that was that segment, and now I think we're going to go on to our final segment, which is our… Nope. They're grumpy.
They're gay.
They've got something to say.
(37:11):
From traffic queues to crooked queues, they'll bitch it all away. Dave and Matt's Pet Peeves, yeah.
Dave.
And Matt's Pet Peeves hey.
That's our weekly Pet Peeves, Dave, and I'm sure you've got a few up your sleeve this week. Oh, I've always got things up my sleeve, man. All righty, hit
me with it.
Speaker 2 (37:27):
Okay, on street
parking.
You know me and my trafficthings have a pet peeve every
week about parking or something.
Speaker 1 (37:32):
Something to do with
cars.
Speaker 2 (37:37):
Something to do with
cars, something to do with noise
?
Anyway, cranky old man, exactlyso on-street parking I hate it
when you have two cars parallelto each other and the road is so
small that you have to struggleto get between them.
Well, funny, you mention thatthat's called my street.
I know, and that's why it's apet peeve, because it pissed me
off the other day.
Speaker 1 (37:51):
It did piss you off
the other day.
Usually Dave parks half up onthe street, half up on the curb
On the nature strip.
Speaker 2 (37:58):
But then I've been
told that I can get fined for
doing that.
Speaker 1 (38:01):
Yeah, he thought he's
not going to take the chance.
Speaker 2 (38:04):
So when I drove up
the street, when we went out for
dinner last night, that's right, yeah.
Speaker 1 (38:11):
And when we drove up
the street, I went, oh, you're parked on the street in a normal fashion. And he went, yep, you'll be fined for it, apparently.
So I kind of went oh okay, hesaid yeah, it's going to be my
pet peeve tomorrow becausenobody can fucking get past,
because our street's quite, mystreet's quite narrow and there
are a lot of people, because welive near the hospital there are
(38:35):
a lot of people that parkeither side of the road, on
identical sides yeah, oppositeside sorry, I mean it restricts
the flow you know it does, butyou just have to.
Speaker 2 (38:44):
But I thought you
have to stagger it so you can
get by without.
Speaker 1 (38:47):
It's all like I do
get why it pisses you off
because, again, like I said toyou last night, when I'm driving
of a morning and I leave myhouse and there's like ice all
over my windscreen it's reallyannoying that it basically is
just sort of… you are terrified,you're going to knock
somebody's windscreen off andall that kind of thing.
Speaker 2 (39:06):
Yeah, yeah, I get why
that's annoying.
You know what else is annoying.
Speaker 1 (39:11):
Go on Coughing in
public without covering your
mouth, oh.
Speaker 2 (39:15):
I know.
Speaker 1 (39:16):
Oh, my God.
Speaker 2 (39:16):
This is why COVID
emphasised it's winter.
Speaker 1 (39:20):
I get it, everyone's not well. I get you're sick, I get, you've been fucking… but the reason you're sick is
because you've been coughed onby some other feral fucker.
Speaker 2 (39:30):
It's when you see
them at the table sitting there
and the people eating around them, and they cough, cough, cough, cough. Cough into your elbow.
Speaker 1 (39:35):
You think we would
have all learnt this over COVID.
Like into the top of the arm,like not into your hand either.
That's gross.
Speaker 2 (39:42):
And they went to all
that, that, that that effort to
make all those signs the worldhas been trained.
Speaker 1 (39:49):
Stop it.
Speaker 2 (39:50):
Cough into your
fucking the arm, your elbow
right, whatever you call itright, or just open your mouth,
just close it and keep the coughto yourself.
Speaker 1 (39:57):
Yeah, suck it in.
I don't know if you can suck ina cough and I get it, but put a
lozenge in your mouth.
I've got somebody that I knowand they just non-stop cough.
Speaker 2 (40:08):
And just don't come
out.
Speaker 1 (40:09):
if you've got a cough
, I keep saying, like you know
saying to them have you seenthat?
Yeah, it's been like that formonths now.
I said maybe you need to see adifferent doctor.
Maybe you need a lozenge inyour mouth at all times.
Maybe you need to start eatinga different diet, like some
citrus, some honey, lemon, someginger.
I don't like the citruses.
I said, well, they're reallyfucking good for you and you'll
(40:29):
see that I've got.
I can't believe I'm poweringthrough my Phoenix mandarins.
They're the best mandarins inthe world.
I love them.
They're so delicious.
If you're in Australia, go toyour local Woolies supermarket
and get the Phoenix mandarins.
They are the best.
I prefer a pear.
You're a dickhead though I likea pear.
You're a dickhead though You'renot right Fuck you All right.
Speaker 2 (41:03):
Pet peeves, people
who don't like Phoenix mandarins
.
Yeah, there you go.
Just that's all his.
Speaker 1 (41:04):
Well, I'm gonna say people leaving their food wrappers on the table after they finish their meals and just
leaving it and walk away whenthere's a bin right next to them
.
Oh yeah, like so you go to afast, yeah exactly.
Speaker 2 (41:07):
I mean, there's so
many tables, people hang around
waiting for a table and yetpeople can be so blasé, eat
their food and be just sodisrespectful and just leave
their rubbish on the table, whenit's literally there and the
bin's next to it.
Yeah, you know what I mean.
It doesn't take two seconds.
They've got to get up and walkpast it anyway.
Speaker 1 (41:23):
It's annoying as shit
.
Yeah, you're right, and I pickup my own every single time
without fail, right, yeah, andoccasionally if I see somebody
that's sort of just leaving,I'll make a big, like loud,
noise and I'll go.
Oh, so you're expectingsomebody else to pick this up,
right, but yet they'll complainwhen they're at the front of the
or where they're at the queuewaiting for the poor Macca's
(41:46):
people to fucking hurry up andserve.
But they've left shit all overthe table, so they've got to
actually go and do that insteadof actually serving.
You clean up your own mess, andI get.
The staff are there to do thatas well.
They keep it clean and hygienic, but not to clean up after you,
you know, but anyway, yeah, soI get that one.
That one's also a pet peeve ofmine, but my next one, the quiet
(42:09):
carriage, right, so I catchpublic transport every day and
the front carriage and the backcarriage in….
Speaker 2 (42:16):
I know which one you
prefer.
Oh yeah, every day.
Speaker 1 (42:18):
and the front
carriage and the back carriage
on the Blue Mountains train isactually a quiet carriage right
Now.
I get on the quiet carriagebecause I put my AirPods in and
I just chill, right.
So I got on the quiet carriagelast week on day and all I could
hear was these two girls makingall this fucking noise and I'm
(42:41):
thinking I'm on noise cancellingmode, I've got something
playing and I can still hear you.
So you're definitely not beingquiet right.
And I was literally about to say something, and this, um, older Indian woman turned around and said, quiet carriage, get up, move, right. And they looked.
Speaker 2 (43:01):
And they looked at
her and she said get up, move.
And they did so.
It was a physical person, not ayeah, Okay, no no, yeah, she
said get up move.
And they were two young.
Speaker 1 (43:10):
They had two young
girls, right, okay, and they
because they looked as if theywere going to turn around and
give her a mouthful.
But then I think a few otherpeople had turned around at the
same time because she was verystern right, and it was just
like I think they thought theywere going to give her a
mouthful.
But then when they seen thateveryone else had sort of gone
like yeah, fucking, shut upbitches.
They got up and they moved toanother carriage.
Speaker 2 (43:33):
So it was great and
there was a bit of another
applause afterwards?
Speaker 1 (43:37):
No, because it was a
quiet carriage.
We don't applaud in quietcarriages, Dave.
So yeah, that one definitely is.
I can understand that.
Speaker 2 (43:42):
I can relate to you.
Yeah, yeah, they're there for areason.
Yeah, they're there for areason.
Speaker 1 (43:46):
Some people have
worked like 14-hour days and
they just want a little bit ofquiet.
It's not there just because you want to chill and relax.
It's also there for people withum disorders that have got like
noise disorders and stuff likethat.
So, um, thank you, man.
Well, cranky old men like Dave that don't like noise. Okay, speaking of Dave, what's your next one?
Speaker 2 (44:05):
my next one is people
blocking ambulances and other
um services on the roads andthey can quite obviously see
that they're trying to get pastthem and they just don't move
out of their way or they justdon't know what the hell they're
doing.
Yeah, I mean, I watched a clipthe other day and there was one.
There was an ambulance tryingto get past and this car was
just like right in front of thecar.
The ambulance was behind it,lights flashing, sirens going,
(44:28):
and it wasn't moving, and there was, someone was like, what the fuck are they doing?
So eventually the lightschanged and the car moved
forward, but then they stoppedagain, and they just stopped for
no reason and the ambulance hadto mount the actual central
reservation and, as it did so,the car pulled off again.
So the ambulance was just stilltrying to get past this fucking
person.
Speaker 1 (44:48):
You know what I mean.
And if it was their relativethat's dying Exactly, then
they'd be really pissed.
Speaker 2 (44:53):
I'm all for having
like an emergency service lane.
I think there should be anemergency service lane, do you
know, like the hard shoulder,just for emergency service only,
so they can get past and get towhatever they need.
Speaker 1 (45:01):
Yeah, I think that
would be great.
Fuck buses, they've got theirown lane.
Speaker 2 (45:04):
I think emergency
services should have more
priority over bus lanes thananybody or whatever.
You know what I mean 100% theyshould.
Speaker 1 (45:17):
Get the government on
to one, I reckon.
Yep, yep, yeah, get Clover onto it. Um, she's busy with the bike lanes.
Speaker 2 (45:20):
She'd fucking put ambulances or ambulance officers on bikes if she could, but yeah, she probably would.
Speaker 1 (45:22):
Yeah, yep, um, all
right.
My last one is a work relatedone.
All right, so you don't talkabout work, I don't talk about
work very much at all, but thisone has been grinding my gears
for years.
Right and it's.
It's muting yourself whenyou're on a conference call, all
right, um, we have one personin our team that just doesn't
(45:43):
seem to know how, right now.
We were on a big call the otherweek, like a big call, a really
important call, with lots ofimpressive, important
stakeholders, and they just satthere and then they started
talking.
So I took myself off mute and Iwent such and such you're not
(46:04):
on mute again, right.
And then put myself back onmute and they went oh, oh, oh.
And I thought to myself howoften do you need to be told?
How often do you need to betrained?
How often do you need to betrained?
Speaker 2 (46:18):
It's just a simple
button Were they being
disrespectful.
Speaker 1 (46:20):
No, they're just a
dickhead.
Right, they're about 150.
They have been shown.
I've personally shown them.
I said it's just this buttonand they said yeah, but then I
forget when I come off mute totalk.
Speaker 2 (46:34):
I said, well, don't,
Doesn't it show a symbol on your
screen, yeah it shows it on thescreen.
Speaker 1 (46:38):
It's a little
microphone and it has a little
red extra.
And I said but I forget when Ihave to talk and then I forget I
have to take it off.
I said we will remind you thatyou're on mute.
Right is what we say to some ofthe other people when they're
talking, right, who forget.
That's normal.
But I said by being off mutethe whole time, anyone that
(46:59):
comes into the room, anyonethat's there, they can actually
hear everything you're saying,because imagine if it was like
you know like no, if it's me, Icouldn't do it.
This is why I got so good at it,because I say things that I
shouldn't say, right like I'vebeen called out for my lip
reading once because one of mybosses said um, matt, you're not
on mute, and no, mate, you'reon mute, but we can still read
(47:22):
your lips and I bet you diedbecause I kind of went oh for
fuck's sake, right, and I hadsomething that was being said
did you go red?
and I kind of went, and then Ithen came off mute, and I said,
well, to be fair.
I said, have we not spokenabout this?
Every week I says so.
(47:42):
Why does this need to berepeated again, right?
And then my boss at that pointin time said oh, I'm with you,
don't get me wrong.
I'm 100% with you, but probablyshouldn't vocalise it.
Speaker 2 (47:55):
And I said, well, I
was on mute, well, this is a bit
like that cam thing on theColdplay, yeah, the Coldplay
thing.
Speaker 1 (48:01):
Oh yeah, the Kiss cam
, yeah, you know what I mean.
Speaker 2 (48:02):
So you know it's
almost like being caught out.
Yeah, oh, I got caught, butbecause?
Speaker 1 (48:06):
I'm very expressive.
Speaker 2 (48:07):
The thing is, had
those people not reacted the way
they did, then nobody wouldhave known any difference.
It went viral.
Speaker 1 (48:13):
It was funny.
That was funny.
I picked up a pen.
I better put it down.
Speaker 2 (48:15):
Yeah, because we know
that if he starts clicking, the
pen, because that was your petpeeve last week, wasn't it?
Speaker 1 (48:20):
I think we've pretty much covered all our pet peeves.
Speaker 2 (48:22):
I think so.
I think that's pretty much thewhole segment.
Speaker 1 (48:24):
So we've pretty much
done the whole thing.
Speaker 2 (48:30):
So I've been Dave?
No, I've been Matt.
If you put AI and change yourself.
Speaker 1 (48:34):
You could actually be
me.
Speaker 2 (48:35):
I could be anything
we could do that next week I'll
be you, you be me.
No, knowing me, knowing you.
All right, okay.
Speaker 1 (48:42):
All right, I'm Matt,
I'm Dave as usual, and we hope
you've had fun, because we have, and we hope you enjoy the new podcast on the Pro 2.
Bye, bye, that's a wrap from us.
We've been your fully grown homos and we look forward to
(49:02):
opening your mind, your ears and your curiosities. Don't forget to like, comment and subscribe, and share our podcast with your curious friends. You can contact us at fullygrownhomospodcast@gmail.com or any of our socials by the same name, Fully Grown Homos Podcast.