Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:01):
You're listening to Snippets from the Summit with your host Scott McKay.
All right, gentlemen, this is your main man, Scott McKay, coming at you again with another snippet from the summit as part of the Mountain Top Podcast from X and Y Communications.
Today, I want to talk to you about something that's extremely interesting to me, and I haven't heard many other people talk about it, but it has a subtle and yet extremely
(00:29):
powerful potential effect on how women are going to see you, what they're going to think of you,
and whether they're going to continue to want to date you or not after they're attracted to you.
So as you guys can see, the Siri effect, which is what we're going to talk about today, is indeed quite significant.
Now, the first thing I should tell you about the Siri effect is it's always been there.
(00:51):
This is nothing new here.
But the anonymity of the internet, then smartphones and social media, and now AI, is making it categorically worse, or at least observably so.
At the root of what we're talking about here is how we treat people.
The uncomfortable truth is that how we treat the people we're most comfortable around is who we really are.
(01:14):
You may have heard that before because in years past, that notion has applied to immediate family members, mostly.
But here's a good question.
What can you tell about someone who treats Siri and AI with disrespect, maybe even rudely?
Well,
I know what the first pushback I'm going to hear on this notion is.
(01:37):
Look, McKay, they're not even really human.
So this doesn't count, right?
We can treat them however we want.
There's no social recourse or backlash, at least ostensibly.
Well, while I'm not going to argue that, here's the important twist.
We train others how to treat us, which is also a well-known fact, and communication is what it is,
(02:00):
wherever the reps, so to speak, are coming from, wherever you're training yourself to be better socially.
So with the advent of the anonymity of the internet and virtual assistants and now AI,could it be that we're actually training ourselves to be more comfortable with being rude
and nasty to the point that it's becoming a habit?
(02:21):
Well, why is this important?
Well, for starters, we already know women watch us as men, especially when they start liking us romantically.
So yeah, they're watching how we treat waiters, convenience store clerks, cats and dogs, et cetera.
And in doing so, they're taking careful notes as to how they honestly believe you'll treat them someday, were your relationship to take off and go long term.
(02:46):
Why?
Because we, and I'm talking about men and women here actually, instinctively know that that bit about treating people we're comfortable with is absolutely true.
So we already know women are watching us intently when they're attracted.
If we've trained ourselves to be snarky and rude and perhaps even profane to non-humans, and we've developed a habit that's starting to leak over into conversations with real, actual
(03:14):
red-blooded humans, the women are going to notice that.
And ultimately, it's not going to end well insofar as attracting women is concerned.
But what if this goes even deeper?
The other day I hired a tech guy on Fiverr and we found ourselves consulting ChatGPT on his Zoom call.
Pretty normal stuff.
The guy had been perfectly civil to me so far, even, dare I say, nice, but the first time ChatGPT fumbled a request, he started calling it dumb and stupid and stuff like that.
(03:46):
And admittedly, you've probably already figured out English wasn't his first language.
Well, almost unconsciously though, I thought to myself,
This is how he thinks about his customers too, after the Zoom calls are over.
It was like a completely different guy interacting with ChatGPT versus the guy who was interacting with me.
Now I didn't get my boxers in a wad or anything, as I already know customer service is a social dance.
(04:10):
And again, it always has been; this is nothing new. But yeah, I couldn't help but think: is all this transhumanism stuff already going off the rails?
I mean, if we normalize dehumanizing conversation, albeit in certain non-human circumstances, it's human nature for people to notice the habits we've built and to start
(04:33):
making generalizations, even if hasty ones.
And that's why we should treat even Siri and AI respectfully.
We're not training a large language model as much as we're training ourselves.
And the woman you've attracted most recently is the one watching most intently.
Want to talk about this or anything else?
(04:54):
Scott at mountaintoppodcast.com.
Be good out there.
As always, visit mountaintoppodcast.com.
Love you.