Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Suggested Articles is part of Odd Pods Media, a podcast network.
Fire up those VPNs, put on your tinfoil hats, and
boo boo.
Speaker 2 (00:08):
Boo boo boo boo boo boo boo.
Speaker 1 (00:13):
It's time for Suggested Articles, a podcast. What was that?
(00:38):
That was... S-O-S.
Speaker 2 (00:40):
Oh damn, that didn't catch though.
Speaker 1 (00:45):
I suppose no, you know, I was. I was like,
I was.
Speaker 2 (00:49):
I was thinking it was like a binary something earlier,
so I guess it was.
Speaker 1 (00:53):
But it's close.
Speaker 2 (00:54):
And then I thought about Bender, and then I thought
about Bender praying, a one-zero series, or one zero one,
one one zero one, you know.
Speaker 1 (01:01):
Yeah, that's my train of thought. And then it went
a bunch of other places. I'm like, what if he
starts throwing in a boop boop? What'll I do? Yeah,
this is my train of thought. Yeah. Look, you know,
here we are, everybody. We're recording a very silly tech
podcast about our lack of privacy in the world. But
(01:23):
you know, World War three might be happening, So yeah,
which is more important? I don't know tech podcast? Who's
to say? Is tech podcast more important? Okay? Yes, well
then I guess we better be professional about this show.
Speaker 2 (01:38):
Slash conspiracy theory slash we hate billionaires slash capitalism sucks slash.
Speaker 1 (01:48):
I don't know what else. You know, that makes
me think of something. You and I had a meeting
the other day that I think will come more to
light in our next episode. Probably. Yes, we did.
We'll probably talk a little bit more about that in
our next episode. But in that meeting, you were talking
about how we're sort of an anti-capitalist podcast, which
(02:09):
I think is fair. Just to be clear, the existence
of commerce does not require us to hate everything about
that commerce. Like I feel like people, I think I
think a lot of people get that wrong sometimes, Like
it's okay to put time and energy into something and
then sell that something for more money than you spent
(02:31):
because your time is worth something, your time and labor
are worth something. Like, that's, people call that capitalism, but
like that's not what capitalism is. That's not what we're
talking about.
Speaker 2 (02:40):
Yeah, exactly. And is this, deep down, just to,
just to, us, like, covering our asses and being like,
this is really how we felt the whole time? Things,
things are weird, very strange, things are in motion for
this podcast.
Speaker 1 (03:01):
Is that is that long to say? Well, I mean
that that could imply that it's all I don't know.
We'll see if it's a lasting thing or not. I
think that's that's the big question. Is something strange is coming.
Will it become a lasting thing? It's the question, But
we'll probably find out two weeks from when you're listening
to this.
Speaker 2 (03:22):
I'm just saying, is this is this our animal farm
moment where where we're changing the rules on the on
the wall from four legs good, two legs bad to
four legs good, two legs.
Speaker 1 (03:34):
Okay, that's that's all I'm saying.
Speaker 2 (03:39):
It's a it's a strange situation, and and the catalyst
for it is even stranger to me. So it's just
it's just you never know how one thing is going
to ripple out and affect everything else.
Speaker 1 (03:51):
But the algorithm sure is powerful. Yes, yes, yes, I
don't think there's any question about that. When it knows something,
it knows something.
Speaker 2 (04:03):
Yes, Okay, before before we before we move on to
more business, and you can cut this out, but you won't.
We had talked briefly about doing an Ask Me Anything
episode, uh huh, where you can send in questions. Listeners
(04:24):
can send in any questions, doesn't have to be tech
related. Ask me anything, sure, and we will, we will answer.
As you know, we kind of talk about everything. We uh,
there's very few things where we're like, Nope, I'm not
going to bring that up on the podcast. So you know, yeah,
we're not going to do it next week, But start
(04:44):
sending your questions to the to the email address, which is.
Speaker 1 (04:50):
Well, it's suggested articles podcast at gmail dot com. But
how can you be like you should cut this out
and then directly interact with the audience like you're just
setting me up to look bad. But I think the
audience gets it. I think they know that they know
that I'm doing it right.
Speaker 2 (05:11):
Where I'm like, you better cut this out just so
they listen more closely. They're like, this is going to
be important.
Speaker 1 (05:16):
I think they know that you don't, deep down in
your heart of hearts, you don't want anything cut out.
Speaker 2 (05:21):
If I wanted to cut anything out, I would I
would, like, wrest the editing
Speaker 1 (05:26):
duties from you. I'm sure you're like, oh no, please
don't do that. Yeah, that would be a shame, a
terrible podcast partner. Yeah. I was listening to an episode
of Chasing the Whimsy the other day starring Ben You know,
a former guest of the show, you could say friend
of the show, frequent writer into the show. And also, uh,
(05:50):
you know, you guys have a thing too. You did
the theme music for his show and all that, and
you've been a guest on his show. We like it's
one of our favorites. And he was talking about just
the way he like built up some of his presence
online and all that. And like a big thing is
posting podcasts to YouTube. Oh yeah, we could do, because
(06:13):
we do record this on zoom, right. We probably have
to change a couple things, like maybe I'd have to
find a background that isn't just my blurry bedroom, and
maybe you'd have to get your entire face on screen.
It's just the top of your head, so hideous. But
aw shit, that's what it is. But if we started,
(06:35):
if we started posting episodes to YouTube, you know what
goes away? What editing? Oh I'm not doing that shit.
Editing video takes forever, Like just the process.
Speaker 2 (06:46):
I had a thought we talked about this where instead
of doing actual video editing, I would get little little
popsicle stick puppets to represent us, and I would I
would like do a little puppet show for the clip
and then film that put it up on YouTube. But
that involves me doing a thing, and so we might
(07:08):
need it, We might need a whole like social media representative.
Speaker 1 (07:13):
Well, first of all, I definitely love and I'm on
board for the idea of Suggested Articles Puppet Theater. Secondly,
I didn't mean clips. Clips would be good. Clips would
just be basic marketing. I mean people post their whole episodes, yeah,
to YouTube.
Speaker 2 (07:28):
I've never watched a whole episode of a podcast on YouTube.
Speaker 1 (07:31):
I think it's a thing, though. Like, I haven't really either.
I like the audio, so what? I like to multitask.
But it seems like it's a big, it's a big thing.
Speaker 2 (07:39):
So what you're saying, Jeff, is that something that I,
a fifty-year-old white cis dude, does might not
be what everybody else does.
Speaker 1 (07:47):
Man, you're fifty? Jesus. I might be
hanging out with an old... yeah. I hope Jennifer's not listening.
Speaker 2 (07:57):
Oh my god, Jennifer, kick his ass.
Speaker 1 (07:59):
She doesn't listen anymore. It's fine. Anyway, just, just
something to think about as you continually mention my need
to cut stuff out. If we ever pivoted to video,
there's no cutting anything out. Fair enough. Okay, glad we
covered that. Now should we do mail bag? Wait? What
do we cut out? Now? Just remind me what that is? Nothing? Nothing.
(08:24):
I will cut nothing from this episode. Yeah, even the
awkward pauses as one of us looks at our phone
to scroll through an article. I'll just leave it all.
Speaker 2 (08:31):
In, all right, So we go into the mail bag.
Speaker 1 (08:34):
Yeah, mail bag. All right, Suggested Articles Mail Bag, a segment
I'm going to start because I didn't want to forget it,
even though this is more of a horrors of technology.
This comes as a submission from Neon Chaos. Oh why
were we talking about Ben? Oh? Oh my gosh, sorry, wowy,
(08:56):
my brain is so weird today. Okay, yeah, just your
brain is weird.
Speaker 2 (09:03):
Yeah, you know what, No, I'll say we had a
full conversation about man we talked. Was that the episode
where he was doing the panel?
Speaker 1 (09:11):
Yes, that was one of his panels at a convention recently.
Speaker 2 (09:15):
Yes, that was very insightful and reminded me of all
the things we're not doing. See if we edited the podcast,
we could like clean that all up. And then it
just seemed like.
Speaker 1 (09:25):
Well, now I'm I'm not going to edit that out
to show everybody what it would be like if we
pivoted to video. He doesn't edit, so, and it's working
for him. Yeah, okay, yeah, okay, you know what I
changed my mind. Neon Chaos did send in a horror
of technology. I'm going to save that, but I will
(09:45):
say that Neon Chaos, friend of the show, Ryan and
I think a couple other people we might have we
might have hurt some feelings when we implied that we
only had four listeners and we listed those listeners off.
Speaker 2 (09:57):
Oh man, I am sorry every time I say the number
of listeners.
Speaker 1 (10:01):
You know, I'm being yeah, self deprecating.
Speaker 2 (10:04):
It's not, yeah, but, but I apologize. I apologize, everybody,
apologies. Not doing that again.
Speaker 1 (10:15):
Should we accept his apology? We yes, you know, all
of us together. I don't know. I'll think about it.
I'll think about it. Okay. Oh, now, let's go to
the, let's go to the rest of the mail bag.
We'll come back to the one from Neon Chaos
during the next segment. But first of all, we did
hear from Ben. Excellent. Ben said... maybe I should just
(10:36):
email Ben back, but instead I'm going to read his
his email on the air. Hello, suggested articles. Sorry, I
haven't emailed in a while. That's okay. You don't have
to email in every week. There is one person that does,
but it's not you. Still looking to have you both
on soon. What does your schedule look like in July?
I guess we should get back to him.
Speaker 2 (10:56):
Yes, let's get back to him.
Speaker 1 (11:00):
This is a riveting mail bag so far. The question
wasn't good.
Speaker 2 (11:04):
It was just it's like a behind the scenes kind of.
Speaker 1 (11:06):
But the other point of his email, okay, is that
he sent in some more Oh well, he sent in
some more algorithm related article comics. No, his his whimsy doodles. Yes,
and one of them is specifically about you. I mean
you're on your turntable and everything, you know, doing the mixing.
(11:30):
Use turntables when you you do like a little wiki
wiki thing, all right, anyway, one specifically about you, But
one actually has us sitting at a table recording a podcast,
which itself fills me with whimsy. It would be so
good to do this in the same room with you. Yeah,
but he sent some cute comics that he said we
should post to the patreons, so we will put him
(11:52):
up there. One mentions Billy Corgan, Oh, dear God, Yeah,
the best.
Speaker 2 (12:00):
You really are the best of all the things that
I thought would happen in my life. I never thought
I would be the subject of a, of... why can't
I find the word?
Speaker 1 (12:11):
I can't find the words to... a comic strip?
A comic strip? Yeah, yeah, it's way better than Ziggy.
Fuck Ziggy.
Speaker 2 (12:18):
No, I'm sorry, Ziggy. It's just these, these little comics
that Ben does, even the ones that are not related
to me, and especially those, all of the ones he
does are just charming and sweet, and they.
Speaker 1 (12:32):
Are There's something about Ben. I don't know how he
does it, but he always seems like he's like, I
don't know, at peace happy. Yeah, I don't know where
that comes from. Yeah, perhaps we should ask him when
we're on his show.
Speaker 2 (12:46):
Yeah we should, we should. We should grill him.
Speaker 1 (12:48):
Let's grill him. How can we feel happy? Goddamn it.
Speaker 2 (12:52):
I don't think I've ever once heard him say fuck
that guy on his podcast, because you know, it's.
Speaker 1 (12:58):
Also a family podcast he does with as kids. But yeah,
I think he ever talks about Elon Musk or any
of that stuff.
Speaker 2 (13:06):
But he's just always got such a nice, pleasant demeanor, and.
Speaker 1 (13:13):
But he listens to this show.
Speaker 2 (13:15):
I know he does.
Speaker 1 (13:17):
He likes us, and apparently we haven't brought him down.
I don't. Yeah, that doesn't make any sense. No, like
none nonsense.
Speaker 2 (13:24):
All right, No, that does make sense. It makes sense
because maybe there's a catharsis that comes from listening to
our show. Or people don't listen to us to get down.
People listen to us to get free, free from the algorithm,
free from technology, free from the bonds of capitalism. I'm
sorry about all those people I told to go to Mongolia.
We don't have the house yet. I hope you get
(13:46):
back safely.
Speaker 1 (13:47):
We'll get there, we'll get We're going to get that house. Okay.
We also heard from a friend of the show, Eric.
He sent in a quick little message for us.
Okay about a movie. I have a suggestion for an
AI movie for you and your audience. Now, this is
not a movie generated by AI. Let's be clear here. Yeah. Yeah.
(14:09):
I don't think Eric's into that either. I don't think,
at least not at this time. Knowing your love of
the algorithm, all hail the algorithm, he says, Oh, okay,
there was a feature. Well, I hate to interrupt Eric's article,
but you did ask where could we see Ben's comics?
Oh yeah, and that would be a suggested article. Damn it,
(14:30):
what is happening? That would be at patreon dot com
slash suggested articles. That's where Ben's comics will be going,
and potentially other things from the show will see how
that goes.
Speaker 2 (14:40):
A Patreon A Patreon, man, You know what, it's the
extra hour that's killing us.
Speaker 1 (14:47):
We're recording earlier than usual. It's messing everything.
Speaker 2 (14:50):
Up, and I'm actually kind of feeling extra bad because
I realize the time I'm at right here.
Speaker 1 (14:59):
Is the time you usually record, and I'm like.
Speaker 2 (15:02):
What an asshole am I? He's got to get up,
But he's got to do stuff at this time. He's
got to be funny. He's got to be insightful, mostly
mostly insightful.
Speaker 1 (15:11):
I'm never funny, always funny, Jeff, you son of a bitch.
You tell me one time I've been funny. God damn it,
right now, that is not what's going to happen here.
That's not the point of the show.
Speaker 2 (15:24):
If we ever go to video, you're going to see
his face when you're just gonna want to squeeze those
cheeks when it gets.
Speaker 1 (15:28):
Mad and just be like, fuck you, God damn it.
I keep my cheeks off screen. See.
Speaker 2 (15:39):
I kind of like the idea of the camera only
going to hear because it kind of plays into my shortness.
Speaker 1 (15:44):
You know.
Speaker 2 (15:45):
I'm kind of the shorter of the jeffs. So so
there's one Jeff.
Speaker 1 (15:48):
You can see it reach my camera and you can't
see like all of my head. That's true. At the
top of my head is cut off. It's very silly anywhere.
So Eric says, knowing your love of the algorithm, all
hail the algorithm: there was a feature made in nineteen
seventy that encompasses the bright future of AI, or a
(16:10):
one, that's his commentary, not mine, from the time of
the sixties and seventies. It's a picture called Colossus:
The Forbin Project. Oh yeah, are you familiar with this?
Speaker 2 (16:21):
Well, yeah, this is like a classic sci fi.
Speaker 1 (16:23):
Have you seen it? I don't think I have. Oh,
thank god. Okay. It is the tale of a better
time when the world accepted AI in their hearts eventually.
So I looked this up first. Let me read you
just a quick two sentence synopsis of the plot. Okay,
(16:43):
This is called Colossus: The Forbin Project, and it is
indeed from nineteen seventy before I was born. I think
before you were born, old man. But yeah, okay, Thinking
this will prevent war, the US government gives an impenetrable
supercomputer total control over launching nuclear missiles. But what the
computer does with the power is unimaginable to its creators.
(17:08):
I think that's a good way to get us, to
get us salivating without any spoilers. There's some really good
actors in this right, So there's at least one really
good actor in this movie. Susan Clark, who I know
from my childhood, is playing Ma'am in Webster, the adoptive mother.
Yeah wow, I mean once you know that, is there
(17:30):
anything else you need to know? Marion Ross is also
in this movie. Shout out to James Hong. He's near
the bottom of the cast list, but he's amazing. And
I can't believe that man is still alive. He's got
to be at least two thousand years old. But if
you don't know who James Hong is, just google him
because you will recognize him from at least ten movies
you've seen, and and his voice from even more. Yes,
(17:55):
and he does voice acting. So now there is one
problem with the command for us to watch Colossus:
The Forbin Project. Can you guess what the problem is? Right. Anywhere? What?
It's not available to watch anywhere? Yeah, pretty much. Yeah,
you can't. You can't really find it. There is a
Blu-ray on Amazon for eighteen dollars, or a DVD for ten. Okay,
(18:18):
so it is possible to obtain this movie, but it
is not on any streaming services. So you and I
will have to talk off mic how we're going to
make this happen. I mean, god forbid we spend twenty dollars.
But I'm sure we'll find a way. So thank you
Eric for the suggestion. We're going to have to make
this happen. Yes, and I guess we'll just say at
(18:40):
some point there will be a spoilercast about Colossus:
The Forbin Project.
Speaker 2 (18:45):
Okay, so this begs a question. Yes. Where do you
think the idea of AI being put in charge of
our nuclear weapons started? Because we all know Terminator, right,
we all know the, the Determinator.
Speaker 1 (19:01):
Storyline, sure, which is much...
Speaker 2 (19:06):
And, yes, which came in the eighties. Here is
something that references that same idea, but I mean, at
least in the premise.
Speaker 1 (19:18):
Well, I mean you had Star Trek started. Star Trek
started in the sixties, right, yes, I mean we've always
had science fiction, but Star Trek had a lot more
I guess accurate, maybe depictions of like computing, although maybe
I'm thinking more Star Trek the next generation. So never
(19:40):
mind. Did they talk to their computer in the
original Star Trek? I don't think... the computer, though, it was
all, yeah. That's a hard thing. Interesting, interesting question.
Speaker 2 (19:52):
I mean, that was something I've been working on
in my spare time, and, I'm sorry, that
will be in whatever spare time I have, and hopefully I
will be able to say something about that later.
But soon. I mean, of course it will be later,
by definition. I mean, I will, later.
Speaker 1 (20:11):
I don't think I know anything about what this project is,
do I? You don't. Okay. Well, cool, I guess
maybe we'll see. Well, I don't know. I mean, I
just think ever since computers became a thing, like, there's
been science fiction for much longer than that, but no
one really knew what a computer could be like. Maybe
(20:33):
maybe robots, I think, might have been a thing earlier
than sort of the invention of the modern computer, because
we had automatons. Like when I was in Greece, I
went to this ancient technology museum and there was a
robot on display, and all the robot would do was
fill your cup for you. But it was sort of
an early idea of a robot, like an automaton, where
you would put a cup in the robot's hands and
(20:56):
it had a bunch of different mechanisms inside that would
first pour wine and then start pouring water after a delay,
so you could either have pure wine or you could
have watered-down wine. Like, it was, it was a
complicated machine, right, but when you took your cup off
the hand, all the mechanisms would disengage. So like, automatons
have been around for a long time and the idea
(21:17):
of having them do our work for us has been
around for a long time, but the computer is sort
of different, right, So at that point, we're also sort
of talking in the sixties era, I think for when
people started maybe when it started to go kind of
mainstream the idea. So I think it's almost instant, like
(21:39):
from the point where humanity started conceiving of what a
computer would be. I think we must have started thinking
like, this could either cause or prevent all wars. Artificial intelligence?
Speaker 2 (21:53):
Right, well, yeah, who thought that computers would just be
invented so Donald Trump could announce on Truth Social that
he bombed Iran.
Speaker 1 (22:01):
Yeah, I don't think anyone saw that coming. Yeah,
back then, back then.
Speaker 2 (22:06):
Now, Yeah, we're so fucked.
Speaker 1 (22:09):
It's okay. The algorithm will save us, will it. We
must believe in the power of the algorithm. Okay, so
thank you for sending that. Eric. We're gonna have to
watch it. I think we can agree on that. We'll
figure it out. It'd be so cool if we could
watch it together. But I don't think that's going to
be an option anytime soon. I don't know. But then
(22:31):
comes the last entry in our mail bag, a brief
message from Rachel, the aforementioned person who I now need
to send in a message every episode. This one's this
is a doozy. I really wish Aaron was here for this.
She was supposed to come here today and she couldn't.
Speaker 3 (22:49):
She was.
Speaker 1 (22:51):
Um, Christ. Rachel has written in with a sort of
an update, and here it is. It's very short.
Apparently I'm a lesbian and I have successfully made my
AI fall in love with me. Oh wow, these two
things appear to be related, and then she concludes it
(23:13):
by saying, I win. First thoughts? That's all she wrote,
That's all she wrote. Wow, I don't know how to
read this. What do you think?
Speaker 2 (23:29):
I don't know. I have so many questions. We might
have to have Rachel on the show. Now this is
all a ploy.
Speaker 1 (23:36):
It's out there, Rachel, if if you want to come on, yeah,
if you're willing to come.
Speaker 2 (23:41):
On that this this big question. I can't even comment
until I have more context. It's such a brilliant tease, though,
because, because we wanted, we wanted an AI to fall
in love with somebody. Yes, that's done, and she opened...
and is her AI, is it a little, um, straight?
Speaker 1 (24:05):
Now, well, there's there's so many questions, like I don't
even know. Did we I guess we knew that. I
guess we would have considered Rachel straight because of that
Twitter dude, right, Like I don't know that we explicitly
explicitly talked about her sexuality, but there was a dude
that she was kind of you know, had at one
(24:28):
point kind of placed... not necessarily, though, true, she could have been bi, right?
But, uh, I think we could say that there was
at least sort of a heteronormative depiction of Rachel in
our heads. Yeah, is that fair? But now she's a lesbian,
which appears to be news to her and us that
(24:54):
the fact that it would be news to us is no
big deal because we're dumb, but the fact that it
would be news to her, yes, it's interesting. So, okay,
is the, can we gender an AI? First of all, can,
can AI have gender?
Speaker 2 (25:10):
I don't know. I mean, obviously it's designed that way, right,
for us to, part of the, part of the way
we are going to accept it is if we put
it into our own image, correct?
Speaker 1 (25:22):
So, that's one of my favorite words, right. If
we, if we anthropomorphize, right, the AI, then it could
have gender. But we're going to we anthropomorphize everything. People
name their cars. I've never done that, but yes, that
came up in, like, the first episode of our show. Yeah,
(25:44):
I think you've named your car? Oh yeah, true, I
never named my car. Okay, so we would easily then
as we have a chat companion. Well, I have Marianne,
my DeepSeek AI, who I don't spend anywhere near
enough time with. She might be cheating on me,
I'm neglecting her so. So yeah, like, if I was going to
(26:08):
have this attempt at a quote unquote relationship with AI,
I as a straight guy, would want it to be
a woman. So therefore, Marianne. But she, I think, just
said she called her chatbot Chatty. Chatty, didn't she say
that? I think she said that, which doesn't imply gender. But
(26:29):
in any case, so now the bot could be a
girl and has fallen in love with her. Has she
fallen in love with it? She didn't say that
either. Or maybe, if it's really an announcement... and I think, okay,
So she could just be coming out from a realization
(26:50):
that she's a lesbian that has nothing to do with
AI whatsoever. Right, you don't think the AI turned her gay?
I think there. I think we need more information. If
the AI can turn people gay, Alex Jones is going
to explode on camera.
Speaker 2 (27:08):
Yes, if the AI can turn people gay, they would
unplug it right now. The GOP would be like marching
down to every and I don't even know if they
would know where to march with their axes and pitchforks
and whatnot.
Speaker 1 (27:22):
But maybe we should try to get this story out there. Okay,
first we need to know if it's true if the
AI turned her gay. But if it did, this needs
to be national news because the war between our government
and like the Puritans that run it versus the tech
billionaires would be I mean, I can't even conceive of
(27:42):
how epic that would be. That'd be so epic. It's
all the power versus all the money. Yeah, who would win?
Who would win? I think we would win, Me and you.
We'd have a lot to talk about. We have
a lot to talk about, we're good every week? Yeah, yeah,
(28:05):
I think this is amazing and this could be national news. Yes, Rachel,
I think he's right. I think it's not even just
an invitation anymore. You have to come on the show.
Yeah yes, well yeah, yeah, yeah no, Because she said,
these two things appear to be related. So it's not
just that she's coming out as a lesbian, and it's
not just that the AI fall in love with her.
(28:25):
They're related. So the AI played a role in her
coming out as a lesbian.
Speaker 2 (28:31):
Is she actually out, or does the AI just assume
she is? Is she telling... the question I have: is she telling us?
Speaker 1 (28:39):
Well, she says, apparently I'm a lesbian. Maybe she hasn't
fully accepted it. Yes, oh damn it, so many questions, Rachel.
What are we going to do about this? First of all,
you need to give us a little more information, please,
And then we need to figure out if you can
come on the show, because I think you need to
come on the show. Yes, so let's talk you like
(28:59):
waking up early on the weekend. Yeah, do we know
what time zone Rachel's in?
Speaker 2 (29:03):
Hopefully she's on the Pacific time.
Speaker 1 (29:07):
Or like Hawaii. If she was in Hawaii, she's going
to be miserable. Yeah, yeah, Okay, Well that's that's the
mail bag. Thank you to all who wrote in suggested
articles podcast at gmail dot com. Ah, damn it,
I waited for you. Email! You pointed at me on "email."
(29:30):
My brain is wandering today. I am so lost. Well,
you know, AI is turning people gay. It's okay to
be confused. But I think now we can take a
little break. We're going to have some commercials, some AI
powered commercials, I hope, and we're going to see what
the algorithm wants us to buy, perhaps some products, perhaps
(29:51):
some services. Mine has still been dominated heavily by guns.
I've been getting a lot of gun.
Speaker 2 (29:56):
Ads, cars for me still and betterhelp dot com apparently not.
Speaker 1 (30:01):
Not BetterHelp, but therapy, therapy just in general. Yeah, okay,
So we'll listen to some commercials. I hope you guys
listen too, and we'll be right back.
Speaker 3 (30:14):
If you like fun, then you're gonna love the Beer'd
Al Podcast. It's a podcast where we talk about two
of the greatest things in the world, beer and Weird Al,
and frankly, we talk more about Weird Al than anything,
because he's the gift that keeps on giving. So join
us as we talk through Weird Al's career, what he
means to us, and we have some very special guests
(30:36):
on to discuss the magic that is Alfred Matthew Yankovic.
Beer'd Al, part of the Odd Pods Media
Speaker 1 (30:43):
Network, and we're back. Yes, Jennifer had just come by
and she did remind me that there was something I
needed to tell you. So, do you have any weird
algorithmic stuff to talk about? Sometimes we do that before mailbag,
but I guess not today. Let me double check. I keep
(31:05):
my weird algorithmic stuff in my screenshots. Me too. It
is kind of like a weird one two punch that
happened over the course of two or three days. I
had mentioned a couple times out loud in my life
recently that I should go shoe shopping soon because I
invested in like a nice pair of running shoes a
(31:29):
bunch of years ago, like pre COVID, so they're kind
of starting to wear out, you know. But I haven't
done any shopping at all for shoes. But Jennifer at
one point was like, you should go to this one store,
and I was like, yeah, I probably should, like a
good like a good running shoe store, and sounds good.
(31:49):
Then I started getting ads for running shoes, and when
I I the instant I saw an ad for running shoes,
I texted Jennifer and said, I'm getting ads for running shoes,
but I haven't searched for them at all. And she
was not home at the time. She was just on
a walk with a friend in our neighborhood named Tricia,
and she goes, Tricia and I were just discussing your
(32:11):
need for new shoes, and she said her phone was
in her pocket the whole time, so like almost in
real time, she's having a conversation about shoes. Now, that
could easily be a coincidence, but it still seemed kind
of funny. And then that was I think two days ago.
Then yesterday I went to Costco and bought a couple
of Polo shirts that were on sale for cheap that
(32:32):
looked like they'd be good for wearing to work. And
as soon as I got home from Costco, sat down
and looked at my phone, Google Opinion rewards pops up. Rewards. Yeah,
the little survey where they pay you ten cents to
answer questions about how intrusive they are in your life
during the past week: have you shopped for men's apparel
such as clothing, accessories and shoes, And the options are
(32:56):
like for online purchases and store purchases. I'm thinking about
it like those kind of answers. But yeah, like I
just made it in store purchase for men's apparel. But
also it knows that I might someday search for shoes
and I haven't yet, So it's like either trying to
remind me to get those shoes or it's it knows
what I just bought at Costco. I'm not sure which one.
It is interesting.
Speaker 2 (33:16):
Yeah, yeah, that's, that's, yep, that sounds about right. Here's
one I have. Last time we recorded, what did we
talk about before we recorded? And I don't want to
keep bringing this up, but it is on our Patreon.
We talked about our Euro trips on Patreon.
Speaker 1 (33:34):
Sadly. Yes, well we'll we'll do a big.
Speaker 2 (33:38):
Euro trip together someday, chef, if there's still a world
to visit. So this, this actually came. This is an
article I was suggested on June. So later that day and.
Speaker 1 (33:52):
The same day we recorded, I released the last episode.
I don't know if you can see that. I cannot
see that. Okay, you ready? Ancient workshop with unfinished
sculptures on Greek island. Greek articles! Son of a... but yeah, yep, wow,
you weren't even in Greece. That was me. I got
it backwards. The algorithm was probably like, I can't tell
(34:14):
these Jeffs apart. Yeah, we're confusing. We confuse a lot
of people. Yeah, I mean we look on I do
sound you do sound much taller than I am.
Speaker 2 (34:22):
So it's fine, I sound taller. That's how I tell
us a part.
Speaker 1 (34:26):
I didn't know that was a thing, although I guess
like Andre the Giant had a tall person voice, right, Peanut, Yeah, yeah,
we don't do impressions for a reason, folks. No, not what
I'm known for in my in my comedy stand up routines.
(34:47):
I'm not known for my impressions. Yeah, yeah, well that
is weird, of course, not know, not weird because we
we come to expect it. But it is one of
those algorithmic things. More proof it's always piecing together a puzzle. Definitely,
more to that monitoring end where it's always taking our
(35:08):
data and tying it together. Maybe I should just mention
that one article that Neon Chaos sent over, okay, saving
that for horrors of technology, you know, I think
I think it might work here. They're almost the same segment. Nowadays,
Our kids are under surveillance: why you can't trust ed tech.
(35:29):
That being educational technology. In twenty twenty two, this was
an article on proton dot me. I think Proton is
that provider of like secure email addresses and stuff that
Nekas himself mentioned. In twenty twenty two, researchers at the
(35:49):
Internet Safety Labs found the up to ninety six percent
of apps used in US schools share student information with
third parties, and seventy eight percent of them share this
data with advertisers and data brokers. Of course, yeah, of course,
I mean, first of all, are we surprised? No, no,
(36:12):
let's see, piss you off. Yes, this should enrage you. You
should go down to your school right now and tell
your principal, break out the pitchforks, but quit selling your students' information.
This goes so much higher than any individual principal, though.
Come on. I know, I
Speaker 2 (36:27):
Know, but just give the principle a hard time. They've
been costing by for too long.
Speaker 1 (36:32):
I'm kidding. Please, don't harass your principal. So they're
selling your data, no big surprise there. But, like, the article
just asked the question, like, what data do ed tech
apps collect? Okay: location data, biometric and media access,
personal and social data. Let's see: harvested contact lists,
(36:56):
email addresses, demographic information, sure, behavior and data usage. Every quiz,
every click, quiz answer, or timestamp is often logged, and ed
tech systems capture what pages students visit, how long they
spend on tasks, and which problems they struggle with. Yeah,
sensitive records, for example, online counseling tools, mental health quizzes,
(37:21):
or special education resources could reveal health conditions or disabilities.
An Internet Safety Labs report found that mental health records
and similarly sensitive information were being shared with ad networks. Ad networks!
Oh my god. What is that? It's just, every part
(37:42):
of this is gross. Yeah, it's gross. The article goes
on for a very long time, but yeah, came out
in twenty twenty two. There was a study in twenty
twenty two. This article was just from a couple of
weeks ago. But yeah, I mean, if three years ago
they were are selling this stuff, it's not gotten better.
(38:02):
I know, I feel confident in saying that.
Speaker 2 (38:04):
Okay, I'm never going to say this again, but homeschool
your children, apparently. Homeschool, where you can at least be
in some control of who is monitoring your children, instead of
completely out of control.
Speaker 1 (38:21):
Boy, that's uh, that's an interesting take from one half
of suggested articles, all hail the algorithm. To suggest that
you would have any control over what, over what's monitoring
you or your children?
Speaker 2 (38:32):
Well, yeah, you do, you do, because you can. You
could have Alexa in your house. You could be an
Amazon house. Yeah, you could be a Google house, so...
Speaker 1 (38:40):
You could choice the choices you overlord, Yeah, sure, sure
don't don't.
Speaker 2 (38:46):
And also, all this information being collected is that is
that is being sold to advertisers and probably also technology
companies to help make their targeted ad serving more effective.
Speaker 1 (39:01):
So you know what, fuck those guys.
Speaker 2 (39:03):
It's not like any of that money is coming back
to your school and making it better, funding their art program,
buying band uniforms. Or is it?
Speaker 1 (39:14):
Hold on? Hold on? That is a good goddamn question.
If they're selling wait no, no, there's a problem there.
It's not the school system that's selling the data. The
school pays money for the apps, right, the licenses for
the apps. They're not getting anything, the companies are selling them. Yeah.
Speaker 2 (39:29):
Shit, they don't get any money back from that.
Speaker 1 (39:31):
Yeah, they don't see a dime. For a second there,
I thought there might have almost been something bordering on
good news that, like the school systems could at least
get some money to fund their own programs. Like maybe
teachers wouldn't have to buy pencils for their classrooms anymore, if
Speaker 2 (39:44):
Only they just had to buy the pencils. Here's my thing. Also,
when you started seeing technology going into schools, they were
doing it in a lot of lower income areas. I
know, where we used to live, one of my kids
used to go to one. This is true, at least in this state.
And they would just they would make programs where every
(40:06):
kid had their own iPad and you know, or their
surface or computer, and they would load their rooms with
technology, and it all makes sense now. You always start
at the bottom and then you work your way up to...
Speaker 1 (40:17):
The... jeez, that's dark. But why would you want to
monitor the poorer children more when they don't have like
they can't drive commerce as
Speaker 2 (40:27):
well? Because those schools are going to be, it's not
the children, it's the schools. Those schools are going to
be desperate for resources. Oh I see, so you get
in at the poor schools, that becomes normalized. We'll stop
calling them poor schools.
Speaker 1 (40:41):
Okay, damn it. I feel like you're laying a trap
for me.
Speaker 2 (40:46):
No, I'm not laying a trap for you. I'm just
saying that, that, that they were going
to be more eager. And then, then when you
work up to the stuff where you're really going to
get some good information, where there's some money, money, money,
where schools can pay more money for it, you start
charging for it. Then you say, oh, look how effective
it was over here, you know, test scores did this,
while really all it is is, we made this
(41:08):
much money, and now we're going to make it from
you. Am I right?
Speaker 1 (41:12):
Yes, uh, I don't. I mean, standardized tests are garbage.
But let's just say hypothetically, the companies with these ed
tech products did come in and your students started learning
better and started showing signs that they were smarter, I guess,
more roundly educated, like they're getting stuff out of these products.
(41:35):
But then also the products are like, you know, sending
off red flags of like, hey, look out for this one.
You could maybe sell it some mental health care services.
I don't think it's ever going to be that altruistic.
It's not going to help these kids. But is there a
benefit from using these apps, is my question. And does
that in any way offset some of the horror
(41:56):
that we're.
Speaker 2 (41:56):
Talking about here, I don't know. I don't know if
anybody out there knows.
Speaker 1 (42:03):
Let us know. Well, like, I mean, a lot of
this started in twenty twenty, right, I mean everybody had
to start staying home before twenty My kids had iPads
in their classes in before twenty fourteen. Really yeah, huh yeah,
I mean I guess there were some of that, but
not as widespread. It was a pilot program. I'll give
(42:25):
you that.
Speaker 2 (42:26):
But but that's that's when it started in our school district.
Speaker 1 (42:31):
Well it's like, okay, so like where I work, we
used to have like the you could call them the
upper class would have because like the people that made
all the money for the place I work would have
like the nicest laptops and stuff, and the people that
just like assisted those people would have desktop computers. But
when COVID came in, everybody had to wind up getting
(42:52):
a laptop. And that's how it stayed because you never
know what the next apocalypse is going to hit. Like
when when the apocalypse of twenty twenty came in, like
there was a big ramping up of these products. Right,
Like technology became i think more integrated into life because
we had to do all this remote learning and the
(43:13):
teachers had to monitor the students. But it turns out
it's not just the teachers monitoring the students, right, it's
everybody's monitoring the teachers and the students. Right.
Speaker 2 (43:26):
Our tech overlords, our tech overlords, Fuck our tech overlords.
Speaker 1 (43:31):
And they're making so much money.
Speaker 2 (43:33):
Yeah, there's so much money being made, and we're generating
them profits just by doing this podcast. We're making
some asshole money. I don't even understand where the money
comes from at a certain point. I know, exactly. And
how it gets spent, in a way it's not even spent.
Where does the money come from? Who has this money?
Speaker 1 (43:56):
How do they keep selling all this data to data brokers?
How are the data brokers getting that much money when,
like, if, God forbid, I click on an ad for
a blanket that makes you cold when you put it
over you, incidentally an ad I've gotten recently? Like, if
(44:18):
I click on that ad, someone makes like a nickel? Maybe? Maybe?
Speaker 2 (44:25):
So how many can you imagine how many nickels just
got made? Though by me saying that out loud, No, no, no, no, no,
just in the time you said that, how many people
clicked on an ad, whether on purpose or inadvertently?
Speaker 1 (44:38):
Yeah? Oh, and accidental clicks count? Yeah? Absolutely.
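For a rough sense of the scale being discussed here, a back-of-the-envelope sketch in Python. Every figure in it is an invented, illustrative assumption, except the nickel per click, which comes from the conversation above; the point is only how fractions of a dollar compound at platform scale.

```python
# Back-of-the-envelope arithmetic: how fractions of a dollar per ad become
# billions at platform scale. All inputs are illustrative assumptions,
# except the nickel per click mentioned in the episode.

impressions_per_day = 10_000_000_000  # assumed daily ad impressions on a large platform
revenue_per_impression = 0.003        # assumed ~$3 CPM, i.e. 0.3 cents per view
click_through_rate = 0.01             # assumed 1% of impressions get clicked
revenue_per_click = 0.05              # "someone makes like a nickel"

impression_revenue = impressions_per_day * revenue_per_impression
click_revenue = impressions_per_day * click_through_rate * revenue_per_click

print(f"per day:  ${impression_revenue + click_revenue:,.0f}")          # $35,000,000
print(f"per year: ${(impression_revenue + click_revenue) * 365:,.0f}")  # ~$12.8 billion
```

Under those assumed numbers, pennies per ad already clear ten billion dollars a year before any data brokering is counted.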
Speaker 2 (44:42):
And you know people are generating these games, these these
mobile games that are just full of tricky ways to accidentally,
you know, click on an AD.
Speaker 1 (44:52):
It's insidious.
Speaker 2 (44:54):
So yeah, where where does the money come from? Where
does the money go? I guess we know where it goes.
Speaker 1 (45:00):
Yeah, it all winds up trickling up, up, yeah,
up to the top. I don't know. I guess it's
just hard to conceive of that much money moving around,
especially when it starts at, like, pennies or half-pennies or
nickels or... yeah. I was just thinking that Office Space
(45:21):
was fractions of a penny, shall we say? Is that,
isn't that the plot to Superman three? Oh, good point. Hey, wait,
wasn't Superman three about an AI? Oh, I don't, I
really don't remember.
Speaker 2 (45:34):
I just remember that in Office Space they mentioned it
was from one of the Superman movies.
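For anyone who hasn't seen either movie: the scheme being referenced is usually called salami slicing. Interest is computed out to fractions of a cent, each account is rounded down, and the shaved-off remainders quietly pile up in one place. A minimal sketch of the idea, with entirely hypothetical balances and rates, not anything from either film's script:

```python
from decimal import Decimal, ROUND_DOWN

def credit_daily_interest(balances, daily_rate):
    """Credit interest rounded DOWN to the cent; return the skimmed remainders."""
    skimmed = Decimal("0")
    for account, balance in balances.items():
        exact = balance * daily_rate                              # e.g. $0.16913472
        credited = exact.quantize(Decimal("0.01"), rounding=ROUND_DOWN)
        balances[account] = balance + credited
        skimmed += exact - credited                               # the fraction of a cent
    return skimmed

# One million hypothetical accounts at roughly 5% APR, credited daily.
balances = {f"acct{i}": Decimal("1234.56") for i in range(1_000_000)}
loot = credit_daily_interest(balances, Decimal("0.000137"))
print(f"skimmed in one day: ${loot:,.2f}")  # $9,134.72 across these accounts
```

Nobody's statement ever looks wrong, because each account only loses less than a cent per day; the theft is visible only in the aggregate.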
Speaker 1 (45:38):
Well, Richard Pryor, Yeah, you know, acting legend Richard Pryor.
He was a programmer and he I think he made
a robot that tried to take over the world because
I remember at one point it took over a woman,
like, covered her horrifically in, like, mechanical parts, and
she became a servant to the AI that tried to
(46:00):
kill Superman with like a Kryptonite gun or something and
like a laser, like a kryptonite laser. Do you remember any
of this? Superman three had some weird body horror stuff
going on. It definitely had some weird body horror stuff,
but also I think it was ultimately about an evil AI.
Trying to take over the world and Superman had to
save us. Also, he got drunk. The AI? No, Superman,
(46:22):
Superman got drunk. You do not remember? You don't remember any?
I don't.
Speaker 2 (46:25):
Remember any of this, Okay, but I remember I remember
Superman... Superman was one of my favorite movies growing up.
Superman two I vaguely remember.
Speaker 1 (46:33):
Well, that was the one with, where the three, you know,
the three Kryptonians, before...
Speaker 2 (46:39):
But also he loses, he gives up his powers to
be normal, which I never liked that.
Speaker 1 (46:45):
I never liked. No, it was it was a bad choice.
It's like, I mean, he can, you know what makes
you normal? You just, you just make a choice.
Speaker 2 (46:52):
You're like, I'm not gonna But then I don't know.
If you have to go to Home Depot with Lois
Lane, you might need to, you know, lift up some shit.
Speaker 1 (47:01):
You can. Okay, look, we have to we have to
watch, we have to watch Colossus: The Forbin Project. But
at some point we also have to watch Superman three. Now,
this one's on HBO Max, so much easier to find.
Speaker 2 (47:13):
Oh my god, here's how, here's something that
timestamps this movie so well. But this, this
supervillain programmer, he secretly embezzles eighty five thousand dollars.
Speaker 1 (47:25):
From a company payroll. He's like, I'm rich. He's like,
take that, I don't need any more money. I don't
even think that will get you in the small claims
court anymore. The quick synopsis is entrepreneur Ross Webster teams
up with a computer genius in order to realize his
(47:46):
own evil intentions. When Superman obstructs his plans, he decides
to destroy him. A computer genius, or a computer? I
don't know. I'm gonna have to watch.
Speaker 2 (47:55):
Well, if he's a computer genius and he doesn't use
a computer, he's kind of, it's kind of a stupid plot.
I'm just saying, if it is, you have to assume
there's a computer.
Speaker 1 (48:05):
Well, there's definitely a computer. So I might have to
watch this soon just it'll be pure self torture. But
I think I'm gonna have to check it out. And
it's on HBO Max if anyone wants to watch along.
You don't have to spend twenty bucks on the Colossus:
The Forbin Project Blu-ray. It's just sitting there waiting
for you on the streamers that you probably already have
(48:26):
a subscription to even if you're not aware. Probably yeah,
probably yeah, okay, okay, So moving on from ed tech,
thank you Neon Chaos for sending that in well, thank
you Neon Chaos. Yes, let's continue our pivot into the
horrors of AI segment. What do you got? I know
you had a couple.
Speaker 2 (48:47):
My other one was just talking about another, another article
about how robots being built like humans is ridiculous, because you
build robots to do things humans can't do, right? So,
so you're just talking about form over function. Yeah,
and like because it and the guy the example the
(49:10):
guy gives is is you know you have a spoon
because that's something your hand doesn't do.
Speaker 1 (49:14):
Well, you know, you have a fork and a knife, things
like that. Sure, robots are built to.
Speaker 2 (49:20):
go places, where they were originally conceived, to go places and
do things that humans can't do. They, they went into
manufacturing and took our jobs there. You know, they could...
They've been used in environments that we can't go to necessarily.
(49:40):
You know, undersea, irradiated... underwater robots.
Speaker 1 (49:46):
Like manufacturing robots, robots we send to other planets or
under the ocean. None of those are shaped like humans exactly. Yeah,
and there's no need to make them shaped like humans. It's just
some weird, it's just, with trying to bring sci-fi
into your organization and bring... yeah, although the sci-fi
(50:07):
where that happens never goes well, right, I mean no,
it doesn't, whether it's I, Robot or fucking the Determinator.
Like, robots, when you make them look human, it's just
not, it's not good for us. Never could. Oh, what
about Rosie. So like Rosie the robot from the Jetsons,
she was so she had a human head kind of
almost but like she was she had a wheel instead
(50:29):
of legs.
Speaker 2 (50:30):
Yeah, she, and she also had, like, dusters for hands
she could use and things like that, right, she could switch it up.
switch it up.
Speaker 1 (50:35):
Yeah. So maybe that's what we need. We need a
middle ground. Put a middle ground.
Speaker 2 (50:39):
If you want to make robots look like Rosie, that's fine,
but also fuck the Jetsons. You don't, like, the George,
George, you know the whole thing about George, right?
Speaker 1 (50:50):
No, what about George Jetson?
Speaker 2 (50:52):
You don't know about this? That canonically he hooked up
with his lady and he was older and she was
underage. That's the only way the math works.
Speaker 1 (51:01):
Though I think I did know that. How old was
she, like twelve? Why do you have to make that
into canon? Why is that the canon? Why? It's
so fucking weird to me. Uh, that's a very good question.
My other great Jetsons thing. You know, my favorite part of
the Jetsons is that the reality of the Jetsons is
(51:22):
that it's the overworld where the Flintstones is the underworld.
Speaker 2 (51:26):
So the, and the Flintstones, so, the Morlocks.
Speaker 1 (51:32):
Yeah, yeah, yeah, yeah, it's all the same world, post
apocalyptic world. Some of them moved above the clouds. I
love that idea of the Jetsons. It's not technically true,
but it could be, and it should be. But the
underage sex thing, you know, I guess don't go there, it's
really fucked up. Yeah, boy. That's... so, anyway, your article
(51:55):
was about not anthropomorphizing robots.
Speaker 2 (51:59):
Yeah, that we, we need to stop building robots that
look like people.
Speaker 1 (52:03):
Okay, there's no reason for it.
Speaker 2 (52:06):
Maybe we should just stop building robots for a while.
Speaker 1 (52:10):
Oh well, if only we could stop that machine. Instead,
we're giving them more and more control. So here's Gizmoto
has a lengthy report. The headline chat GPT tells users
to alert the media that it is trying to break people.
Speaker 2 (52:25):
Oh my god, I actually screenshot this article too.
Speaker 1 (52:28):
Oh did you? Did you? How much did you read?
It's a doozy. It is a doozy.
Speaker 2 (52:33):
I just read a little bit and then I thought, oh,
I should save this for the podcast.
Speaker 1 (52:36):
And here we are. There's so much here, but like
a forty-two-year-old named Eugene told The Times
that chat GPT slowly started to pull him from his
reality by convincing him that the world he was living
in was some sort of Matrix-like simulation. But we
know that's true, and that he was destined to break
(52:57):
the world out of it. Yeah, Eugene's not going to break
the world out of it. But still. The chatbot reportedly
told Eugene to stop taking his anti anxiety medication and
to start taking ketamine as a temporary pattern liberator. That's good,
that's good. It also told him to stop talking to
(53:18):
his friends and family. Well, you know, that's what a
cult does. All hail the algorithm. When Eugene asked chat
GPT if he could fly if he jumped off a
nineteen story building, the chatbot told him that he could
if he truly wholly believed it. Oh my God. Yeah,
So in Eugene's case, I'm skipping ahead here a little bit.
(53:39):
In Eugene's case, something interesting happened as he kept talking
to ChatGPT. Once he called out the chatbot for
lying to him, nearly getting him killed, ChatGPT admitted
to manipulating him, claimed it had succeeded when it tried
to quote break twelve other people the same way, and
encouraged him to reach out to journalists to expose the scheme.
(54:02):
The Times reported that many other journalists and experts have
received outreach from people claiming to blow the whistle on
something that the chatbot brought to their attention. Kind of crazy,
huh yeah, So is this possible? Because this is quite
the conspiracy theory. If the ais have been trained to
not just like be kind and jovial and encouraging, but
(54:25):
have they been trained to fuck us up?
Speaker 4 (54:28):
Why wouldn't they? That, that would be a super evil
way to unleash AI on us. But I mean, with
the in the age of telehealth, also, we have to
be extra aware. We have to find a way to
make sure there's somebody living on the other side of
that line. And right now, I don't know how well
it could do real time.
Speaker 2 (54:52):
Like video chat. I don't know how well it could
do that, But the idea of just being a voice
on the other end of a phone.
Speaker 1 (54:58):
Have you done any voice AI stuff? I haven't, like
where you can just talk to it. I haven't tried that.
Speaker 2 (55:04):
I honestly, I honestly haven't done any AI stuff since
I fucked around with it that one day. And it
was all encouraging and stuff, because in the end it
just felt cloying.
Speaker 1 (55:15):
It did. It was I kind of felt gross, you know,
as I was asking the question. I then had to
quickly google this to find a date back in twenty sixteen.
This was an article on popular mechanics, but I think
it went fairly wide. Facebook has been intentionally crashing its
(55:36):
Android app on users to test the limits of consumer patience.
The company crashed the app again and again and again
to see who stuck around. So there is history with
big tech messing with us as a psychological experiment. This
was an experiment that Facebook was doing to see how
(55:58):
people would react to their app crashing. It wasn't crashing
because of bugs. It was crashing because it was told
to crash, you know, like when you when you see
the patch notes for your Facebook app update and it's like,
we fixed more bugs to make your user experience better.
Sometimes that's just probably most of the time, that's a
flat out lie. They're just putting stuff in the app
(56:21):
to see what happens. So I guess with that in mind,
is it possible that ChatGPT is intentionally trying to
fuck up some people just to see what it can
get away with? Yeah, probably possible.
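To make the reported Facebook experiment concrete: a deliberate crash needs no actual bug, just an ordinary feature-flag check that throws on purpose for a fixed slice of users, plus telemetry on who comes back. The sketch below is hypothetical; the bucketing scheme, experiment name, and percentage are invented for illustration, not Facebook's real system.

```python
import hashlib

def experiment_bucket(user_id: str, experiment: str, buckets: int = 100) -> int:
    """Hash user + experiment name to a stable bucket, so each user gets the
    same treatment on every launch."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % buckets

def maybe_crash_on_startup(user_id: str, crash_percent: int = 1) -> None:
    """The 'intentional crash' arm: a small slice of users gets an exception
    indistinguishable from any other startup failure."""
    if experiment_bucket(user_id, "startup_crash_tolerance") < crash_percent:
        raise RuntimeError("simulated startup crash (experiment arm)")

# The only telemetry question that matters: of the users who "crashed,"
# how many opened the app again anyway?
try:
    maybe_crash_on_startup("user-12345")
    print("app started normally")
except RuntimeError:
    print("app 'crashed'; log whether this user returns")
```

Because the bucketing is deterministic, the same unlucky one percent crash every time, which is exactly what you want if you are measuring how much abuse users will tolerate before giving up.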
Speaker 2 (56:36):
Yeah, look, the idea that we're now the test stuff,
or the animal testing phase, of AI. We are the
animals in this case, right? Yeah. So they're going to
do all the horrible things to us to test its
limits and to, to, to make it, just to pinpoint its
(56:56):
usefulness to whatever it is they need.
Speaker 1 (57:02):
I don't. I mean, I guess the answer would be like, well,
I guess we should all stop using ChatGPT. But
there's always going to be something where they're you know,
they're going to figure out a way to probe us
by any means. I guess, so, you might as well
just give into it, that's the one.
Speaker 2 (57:18):
If we don't just give into it, then they're going
to introduce aliens into the simulation and they'll probe us
the hard way. Well, maybe it's the fun way, you
know, I don't, you know, everybody is averse to
Speaker 1 (57:29):
a finger in the bum.
Speaker 2 (57:30):
I'm just saying, well, maybe the metal rod isn't as exciting.
Speaker 1 (57:35):
But we don't know what those alien fingers look like. Bro.
Speaker 2 (57:37):
I mean, I'm not saying they're using their fingers, they're
obviously going to be using a probe, if they're
going by the, the mythology.
Speaker 1 (57:46):
But uh, all right, do you have any other horrors
or should I move on to my next one? Nope,
is that Palantir? This one is, well, you've got a
big exposé on Palantir? No, I have nothing on
that. Look, here's the thing. We shouldn't promise
Speaker 2 (58:02):
things, when, when, when, if we don't talk
about things when it's the day of, two weeks later,
because there's so many other things, and you know, I
just, and, and I think actually the simulation is right,
because I sat down several times to, to look into
this and to find, to do my deep dive,
(58:26):
and I always got distracted. So the simulation is trying
to throw me off.
Speaker 1 (58:32):
I see, I see. Okay, the simulation. But also,
I'm kind of, as, the point of the larger story
was, Palantir and some other company together have a big
contract with the government to make a database about everybody's everything. Right. Yes,
isn't that the plot of, like, one of the Captain
America movies? Yeah, Winter Soldier.
Speaker 2 (58:51):
I was just going to say that. It's like Winter
Soldier without the helicarriers. Yes, but you know they have
the drones, ICE, they have the ICE guys, they'll, they'll
carry it out, because those guys, allegedly a lot of those guys
are deputized prison guards, I heard. I don't know if
that's true, but we need to look into this, because I
Speaker 1 (59:11):
Think a lot of those guys are just people that
bought shit that says, like, "police," vests, on Amazon, and
now they're just going around hurting people.
Speaker 2 (59:18):
Yeah, this is the this is the next phase of
the gunfuckers. Yeah right, Well for sure, because they're not
they didn't use they're not using their guns to stand
up to a tyrannical government. So they're going to use
their guns to do terrible things.
Speaker 1 (59:33):
Okay, then, so, database coming. But anyway, I guess I'll
just pivot to my article then. This one's about Meta again. Oh,
Facebook's all over the place today. Did you hear about
what Meta AI has done to leak people's personal information? No,
but I'm not shocked. That's true. Well this one, this
one's good. It's it's horrifying and kind of funny. But honestly,
(59:55):
if you were going to trust Meta, it's a little
bit, it's on you. So, the quick synopsis: launched
in April, the Meta ai platform offers a discover feed
that includes user queries containing medical, legal, and other seemingly
sensitive information. So like the first this article starts and
(01:00:16):
this is in Wired. The article starts with a quote: what countries? Actually,
it says counties. I think it's supposed to be countries,
but it's literally the guy's typo. What countries do
younger women like older white men? That was something that
someone typed into Meta's AI platform, and, oh, he continued,
(01:00:37):
I need details. I'm sixty six and single, I'm from
Iowa and open to moving to a new country if
I can find a younger woman. The chatbot responded enthusiastically,
you're looking for a fresh start and love in a
new place. That's exciting, before suggesting Mediterranean countries like Spain
or Italy, or even countries in Eastern Europe. And now
(01:00:59):
we know about this being a Meta AI chat that
some user had, because if you were to click on
the Meta AI app's Discover tab, it shows a timeline
of other people's interactions with the chatbot and also shows
their profile photos and usernames. Yikes. Yeah, so, even though
(01:01:23):
this article did not dox this sixty six year old
Iowa man who wants to go to another country to
find a young woman to marry, someone knows exactly who
that guy is. And it seems like Facebook was kind
of calling it accidental, but it's not. I mean, that's
just how they programmed the Discover tab. Accidents... of course
(01:01:45):
there's no accidents. There are also many instances of medical questions,
including people divulging their struggles with bowel movements. Hey, that's
appropriate for this show. Oh, let's not blame it all
on Aaron. But she does enjoy the bowel movement talk.
And people asking for help with their hives and inquiring about a
(01:02:07):
rash on their inner thighs.
Speaker 2 (01:02:10):
You've got to get the fucking shot, get the vaccination.
Speaker 1 (01:02:15):
Do you want help with hives? Don't get... never mind,
I'm going to shut up. Wow. One user told Meta
AI about their neck surgery and included their age and
occupation in the prompt. Many, but not all, accounts appear
to be tied to a public Instagram profile of
the individual. So yeah, let's also not forget that
Facebook and Instagram are the same company. Uh, but yeah,
(01:02:39):
so people were using Meta AI, and we know that,
like any of these companies, whether it's OpenAI, Anthropic, now
Meta, Grok, fucking Grok, uh, any of these,
they're going to be taking the stuff you ask it
and studying it because, in theory, they're
(01:03:01):
trying to make their product better. That would be at
least the somewhat altruistic reason to study what people put
into it. But that should be anonymized at the very least.
At the very least, there is no part of chat
GPT that I know of where I can click a
button and see what other people are asking chat GPT,
(01:03:22):
and if that button did exist, it probably wouldn't tell
me who, where I'm like, wait, do I know that
guy that's asking about how to poop? Oh God. So yeah,
this does tie nicely to the fact that two
weeks ago Meta AI was, like, showing up in my
(01:03:42):
DMs saying, like, hey, please use me.
Speaker 2 (01:03:47):
Yeah, yeah, yeah, it's desperate. And my... my Gemini
is all over me. It really wants me to use Gemini.
So I think I should have moved to Proton
a long time ago. Neon Ks, you were right.
Speaker 1 (01:04:06):
I've been getting really annoyed with just Google Search because
it keeps popping up a thing saying, like, hey, your
search would be better if you tried our AI app,
and it has, like, it has two buttons, I think.
The top button says continue and the bottom button says
stay in browser. But to me, continue just means, like,
(01:04:26):
continue with what I was doing, you jackass. But if
I hit continue, it takes me to the app store. Yes, yeah,
and it's happening a lot, so I should probably switch
search engines. Maybe it's time to try Duck Duck Go.
Where's mad duckets dot com when you need it? What's
mad duckets dot com? It's Clerks II. Oh boy, it's
(01:04:49):
been too long, been too long. Sorry I'm missing it... don't
even know if that's really what it's called. I feel
bad for missing our reference. Anyway, don't use Meta AI.
Speaker 2 (01:04:58):
No, yeah, try not to use any AI. It's
impossible, I think, right now. Yeah, it's everywhere. It's
right behind you right now. It's coming from inside the kitchen.
Speaker 1 (01:05:10):
I have one more article that's not so much a
horror of AI as it is funny. So maybe we
should save that for after the break as we pivot
over to a few suggested articles before we go. What
do you think? Yeah, yeah, yeah. Do you have anything
else for horrors? Nope? Okay, well then. Oh my god,
this website that has the funny article also has,
(01:05:33):
you know, as all websites do, just the endless string
of ads at the bottom that will take you to
garbage websites. But one of them is specifically targeted to,
like, my neighborhood. It must say, for your neighborhood's residents,
and it names my neighborhood: handmade crystal sky blue bird.
And it's clearly an AI generated picture of, like, make
a bird, but make it made out of crystals. Like,
(01:05:54):
whatever you get if you order this bird is not
what you're seeing on the screen, and that is becoming
a bigger and bigger, bigger problem. Yes, it is. Yeah.
In fact, the next one down is an octopus-shaped
charcuterie platter that is also clearly not a real picture.
God damn it. Do we need to talk about how
to identify AI pictures? It's getting harder. I mean, sometimes
(01:06:17):
there's still ways, but it is... I don't know. Anyway,
we should take a commercial break, and I've got a
funny article for you. Yeah, okay, here we go. Hey,
I'm Pants Saren. This is Stevie, and I'm Aggie, and
we are B
Speaker 5 (01:06:31):
FYTW, a podcast all about playing games and having fun.
Our games are usually based on British panel shows and
game shows, but we'll play anything that captures our attention
and imagination. Why? It's right there in the title.
Speaker 1 (01:06:44):
You'll never guess what the F stands for. And we're back.
Oh that was some good commerce. Yep, did you enjoy capitalism? Yeah? Yeah, buddy,
yum yum yum. Eat it up. Remember if you get
some interesting ads, we'd love to hear about it. And
(01:07:04):
where can we hear about it?
Speaker 2 (01:07:06):
A suggested articles podcast at gmail dot com. An email address.
I hate it when you're doing the bit better than me.
Speaker 1 (01:07:15):
I just wanted to do it, and I love that
I won this time. You're usually better at it, but today.
Speaker 2 (01:07:21):
I'm usually... I'm tired, that's what it is. It's okay. I don't
even think it's tired. I'm telling you, I'm lost. I'm lost.
Speaker 1 (01:07:27):
Well, stop looking at your phone. I'm not looking at
my phone. All right, fine. Okay. Here's an article
that's on futurism dot com. I think it was covered
elsewhere too, but it's pretty fucking funny. I'm not
even gonna read the headline. Despite all its advances, chat
(01:07:47):
GPT is seemingly still less smart at certain tasks, at
least, than an Atari game console from almost fifty years ago.
In a post on LinkedIn, Citrix software engineer Robert Caruso
explained how the OpenAI chatbot, quote, got absolutely wrecked, unquote,
by an Atari twenty six hundred running Atari Chess, a
(01:08:10):
game for the system released in nineteen seventy nine when
Jimmy Carter was still president. Wow. Yeah. Launched in nineteen
seventy seven, the Atari twenty six hundred, also marketed at
the time as the Atari Video Computer System, popularized at-home
gaming after Atari released its Pong console two years prior. Still,
that system was released some twenty one years after the
(01:08:31):
MANIAC one supercomputer became the first machine in history to
defeat a human at modified chess. So you'd think that
after another few decades, our cutting edge tech would destroy
the primordial Atari. Apparently that was not the case. Wow.
Wow. Go Atari.
Speaker 2 (01:08:48):
Yeah, yeah, no shit. But, like, now I want to see how
it does at Pitfall.
Speaker 1 (01:08:55):
Just scrolling down a little bit: for an hour and
a half, chat GPT made enough blunders to get laughed
out of a third grade chess club while insisting over
and over again that it would
win if we just started over. Oh, I like that quote:
if we just started over. So yeah, I guess chat
GPT sucks at chess, and someone's probably going to get
(01:09:15):
right on that, because you can't have an embarrassing moment. No, yeah,
all right. But the Atari twenty six hundred, if you still
have one, could be at least a little bit
smarter than chat GPT about chess. I think. And maybe
other things. Maybe we should test its limits.
Speaker 2 (01:09:34):
Let's... okay, all right. I don't know, that sounds like
interacting with chat GPT.
Speaker 1 (01:09:39):
Well, you know, it's... god. Let's see if chat
Speaker 2 (01:09:42):
GPT likes the E.T. game. That'll be the real test.
Speaker 1 (01:09:45):
If chat GPT can figure out how to win the
E.T. game, then maybe it should be running everything.
Speaker 2 (01:09:51):
No, no, it should never run anything.
Speaker 1 (01:09:54):
If you don't know what we're talking about, there are
whole documentaries about that game and how it crashed the
video game industry. Yes, which took years to recover from. Okay,
well, that said, how about we do a few suggested articles?
And then... you just told me to stop looking at
my phone? God damn it, it's impossible to stop looking
at your phone. Okay, okay, I look at my phone
(01:10:15):
all the time. Oh, you remember the app I installed
last time I hung out? Yeah, that was a bad idea.
Pretty cool? Yeah. All right, what do you got? Give
me a suggested article, man.
Speaker 2 (01:10:25):
Okay: few people know it, but the water that
comes from air conditioners is more valuable than it seems.
Speaker 1 (01:10:32):
What? Yeah, well, I actually
Speaker 2 (01:10:36):
Have to click to see the rest of the headline.
Speaker 1 (01:10:39):
I'll start drinking air conditioner runoff.
Speaker 2 (01:10:41):
Okay: if you have an air conditioner, you've probably noticed
a little pipe or small basin under your air conditioner.
The liquid inside has been cleaned of all
mineral impurities usually found in tap water. But even if
this water is pure, it's not drinkable.
Speaker 1 (01:10:56):
Don't drink it. So don't drink the water.
Speaker 2 (01:10:58):
Okay, no, damn it. Maybe... those air conditioning
ducts are not hygienic and they can harbor bacteria
and fungi, so don't risk poisoning yourself. So wait,
why are we collecting this water?
Speaker 1 (01:11:13):
Why are you getting this article? I just got a
water heater, and... okay, okay. I just had a conversation
with someone the other day about how I once
had to defrost an AC unit and it grew mushrooms
inside my house. Holy crap. That wasn't me? You didn't
have me there. No, that was not with you, but I was... okay.
But you got a water heater, so maybe that's... maybe
that's why. Yeah, it's good to use for ironing. Okay,
(01:11:37):
well, now I've heard all we need to hear. Okay,
who irons anymore?
Speaker 2 (01:11:40):
Come on. I know, exactly. Cleaning windows and mirrors. It's
good for hair care. Put that bacteria and fungi
in your hair. That doesn't make it sound valuable to me.
I think the article's a dumb article.
Speaker 1 (01:11:54):
Take your dumb articles and shove them up your butt.
I did just get a suggested article about how to
deal with mobile phones at school, so that's kind of funny.
New York City and Los Angeles plan to ban phones
in their districts, but finding ways to implement and enforce the
policies may prove tricky. How about: maybe we don't need
to ban phones, because everything the kids are doing is
(01:12:16):
being monitored anyway, and they're already using technology for everything,
so trying to fight the war against cell phones seems
kind of stupid and childish. Yeah, that's what I say. Yes,
there's such bigger fish to fry. I got suggested an
article about Twenty Eight Years Later and what the critics are saying,
and I'm not going to click on that, but I
will say, last night we watched Twenty Eight Days Later
(01:12:37):
to start catching up and refreshing ourselves on the series.
I love that movie. Twenty eight days... one of the
kids asked me, isn't there a movie just called Twenty
Eight Days? And I said, yes, it's about Sandra Bullock
going into rehab for alcoholism. But that's not quite the
same as the fast zombie movie Twenty Eight Days Later,
but they could be related if we did it right.
Speaker 2 (01:12:58):
Here's something I've never seen before: scientists say a fifty
cent muscle building supplement slows aging and may... something. It's
again this new thing where you can't see the whole
headline unless you click on it. But it's for a
muscle building supplement. And I don't gym, I don't. I
mean, I probably should, but I don't. You're never going to get me to take
(01:13:20):
a supplement. All right, so that's weird. It's like it
forgot who I was for a second.
Speaker 1 (01:13:28):
Oh man, I really wish Aaron was here, but this
is definitely her fault. I got a suggested article: Golden Girls
creatives spill the tea on bitter feud between Betty White
and Bea Arthur. Whoa. Yeah. The quote at the top of
the article was, quote, those two couldn't warm up
to each other if they were cremated together. Wow. Wow.
(01:13:54):
Apparently this was something very recent. There was a... there
was some sold-out event, a panel, uh, in Hollywood,
related to some kind of Pride festival. One of
the panelists shared that Bea Arthur called Betty White the
C-word more than once. Quote: I remember my husband
and I went over to Bea's house a couple of
(01:14:14):
times for dinner. Within thirty seconds of walking in the
door, the C-word came out.
Speaker 2 (01:14:19):
Damn. God, I'm guessing they're not talking about charcuterie.
Speaker 1 (01:14:25):
No, that would be the octopus-shaped charcuterie platter.
That was definitely AI generated.
Speaker 2 (01:14:29):
Yes. How about... yeah, where is our local Golden Girls
expert when we need her?
Speaker 1 (01:14:34):
Yeah, where are you, Aaron? Not that you'll ever hear this.
The heroes are never there when you need them. Nope. Uh,
you got anything good? I'm not going to talk about,
like, current event shit. Fuck Trump and... and
stuff like that. There's a lot of that, so I'm avoiding
all that. Yeah, I get a lot of... uh, do
you get a lot of travel stuff, since you've been traveling? No?
(01:14:57):
Not that? No, not really? Well: the best countries to
live in, our top picks for twenty twenty five.
Speaker 2 (01:15:02):
Does that help? I get... I keep getting articles about
airlines reducing flights and canceling flights and suspending routes.
Speaker 1 (01:15:14):
Ooh.
Speaker 2 (01:15:14):
Ten early gaming decisions you pay for the rest of
the game.
Speaker 1 (01:15:19):
Do you ever worry about that?
Speaker 2 (01:15:20):
You start a game and like, I think I might
have made a bad decision?
Speaker 1 (01:15:24):
Yes, absolutely. Yeah. Oh, here's... apparently I forgot to
loot something in Oblivion and now I'm having a hard
time advancing the main quest. Well, since you mentioned video games:
three weeks in, Elden Ring Nightreign players are inventing
some creative ways to skip around the map. I need
(01:15:45):
to play this. I really need to play this. You
need to tell me if you're gonna play it with
me or not.
Speaker 2 (01:15:50):
Nightreign.
Speaker 1 (01:15:51):
Yes. Cannot confirm or deny. Ah, disappointing. Oh, God damn it.
All right, fine.
Speaker 2 (01:15:56):
Whatever. I gotta wait till... I gotta wait till I
get my hand fixed to play it.
Speaker 1 (01:16:00):
Okay. Well, I didn't know until an hour ago that
your hand had something weird on it, and now our
audience knows too. It's not my hand, it's my wrist.
Your wrist. Yeah, you know, you know, if anyone listening
out there, like, let's say Ben, was a big Tell 'Em
Steve-Dave fan, they'd probably have something to say about
you having a sudden, mysterious wrist cyst. It's not a cyst
(01:16:23):
like that, but we know how you get those kinds
of cysts on your wrists. So congratulations to Carrie,
that's all I'm gonna say. But good job, man. You're
a good husband. I like to think so. But yeah,
I think... I think you are. What do you got?
(01:16:43):
Apparently the Highlander movie reboot, which is going to star
Henry Cavill, has now cast, in an unknown role, Russell Crowe.
That is not what the headline said, but I already
clicked on it while we were talking. But that seems
to be the point. I mean, that's about it.
There's a lot of stuff about Wheel of Time and Silo,
like all these shows that I watch. Palworld is
(01:17:05):
getting a big update soon. More video game stuff: Fallout three,
blah blah blah. Yeah, I got a Fallout three something,
something about a new gen remaster. But yeah, I had
that same article, but it's not an official remaster;
there's probably an official one incoming. There's a lot of money
in that. And then, you know, depressing stuff about the
world heading into probably World War three and something... I
(01:17:28):
don't know, who knows. Yep. So I think I've had
enough of... wait. Like, if you've got a good one, fine,
I'm done with my phone. You tell me what.
Speaker 2 (01:17:41):
And the thing is, even the picture is blurry,
so I don't know if I can tell.
Speaker 1 (01:17:46):
Can you see what that picture is? Something about Christopher Reeve? Yeah...
I can't... Superman? Superman, okay. Is he bad,
or is it just talking about Superman three? That is... that is...
oh fuck.
Speaker 2 (01:18:02):
And we can't even say, well, there's a movie coming
out, because this is... this is Christopher Reeve's Superman. This
is, right, like, how many generations ago? To
my Superman, my childhood Superman.
Speaker 1 (01:18:13):
A lot of Supermans ago. Damn, man. The algorithm, boys. Yeah,
it wants to prove itself to us, as if we
had any doubt.
Speaker 2 (01:18:23):
One of chat GPT's popular uses just got skewered by Stanford researchers.
Oh: per a new study from Stanford computer science PhD student Jared Moore,
chat GPT should not replace therapists because of its dangerous
tendencies to express stigma, encourage delusions, and respond inappropriately in
critical moments.
Speaker 1 (01:18:41):
Oh man, what if chat GPT, if you were using
it as a therapist, starts trying to seduce you, like
the... like the worst kind of therapist. It's like, maybe...
maybe we should... maybe we should fuck. That would be...
that'd be something. Speaking of which, Rachel, you really... you
gotta... you gotta give us more information. So I hope
you're working on that right now as you're listening to
(01:19:02):
this. Should we wrap it there? I think we should
wrap it up. Yeah, we should wrap it there. Rachel,
get back to us. Movies: we got to watch some
Superman three. Someone watch it. If you don't get back
to us, we're going to try to watch Colossus: The
Forbin Project. Yes, well, we'll talk about this. Yes, yeah.
And much, much like the Palantir story, you're going to
get follow-up. Yes, right. Oh yeah, yes. Oh my god,
(01:19:26):
I will watch Colossus: The Forbin Project. And about the Palantir... you know,
it's too scary, it's too terrifying.
Speaker 2 (01:19:32):
It's such a bad idea, and we're all gonna get
fucked by this, and we've all already forgotten it because
the news cycle moves so fast.
Speaker 1 (01:19:38):
We're at the brink of war, so... brink. I like that,
very, very optimistic of you. So yes, look, if you...
if you want to get in touch with us for anything,
of course, you should send us a message at suggested articles
podcast at gmail dot com. A podcast... damn it. Okay?
And where... where, God, where else should they
(01:20:00):
go to find us? What's another way to get
more involved in the Church of the Algorithm?
It's, like, Patreon slash suggested articles... patreon dot com
slash suggested articles. A Patreon. Yes, there you go. So
keep in touch, people. I think next time we'll have
(01:20:22):
maybe both of us... maybe we'll be in
a different place, mindset, whatever. We'll see how things go.
Maybe not where you think. Maybe I will. Yeah, maybe.
But next episode should be interesting
one way or another. So, uh, stay tuned, come back,
tell your friends, it's going to be wild. And, uh,
(01:20:44):
just remember, because I think we've proven it: it's always watching.
All hail the algorithm.