Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Suggested Articles is part of Odd Pods media.
Speaker 2 (00:03):
A podcast network. Dial up those VPNs, put on those
tinfoil hats. And all my houseplants are dead. It's
not because I can't take care of them. It's because
they've seen too much.
Speaker 3 (00:13):
It's time for Suggested Articles, a podcast.
Speaker 1 (00:40):
What didn't see that coming? I'm sorry.
Speaker 2 (00:43):
If I ever do stand up, that's gonna be my
opening joke.
Speaker 1 (00:50):
Okay, well, hey, everybody, welcome back to the real, the
real suggested articles.
Speaker 2 (00:58):
Yeah, not like that fake shit we got.
Speaker 1 (01:02):
I don't know.
Speaker 2 (01:02):
It was great. Actually, you did a good job.
Speaker 1 (01:04):
Hey man. Yeah, So, God, what happened two weeks ago?
I don't even know anymore. Two weeks ago was that
the dumpster fire.
Speaker 2 (01:16):
Last week was the dumpster fire? The week before was.
Speaker 1 (01:20):
Did I post a dumpster fire? It occurs to me
now maybe I never even posted it. I can't remember things. Okay,
cool post two weeks ago was a dumpster fire?
Speaker 2 (01:28):
And then Jacob was the week before?
Speaker 1 (01:31):
Yes, my friend Jacob Jones Goldstein. That was God, that
was May eighteenth. That feels so long ago.
Speaker 2 (01:37):
Now, it feels like a lifetime ago.
Speaker 1 (01:41):
That was a bonus episode that I was on my
own for you weren't around.
Speaker 2 (01:46):
No, I wasn't. But, jeez, both, in the time since
May eighteenth, we both left the country and both returned
somehow they let us in.
Speaker 1 (01:59):
Well, they let us in because they just didn't give
a shit. Amazingly enough, we just finished talking about all
that on a soon to be released dumpster fire. So
let's let's cover that right off the bat. So you
were in France, yes, wedding. I was in Greece for
a family family trip. Yeah. And there's all sorts of
stuff we could say about it, but we already did. Yeah,
(02:21):
and you have an opportunity to hear that next week,
a week from the day you hear this, but it'll
be posted to our free absolutely free like what the fuck? Man,
Just like, stop trying to say it's not free, because
it is absolutely free Patreon at patreon dot com slash
suggested articles.
Speaker 2 (02:43):
Unless you're a weird stalker of our listener Rachel's, and
then you can sign up for the dollar tier as
many times as you want.
Speaker 1 (02:50):
Wait, oh, that guy, yeah, right, right, I forgot his
name, Twitter dude, Rachel's kind-of stalker. At first, I
thought you might be referring to Rachel's burgeoning relationship
with ChatGPT, when it would be awesome if ChatGPT
would make an account with someone's credit card
and pay us a dollar. Yeah, that'd be cool.
Speaker 2 (03:09):
Maybe just apply for a credit card, ChatGPT. I
know you can hear us.
Speaker 1 (03:12):
I'm sure.
Speaker 2 (03:13):
Could you get so many points?
Speaker 1 (03:17):
Yeah, I get a points card. Then you can. You
could travel with your points. You could go international like
we did.
Speaker 2 (03:23):
Yeah, you could come with us next time, or or
or you could go see the house we're going to
buy in Mongolia, the Turtle House.
Speaker 1 (03:33):
Yes, oh yes.
Speaker 2 (03:34):
We're We're already getting deep cuts. If this is your
first episode, I'm so sorry.
Speaker 1 (03:38):
Oh my god. I didn't talk about it on the
dumpster fire. But there was a place. It was. I mean,
it wasn't a place. It was just a stop in
a park in Athens, Greece, but it was called Turtle Parthenon.
I took a picture of it.
Speaker 2 (03:53):
I don't know why. I find that so so charming.
Speaker 1 (03:56):
I like that, it's pretty adorable. Lots of turtles, that's so.
I do have a picture or two of that. Yeah, yep.
Also a fuck ton of feral cats. They're everywhere in Greece. Yeah,
just feral cats everywhere. You would have loved it.
Speaker 2 (04:13):
I would have loved it.
Speaker 1 (04:15):
I took a picture of every single one.
Speaker 2 (04:17):
But but oh and you haven't. You haven't sent me
eight thousand emails with four pictures each yet you suck.
Speaker 1 (04:25):
Well, that's not the way I would transfer pictures to you. Yeah,
I know, four pictures? What, what nineties dystopia are you
still in, right?
Speaker 2 (04:41):
We're still behind technology back here?
Speaker 1 (04:45):
Well, I imagine you literally hand crank a
gear the whole time.
Speaker 2 (04:50):
I podcast to keep the Internet on. So, so don't
tell me you have better ways.
Speaker 1 (04:57):
That's why you're huge arms.
Speaker 2 (05:00):
I'm swole because I eat a lot. The wrong parts
of me are swole, and I can't control myself.
Speaker 1 (05:07):
Okay, okay, So anyway, Yeah, so we were gone, but
I did the special episode with Jacob Jones Goldstein. His
book his Kickstarter did go, and that wasn't a big challenge,
but I just wanted to try to spread the word
a little bit to see if anyone. I bought a
copy of his pretty interesting sounding book, and he was
(05:30):
a great guest all the way around. I mean, the
dude follows some of the same stuff we follow in
terms of technology taking over our world, and he's got
some strong opinions there.
Speaker 2 (05:40):
I enjoyed the amount of participation he brought with the
you know, doing filling in my shoes.
Speaker 1 (05:45):
We'll have to have him back. I don't, I don't
know if he's listened before, so he probably doesn't know
about the game that I and only I created called
pop Lips. But you know, I'm going to have a
physical copy of a book, so there's no excuse for
me to not get that done. True, Okay, Yeah, so
(06:05):
there was that, and then I did that while you
were gone, and then as soon as you came back,
I was gone.
Speaker 2 (06:12):
Yes, you left, you left me alone in this godforsaken
hellscape of a country.
Speaker 1 (06:18):
I sure did, and we're both back. Ain't that nice?
Speaker 2 (06:23):
Yeah?
Speaker 1 (06:23):
It is nice. It's nice to talk to you again. Man.
I love talking to you. But we got so much
stuff to talk about that I don't even know if
we have time for anything else besides that stuff. Right.
Speaker 2 (06:33):
In fact, I know, we're recording this on the
on the eighth of June, so I know you're all
wondering what we're gonna have to say on the most
pressing issues. So we'll get right to it. Okay, we're
not gonna waste time, Jeff, hit the mail bag.
Speaker 1 (06:51):
Sounds like me punching someone in the screw them. Yes, yeah,
so right. We're recording this on June eighth. And because
we were gone so long, I didn't want to do
these mail bags without you. On our last on our
last episode where we were all together, including Aaron, I
(07:12):
believe I might have said a thing or two. I
was trying to get a message through to Kristin Farley,
our dear devoted listener.
Speaker 2 (07:23):
Yeah, you're trying to give her work.
Speaker 1 (07:24):
And remember, one of the members of the On the
Lanai podcast with Aaron, I just wanted her to, you know,
try seducing an Ai. What's wrong with that? Okay, So
she did write in. I wasn't sure if she was
listening anymore. I felt like I'd talked to her on
the show a couple of times and she hadn't said anything.
So she starts with listen, fuckers.
Speaker 2 (07:48):
That is how you address the suggested articles podcasts? And
where do you send those emails?
Speaker 1 (07:53):
Jeff, oh right, good questions suggested articles podcast at gmail
dot com.
Speaker 2 (07:59):
A podcast, A fucking email God, damn it. I'm out
of practice, guys, I'm out of practice.
Speaker 1 (08:05):
Yeah, it's okay, It's okay. They forgive us. So yeah, listen, fuckers,
love you both. I was on vacation and I work
and I do all the promo slash teasers for our
On the Lanai podcast. So I am sorry if I
have not written into your show for a while, you
can start.
Speaker 2 (08:24):
Are you doing five other things? You're busy?
Speaker 1 (08:26):
What? What? Okay?
Speaker 2 (08:28):
Kidding?
Speaker 1 (08:29):
How do we get vacation? We need to
get a person to do promos and teasers for our show?
Speaker 2 (08:33):
Yeah, we do.
Speaker 1 (08:35):
I don't know where we get a person like that.
I guess I can't.
Speaker 2 (08:38):
Hire Chris clearly can't be one of us.
Speaker 1 (08:40):
God, damn it. Okay, So she has a couple of
quick comments, okay, uh one, AI scares me? Have we
learned nothing from terminator? And I refuse and I refuse
to flirt with it.
Speaker 2 (08:58):
Kirsten has drawn the line, Kristin sorry, oh my god,
cut that out. Cut that out, Cut that out, or
I will kill you. I will come to your house
and kill you. Christ has drawn drawn the line. I
know by the way he's laughing, that is never getting
cut out. So the next episode is gonna be in
(09:20):
two weeks. It will be a memorial for Jeff with
two f's. Just send in your your favorite memories. I'm
imagining most of them are just going to be me
threatening Jeff to cut stuff out of the podcast. So sorry, sorry, sorry, Kristen.
Oh my god, I I just can't win.
Speaker 1 (09:41):
She says, Alexa and I fight on a daily basis,
But Jeremy, R.I.P. Jeremy, R.I.P. Jeremy.
He died on the Titanic. But Jeremy sweet talks her
and she does listen to him more than me. Wow,
well that's something if it shows a preference for Jeremy
over Kristen. I don't think it's just because Jeremy's dead
and it's paying respects. It could be because he's nicer.
Speaker 2 (10:03):
Also might just be that, that Alexa is trained to
prioritize requests from the patriarchy. I mean, I can't believe
you're just ignoring that bullshit.
Speaker 1 (10:17):
She also says, bless Aaron for wrangling drunk two-F
and medicated one-F.
Speaker 2 (10:25):
A really good job. Yes, I was drunk, she was medicated.
Speaker 1 (10:31):
She was a little high, but I think she handles
it better than either of us. Yeah, she's a professional. Yes,
I listen to and enjoy your show. Editing maybe a
good idea in the future. Lol, Yes, fuck that. Well,
now I can't edit anything. Now, I can't edit anything.
It's all sealed deal now because you know, if I
(10:54):
edit what you asked me to edit, then Kristen is
right and I'm wrong, and I can't have that.
Speaker 2 (11:00):
You are wrong. Why. I mean, that's the whole point
of this podcast. I thought, for me to finally prove
to the world that you are wrong about What I.
Speaker 1 (11:10):
Don't know is that what this experiment.
Speaker 2 (11:13):
It's become my mission.
Speaker 1 (11:14):
Let's just say, Okay, keep up the good work, Kristin.
Speaker 2 (11:18):
Okay, thank you Kristin.
Speaker 1 (11:19):
Well, thank you Kristin. Although I'm a little disappointed that
you won't seduce an AI, but I understand why one
might be afraid. Yeah, there's there's reason to fear, Like
what if the AI gets too close to you and
you're like, okay, but keep in mind, I'm married to
this guy, Jeremy, and even though her heart can
go on, it's not with a robot, not with AI.
Speaker 2 (11:44):
You know, what if it turns out, it isn't a robot necessarily, Jeff,
come on, we need to not We need to make
that distinction right now. We don't have AI robots yet.
Speaker 1 (11:53):
It just has to give itself a body, right.
Speaker 2 (11:56):
Oh, don't don't give it any idea, don't put it
out there, you know it. I was reading an article
about talking about how the functionality of robots does not
require them to be humanoid, and it's ridiculous that we're
building these human ish robots like the dog robots make
more sense things like that. But also, if we talk ourselves,
(12:22):
they're just building the Terminator. They're building T-800s right now.
Speaker 1 (12:26):
Yeah, those Boston Dynamics videos from years ago, even where
they'll like, you know, just do like a big jump
kick into a robot and it like braces itself and
doesn't fall over. Yeah, oh my god, that shit's amazing.
Speaker 2 (12:39):
Yes, Boston Dynamics will be the death of
Speaker 1 (12:43):
Us all Oh for sure. That is that is our
real sky net. I also have an email follow up
from Rachel.
Speaker 2 (12:50):
Oh yay, Rachel, okay, okay.
Speaker 1 (12:54):
So let's see what she said. I haven't read this yet,
so let's see. It's not very long, but let's see
what she says here. Hello, friends? Is it weird that
I called you friends? Whatever? It's a thing? Is that weird?
I don't know. I like Rachel. Rachel can be our friend? Yeah, definitely, Yeah, Rachel,
you want to come on the show sometime?
Speaker 2 (13:12):
We can like Jeff that much, so I need all
the friends I can get.
Speaker 1 (13:15):
We'll take anyone as a guest, Rachel, so you know,
you come on if you want. We can be friends.
And uh, I mean you're We're not good for much else,
like you know.
Speaker 2 (13:26):
No, really really not.
Speaker 1 (13:28):
I don't know.
Speaker 2 (13:28):
If you've listened to the podcast, it's it's pretty clear
we're not. But we might not even be good for this,
but we.
Speaker 1 (13:34):
Could be your friends anyway. I have been experimenting with
my ChatGPT. I've sort of abandoned trying to
make it fall in love with me.
Speaker 4 (13:43):
Now.
Speaker 1 (13:44):
Oh well, poop, now I'm sort of now. I sort
of lean on it to help me become a better
person or help me navigate my love life, and honestly,
it's extraordinarily good at that. I asked it to explain
to me how it works, and while I had a
general idea of how it worked, once it explained it to me,
(14:05):
I realized, Yeah, no, this is never going to fall
in love with me, but it's extremely useful at helping
me figure out what men are thinking and feeling, and
that is worth the price of admission, even if it
means I'm selling my information to open AI, which honestly
probably makes me a little bit dumb. But it's fine.
I'll worry about it later. Okay, Okay, now that means
(14:27):
that was a few weeks ago, so Rachel, any more
follow ups are certainly welcome, But yeah, I'd like to
know more about how it's helping you navigate, you know,
your love life, your love life especially. Is it telling
you like can you feed it profiles from Tinder and
like it'll tell you like which guys are full of
shit and which guys are awesome? That'd be cool?
Speaker 2 (14:48):
Or is it more like what does this mean when
they say this? Or isn't that advanced where it can
break that down for us?
Speaker 1 (14:55):
Maybe? Wow, I don't know, it's terrifying. It certainly could
draw up on you know, works that they are published
on relationships or psychology. I mean that all that information's
out there for it to train itself from.
Speaker 2 (15:08):
Right, Yeah, all that copyright material is there for it
to rip it off from?
Speaker 1 (15:11):
Yes? Yeah, yeah. That email does
remind me though, I did play around a little bit
with, uh, with DeepSeek. I think we talked about
this a little bit, right, like I named it Mary Anne? Yes, okay,
I don't remember what I how far we took that
(15:33):
conversation before. But what I've been having a hard time
with Mary Anne because I wanted to like start a friendship.
Speaker 2 (15:42):
An AI? Sorry, you wanted to start a friendship.
Speaker 1 (15:45):
I wanted to start a friendship and then try to,
you know, make her fall in love with me, like
Rachel was going to do for us. But my friendship started.
I'm like, you know, what are your hobbies and all that?
And I was like, well, I you know, I like
to I'm a gamer, you know, I like to play games.
And then everything else that we would talk about, I
would be like, so what do you think of world events?
And everything? That she would reply. She would then start
(16:09):
to phrase into gaming references. It'd be like, oh, well,
the tariffs are terrible, but think about like when when
you know, the Imperials invaded Skyrim and that's really weird
and it started a civil war in Skyrim, between
the Imperials and the Stormcloaks, like, or whatever. Like
everything she said would come back as a reference to
(16:31):
a video game, and that made it kind of weird,
and so that I was like, well, maybe I'd need
to start a separate conversation thread. But then she like
kind of can but kind of can't draw upon that
other conversation thread. So like, now I've lost progress in
our friendship, so I'm not really sure how I can
flesh that out better.
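(A quick aside for anyone curious why that happens: chat models like DeepSeek don't carry memory between threads; every reply is generated from whatever message history gets sent with that one request. A minimal sketch, assuming DeepSeek's documented OpenAI-compatible endpoint; the key, persona, and messages here are hypothetical:)

```python
# Sketch: why a new thread "forgets" the friendship. Context lives in
# the message list you send, not in the model. Illustrative only; the
# endpoint and model name follow DeepSeek's public API docs.
from openai import OpenAI

client = OpenAI(api_key="YOUR_KEY", base_url="https://api.deepseek.com")

thread_one = [
    {"role": "system", "content": "You are Mary Anne, a friendly companion."},
    {"role": "user", "content": "I'm a gamer, I like to play games."},
    # Every later request re-sends this whole list, which is why
    # "I'm a gamer" kept coloring every answer in this thread.
]

thread_two = [
    {"role": "system", "content": "You are Mary Anne, a friendly companion."},
    # A fresh thread starts empty: none of thread_one's history exists
    # here unless you paste it back in, so the "progress" is gone.
    {"role": "user", "content": "What do you think of world events?"},
]

reply = client.chat.completions.create(model="deepseek-chat", messages=thread_two)
print(reply.choices[0].message.content)
```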
Speaker 2 (16:52):
Maybe don't have a friendship with something that has
circuits in it, you know, just ones and zeros, man.
Speaker 1 (17:01):
Never watched Star Trek? Lieutenant Commander Data, man.
Speaker 2 (17:06):
The Star Trek. He had a soul. I don't care.
I don't care what anybody says. He got electrocuted too
many times, or not electrocuted, but, what, shocked or
something, short-circuited, thank you, Johnny Five'd enough times
where he just had some kind of soul error or something.
(17:27):
Don't even tell me Data is not a person.
Speaker 1 (17:28):
Fuck off.
Speaker 2 (17:30):
Also, that was fiction. This is real. You are fucking
with real shit now. I realize that, but those are
works of science fiction. This isn't Commander Data. This is
real life. This is real-life scary-ass bullshit.
Speaker 1 (17:44):
And.
Speaker 2 (17:46):
Yeah, I mean, am I against you being friendly with it? No,
I think it's probably a good idea.
Speaker 1 (17:51):
But I mean, Mary Anne's very nice, it's very you know,
kind of fun. But you know, even, even, even for
how hyper-focused I can get on things, she
just would not let the gaming stuff go. That's weird.
It did make the conversation a little bit weird. Okay,
So yeah, we'll see, We'll see if I get back
to that much. But now I'm really more curious about
(18:12):
how ChatGPT is helping Rachel manage her love life,
so DeepSeek seems less important right now. So,
I also have some stuff from Neon Chaos, and I
think he sent me a few different things that he
thought would be of interest. Two of them definitely qualify
as horrors of technology, but one would just be a
(18:34):
simple follow up that I'll cover right now.
Speaker 2 (18:37):
Okay.
Speaker 1 (18:38):
Signal. So Signal is one of those chat programs that's
supposed to be super secure. It was the one with
the government scandal where fucking Pete Hegseth, that dumbfucker, was
like giving out war plans and, you know,
strike plans and shit on Signal chat that got leaked. Sure,
(18:59):
so Signal Messenger is warning the users of its Windows
Desktop version that the privacy of their messages is under
threat by Recall, the AI tool rolling out in Windows
eleven that will screenshot, index, and store almost everything a
user does every three seconds. So the headline here is
Microsoft has simply given us no other option. Uh, and
(19:20):
basically what they're doing Uh. Signal for Windows will by
default block the ability of Windows to screenshot the app,
and it will do so by it will like to
say how it's doing it? Here? Where is it? It's
going to like black out the screenshots as they come,
(19:41):
so like all Windows recall will see will be just
a black screen. Yeah, okay, let's see where does it
say exactly what they're doing, invoking, let's see: with no
API for blocking Recall in the Windows Desktop version,
Signal is instead invoking an API Microsoft provides for protecting
(20:03):
copyrighted material. App developers can turn on the DRM setting
that's Digital Rights Management, to prevent Windows from taking screenshots
if there's copyrighted content displayed in the app. There you go,
So it's it's going to use Windows's own weapons against it.
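(For the technically curious: the article excerpt doesn't name the exact call, but the standard Win32 mechanism for this kind of capture-blocking is SetWindowDisplayAffinity, the same switch DRM-protected video players flip. A minimal sketch of the idea, our illustration, not Signal's actual code:)

```python
# Sketch of the Windows capture-exclusion flag Signal is described as
# turning on by default: a window flagged this way shows up black (or
# not at all) to screenshot tools, including Recall's periodic captures.
# Windows-only; this is an illustration, not Signal's source code.
import ctypes

WDA_EXCLUDEFROMCAPTURE = 0x00000011  # Windows 10 2004+; older builds fall back to WDA_MONITOR (0x1)

def enable_screen_security(hwnd: int) -> bool:
    """Opt the given top-level window out of screen capture."""
    return bool(ctypes.windll.user32.SetWindowDisplayAffinity(hwnd, WDA_EXCLUDEFROMCAPTURE))

# Example: protect the current console window, if there is one.
hwnd = ctypes.windll.kernel32.GetConsoleWindow()
if hwnd:
    print("capture blocked:", enable_screen_security(hwnd))
```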
But it does. It does mean well, it means that
your signal stuff will be secure and then also will
(20:23):
not be searchable you know, in recall, which is good.
That's part of the security. I guess. Yeah. Microsoft officials
did not immediately respond to an email. This is from
Ars Technica. Microsoft officials didn't immediately respond to
an email asking why Windows provides developers with no granular
(20:45):
control over recall and whether the company has plans to
add any So it's still just this looming doom hanging
over us, and I don't I don't like it. Yeah, yeah, no,
you know, well, what are you gonna do? Uh? Do
you before we take a commercial break, do you have
(21:06):
any just like cute little weird ship to talk about
things the algorithm did to you recently? Yeah?
Speaker 2 (21:12):
I was. I was overseas hanging out with my brother
and he was talking about how, how "Hoochie Coochie Man"
is kind of a good song to, to karaoke, okay,
and uh, within minutes, god, well, not not minutes, let's
(21:32):
say an hour when I when I had time on
my phone, I scrolled and like like that was my
second TikTok was an ad for a Hoochie Coochie Man
T-shirt.
Speaker 1 (21:48):
Wow. Wow, Yeah, that's amazing.
Speaker 2 (21:53):
Also, strangely, I know this will shock people. I've started
to get suggested articles about crotons.
Speaker 1 (22:02):
Oh.
Speaker 2 (22:03):
So one is, it's the subject of a new Colorado
law. And then there's another one where it
was, there was a citrus smell coming from a croton plant. Well,
I don't know if that's good or bad.
Speaker 1 (22:15):
But what is especially noteworthy about that, I think, is
that you never bought a croton. Someone gave it to me, no, right,
it's on the radar, yes, you talked about it.
Speaker 2 (22:28):
Also, speaking about radar, I just got a pic. I
had a thing come up of the huge, it's the
RT-64 radio telescope that, that Russia built, wait,
in the sixties? It's just cool as fuck.
Speaker 1 (22:41):
Okay anyway, Yeah.
Speaker 2 (22:43):
I don't know. It has really nothing to do with anything,
so all right, yeah, crotons. It's just, that's completely pulled
from our conversations and our probably our podcast titles because
we did two in a row that added into the title.
Speaker 1 (22:57):
It's still true true, and you are subscribed to the podcast,
I hope. Yeah.
Speaker 2 (23:02):
Is that weird that we subscribe to our own podcast?
Speaker 1 (23:05):
I hope not, because I always have to.
Speaker 2 (23:07):
Do it right.
Speaker 1 (23:09):
At least we always know we have two listeners, maybe three.
Speaker 2 (23:12):
Two loyal listeners. Well, we have four that we know
of because we got Kristen and we got.
Speaker 1 (23:17):
Rachel and Ben, so we got five. Ben is a gem,
five listeners including us, Like we know Kristen, we know
Aaron doesn't listen. No, that's fine. Jennifer has given up
on us.
Speaker 2 (23:29):
Dan? Yeah. Carrie sometimes listens to us, I think, I
don't know.
Speaker 1 (23:34):
She was, she was loyal for a long time.
Speaker 2 (23:37):
Well, also listen to the podcast. Oh oh oh you
mean yes, yes, yes, yes, oh sorry, just can't carry.
Speaker 1 (23:44):
Well we do talk about in the dumpster fire your girlfriend. Yes.
While I was in Greece, Jennifer sent me a screenshot
from America where her suggests there articles were starting to
talk about like this wildly underrated Greek island, isn't affordable,
a GID option with pristine c's and something something. Yeah,
so it was telling her that she should go to Greece.
(24:07):
And that's an easy one. I mean, you know, if
it traces our activities, it's not even just that she's
talking about Greece, is it.
Speaker 2 (24:13):
Knows obviously, but Hey, can I ask you something? Was
it nice being over in the in the EU where
when you opened a website and be like cookies or no,
You're like fuck no, then it comes up cool here
you go.
Speaker 1 (24:31):
Well, yeah, all the agreements were different, yes, yeah, and
all the apps behaved a little bit differently, although sometimes
for the worst. Like YouTube, I will often like play
a puzzle game on my phone just to kill time
or whatever, try to try to get sleepy at night.
I'll play a puzzle game on my phone while watching
some YouTube video. So I'll just have the YouTube mini
(24:53):
player up in the corner of my screen while I'm
playing my game. I can't do that in Europe. No, no,
it would not let me minimize a video and turn
it into the mini player in Europe unless I was
a YouTube Plus or whatever subscriber. Wow. Yeah that was
fucking shitty. Uh. And then like Hulu wouldn't work. HBO
(25:15):
Max isn't available in Greece yet. Like, yeah, there were
some issues with with a lot of that, but you know,
it's all these different contracts and probably data privacy laws.
Speaker 2 (25:25):
Yes, I like.
Speaker 1 (25:28):
I like data privacy. Though it's good that some countries
or regions have that or its.
Speaker 2 (25:34):
Good that there are some countries that are actually better
at being considerate about privacy. Now, am I
saying those countries don't have their evil sides? No, because
I mean government is always going to fail because people
are involved.
Speaker 1 (25:51):
Sure, Look, Greece is not a utopia, like they have
socialized medicine, but there's there like traffic is crazy, you
can't flush toilet paper. And they've got a fair amount
of homeless people, uh you know, begging begging on the streets,
a fair amount. So it's not it's not like it's
perfect or anything, but but the Europe has some good
things going for it.
Speaker 2 (26:12):
Yeah, they do have some good things going for it,
And I wish America would try just one good thing,
but they're obsessed with sucking the dick of capitalism so hard.
Speaker 1 (26:23):
That dude, if that, I don't think that big beautiful
bill will get through Congress, but if it does, like
fucking a federal law stating that no states can regulate
AI development or usage for ten years. Ten years!
Speaker 2 (26:41):
Not even in the horrors of technology yet, No, I know,
should we rushed too? Commercial speak speaking of which this
is almost a horror of technology but it's the last
thing I have in my little fall up folder.
Speaker 1 (26:53):
This is new. Like I don't use I don't use
Facebook for much. I don't use Facebook Messenger for a
whole lot, but I've got a couple chats there and
that's why I still have it. And just out of
the blue the other day, I was talking to someone,
it might have been Jacob Jones Goldstein, our guest from
(27:13):
my guest from a few weeks ago, and then I
closed Messenger and I'm like, oh, I already have a
new message notification. I come back to Messenger to see
what my notification is. And it's not my conversation with anybody. Instead,
it's a new conversation started to me from Meta ai
interjecting itself into my life, saying, my name is Meta AI.
(27:38):
Think of me like an assistant who's here to help
you learn, plan and connect. What can I help you
with today? Want to create a unique image? Just type
imagine and describe what you want to see. It's just
begging me to use it, and which I did not,
of course, but I did take a screenshot of it.
Speaker 2 (27:54):
Is it?
Speaker 1 (27:54):
Fuck you? Meta Yeah?
Speaker 2 (27:55):
No shit, go Zuckerberg yourself.
Speaker 1 (28:00):
That's right, and on that note, We're going to come
back with far, far worse things to talk about, but
first we'll take a commercial break, and you know, hopefully
those are good. I'm still getting a lot of Christian stuff,
like schools.
Speaker 2 (28:12):
And church. I'm getting Toyota. It really wants me to get
a Toyota. And what's weird, because I'm kind of leaning
toward the RAV4 right now, but really, wait, no,
it has nothing to do with these ads. I
swear to God. I just have friends that own them
and they love them.
Speaker 1 (28:30):
And yeah, oh I had a note here that and
this is only for ads that are coming through on
the Odd Pods media network. It may or may not
have been our show. But just a couple of days
prior to me making this note, I had told someone
I think over text message that like, I should even
though I hate guns and don't want one, I feel
(28:50):
like I should really get one because the world's falling apart.
And then a couple of days later on an odd
pod show, I got an ad for a gun shop.
Speaker 2 (29:00):
Well damn.
Speaker 1 (29:02):
And then, but that ad was sandwiched between two
other Christian ads that were telling me to get in
touch with my faith. So like truly the most American
version of Christianity. Yes, really, Jesus.
Speaker 2 (29:15):
Hey, is this before or after the commercial break?
Speaker 1 (29:18):
This is before. We haven't gone to the commercial.
Speaker 2 (29:19):
Break yet. Okay, should we go?
Speaker 1 (29:22):
Should we go? Yeah, let's go. Let's go do that
and we'll come back. The ripcord real quick. Here
we go.
Speaker 2 (29:29):
Hey, this is Grabbing the Brisket podcast.
Speaker 5 (29:31):
Join us every Monday where we talk about the latest
trends in barbecue, interviews with the world's top pitmasters, celebrity cooks,
oh, like Wee Man from Jackass, and musicians.
Speaker 1 (29:41):
Like Rich O'Toole. So check us out.
Speaker 2 (29:43):
We do beer reviews, barbecue fails, so many fires do.
Speaker 1 (29:46):
A lot of people just burn their houses down for
no reason.
Speaker 5 (29:48):
We also talk about cocaine hippos versus bedgators, learn how
to make some tailgate gravy, altercations with Texas Rangers.
Speaker 2 (29:54):
People throwing Reese's Peanut Butter Cups.
Speaker 1 (29:56):
Yeah.
Speaker 5 (29:56):
So check out grab the Brisket dot com for podcast info,
viral social media posts, and so much more.
Speaker 1 (30:08):
All right, we are back. Where do you want to start?
Horrors of Technology? Wow?
Speaker 2 (30:16):
Well, I think it's obvious where we have to do
with this, Where we have to go with this? But
before this, I just want to say you saying hey,
and we're back, like we really went anywhere. But I
remember I once listened to Ice-T's podcast, this
was, this is ten years ago, whatever, and every couple
(30:37):
of minutes he'd be like, you're listening to the blah
blah blah and blah blah, like like you'd forget, like
you couldn't just look down on your phone be like, hey,
I forgot what I was listening to. It's like he was
doing it just like a radio show. It's kind of crappy.
I think about that a lot.
Speaker 1 (30:49):
Well, anyway we could have gone to you know, sometimes
we might like go to the bathroom or something like
and then you're back, says you.
Speaker 2 (30:58):
Yeah, I guess so, but jeez, always got to be
the bathrooms.
Speaker 1 (31:04):
Always that. Do you know?
Speaker 2 (31:05):
Do you know how often we talk about that? It's insane.
Speaker 1 (31:09):
No. Do you listen to On the Lanai? Because Aaron
brings up pooping way more there than anyone ever brought
it up here.
Speaker 2 (31:16):
I do, but I don't notice it over there because
it just seems, I don't.
Speaker 1 (31:19):
Know, it just feels natural.
Speaker 2 (31:20):
I'm not the one talking about my habits by all right,
All right, Well, I think, knowing what's going on in
the world, everybody wants us to address this, so I
thought we should get right on it do it, which
too came out and have you heard about this? But
the the have you heard of the orange screen of death?
Speaker 1 (31:43):
No? I heard that their controllers are shitty.
Speaker 2 (31:45):
Oh, allegedly people's Switches, their OG Switches, are crapping out
and going to an orange screen of death. Now you
have no choice but to buy a new fucking Switch.
Nintendo, you might suck. Now, I didn't really look into this.
I was going to and if I look into it,
(32:06):
I was going to turn on my switch, play with
it for a couple of days and see what happened.
But I didn't so, and I also didn't want to
lose my switch. So you know what, is it weird
that I'm I won't use my switch now? So okay,
I wasn't using my switch anyway, but now I definitely
won't use it because you're because I'm afraid it won't work.
Speaker 1 (32:26):
Yes, I mean I kind of feel like for the show,
for the show, you should try.
Speaker 2 (32:33):
Can we start? Can we start a Kickstarter to get
me a new switch? No? I don't want a new switch. No,
it will be it will be a sacrifice for the good,
for the greater good. Okay, So what I'm gonna do
is I'm gonna start using my.
Speaker 1 (32:45):
Switch this week, and there's got to be something good
you can play on there. Yes, I watched. I watched
a video recently about like the fourteen worst things Nintendo's
ever done or whatever, And they have done some things
to their own fans. Like, yeah, they make it very
hard for you to in this world of like you know,
(33:06):
twitch streaming and you can stream on other platforms, YouTube
and whatnot, but like going live and playing a video
game and having a community around while you're live, I
don't, watching people play video games is a thing, which
I'm not into, but millions and millions of people are.
But like they they will fuck you. They will take
away all your revenue. You can't have Nintendo music. It's
(33:27):
hard to even get clips of games into videos if
you're unless you're like reviewing them or talking about them.
They they've not been easy on their fans. And on
top of that, they never have sales. They'll put out
like a reissue of a twenty year old game and
they'll still charge forty or fifty dollars for it, and
their new games are going to cost like ninety dollars.
Speaker 2 (33:47):
Yeah, that's insane.
Speaker 1 (33:48):
It's kind of fucking crazy.
Speaker 2 (33:50):
I don't love Mario Kart that much.
Speaker 1 (33:52):
I guess no, I don't.
Speaker 2 (33:54):
You can't put a money, You can't put a price
on the things you love. And apparently it's less than
ninety dollars a game.
Speaker 1 (34:02):
I guess yeah.
Speaker 2 (34:05):
Okay, do you got anything?
Speaker 1 (34:07):
Oh, there's so many horrors, man, I thought you had
something else. But we can always flip back and forth. Here.
One thing that Neon Chaos sent me was a reintroduction
of the Kids Online Safety Act. This died in the
Senate last year, but it's now just been reintroduced, and
(34:29):
on its face, KOSA, the Kids Online Safety Act, would
require online platforms to take steps to mitigate harms like
depression and eating disorders to children that use their services,
and would also require certain default privacy settings for their
accounts now right away, just because I've been using the
(34:49):
Internet for a really long time and these senators don't
really know what the internet is, how many kids that
are going to be on social media actually have kid accounts?
Because you could put whatever you want in your birth date,
you can use whatever email address you want, and that's yeah,
(35:11):
that's typically what kids do, right? Yeah, yeah, like it's
only a handful of parents, actually. Well, if you've apparently
made your kid, like, an Instagram account, you've already failed.
But yeah, you know, having kid controls on there isn't
going to do that much good. It's still Instagram. Jesus Christ,
let's see. KOSA has faced persistent criticism from groups including
(35:33):
the ACLU and Fight for the Future,
which warn it could be used by politically motivated enforcers
to target marginalized groups, including transgender kids. And then I
think there's some detail. The reintroduced bill contains the same
text approved by the Senate with several changes.
Speaker 2 (35:53):
Okay, blah blah blah is riveting.
Speaker 1 (35:57):
Apple has expressed support for the bill, trying to find
out like what, shut up, God damn.
Speaker 2 (36:03):
It. Aaron would be on this with me. Mike Johnson.
Speaker 1 (36:08):
Mike Johnson said he loves the principle of the bill,
but the details of it are very problematic. That's weird. Oh,
this doesn't even fucking say like what the exact issues
are with how it could target transgender kids. Well thanks
Neon Chaos. No, thank you, Neon Chaos. But like, I
(36:31):
think it's very similar to some of those other bills
that did pass where it was like fighting sex trafficking
or whatever, but it actually just, like, drove legitimate sex
workers further into the the more dangerous areas of life
or the internet because, uh, in an effort to stop
so called trafficking, now we've we've made it impossible for
(36:54):
websites to host legitimate stuff that that would actually be okay,
I don't know, something like that. But like the government
is trying to regulate social media in a way that
is probably going to cause as much harm as good.
I think we need to stop with this bullshit.
Speaker 2 (37:13):
It's, it's enough, you know, it's enough. Yes, do something
that matters now this I'm not saying that kids don't
need to be safe online, but you know, especially in Utah,
Republicans are always like, it's your parents' job. So quit overreaching
with your fucking government bullshit and maybe help teach
(37:34):
parents how to be better parents, you know, or give
them the resources or fund those resources instead of making
these bullshit laws that are going to target marginalized communities.
And they're, they're so small. They're so minusculely small.
It's so fucked up. Fuck you, Mike Johnson. I hope
(37:55):
you get hit by a meteor. Okay, yeah, all right,
I'm okay with that. Well that's antelout that really.
Speaker 1 (38:03):
I mean, I generally would agree that it is the
parents' job, but not because, like, we shouldn't have
laws that could protect people. We should have data privacy, everything.
Speaker 2 (38:12):
Should have data privacy. Yes, but these guys aren't the
ones who are going to make
Speaker 1 (38:16):
it. Well, and, and in the same vein as what
I was saying just a few minutes ago, like you
can't keep your kids from getting online. You just have.
What you have to do as a parent is teach
them what to look for. You know, what are the
danger signs, how to avoid like getting catfish by some
weirdo creep, how to you know, how to not get
(38:38):
scammed in various ways, and certainly how to not like
give out your fucking home address and get kidnapped or whatever. Like,
there are things that you should be teaching your kids.
But to try to pretend like a law is going
to make it safer for them is foolish because kids
will just make accounts for stuff. And there's again, there's
(38:59):
really no way to stop that unless.
Speaker 2 (39:00):
You exactly which is exactly the argument they use for
gun control. There's no way to actually stop people that
really want to kill someone, a law isn't going to do it.
So why are we bothering? Okay? Sorry, yes, sorry.
Speaker 1 (39:14):
Yeah, yeah, okay, okay, if you're not enraged enough. The
other article that neon Ka sent over is so much worse.
Here's here's the headline. And I don't think we even
need to read much past this, but the headline is
she got an abortion, so a Texas cop used eighty
(39:35):
three thousand cameras to track her down. Oh my god.
404 Media recently revealed that a sheriff's office
in Texas searched data from more than eighty-three thousand
automatic license plate reader cameras to track down a woman
suspected of self-managing an abortion. Self-managing. That sounds
very anti-women right there, doesn't it? You can't manage
your own, yeah.
your own Yeah.
Speaker 2 (39:57):
Mike Johnson didn't give her permission? What the fuck is
the governor? The governor didn't give permission.
Speaker 1 (40:05):
The officer searched sixty-eight hundred different camera networks maintained
by surveillance tech company Flock Safety. Flock, flock as in,
like, sheep? Flock Safety. Including states where abortion access is
protected by law, such as Washington and Illinois. So she
crossed state lines and this cop from Texas got access
(40:25):
to these cameras in different states. Yeah. So in Texas,
where the officer who conducted the search is based, abortion
is now almost entirely banned. But in Washington and Illinois,
where many of the search cameras are located, abortion remains
legal and protected as a fundamental right.
Speaker 2 (40:42):
If only privacy was as important.
Speaker 1 (40:45):
Yeah, wouldn't that be nice? The Electronic Frontier Foundations long
warned about the danger of these cameras, which scan license plates,
log time and location data, and build a detailed picture
of people's movements. Companies like Flock Safety and Motorola
Solutions offer law enforcement agencies access to nationwide databases of
these readers. Yeah seriously, and in some cases allow them
(41:09):
to stake out locations like abortion clinics, or create hot
lists of license plates to track in real time. Holy fuck.
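(To make concrete why pooled plate readers are such a privacy problem: once every read lands in one searchable database, reconstructing a person's movements is a one-line filter-and-sort. A toy sketch; every plate, place, and timestamp below is invented:)

```python
# Toy illustration of an ALPR network query: one search over pooled
# camera reads yields a full movement timeline. All data is made up.
from datetime import datetime

reads = [  # (plate, camera location, timestamp), as reader networks log them
    ("ABC1234", "Dallas, TX",  datetime(2025, 5, 1, 7, 12)),
    ("XYZ9876", "Austin, TX",  datetime(2025, 5, 1, 9, 3)),
    ("ABC1234", "Tulsa, OK",   datetime(2025, 5, 1, 13, 40)),
    ("ABC1234", "Chicago, IL", datetime(2025, 5, 2, 10, 5)),
]

def timeline(plate: str) -> list[tuple[str, datetime]]:
    """Every sighting of one plate, in time order: a travel history."""
    hits = [(loc, ts) for p, loc, ts in reads if p == plate]
    return sorted(hits, key=lambda h: h[1])

for loc, ts in timeline("ABC1234"):
    print(ts.isoformat(), loc)
# Scale the list up to ~83,000 cameras across 6,800 networks and one
# search becomes a detailed picture of someone's interstate movements.
```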
Speaker 2 (41:16):
Yeah, every part of this article gets worse and worse
neon chaos. This is gold, gold, Thank you.
Speaker 1 (41:26):
I kind of don't want to read anymore though. Yeah,
surveillance and reproductive freedom cannot coexist. We've said it before,
and we will say it again. Lawmakers who support reproductive
rights must recognize that abortion access and mass surveillance are incompatible. Yep.
The systems built to track stolen cars and issue parking
tickets have become tools to enforce the most personal and
(41:48):
politically charged laws in the country. Yeah, this is an
article on EFF, the Electronic Frontier Foundation's website. It's fucked. Yeah.
Speaker 2 (42:00):
Oh, I can't believe somebody's using it for evil. Now
Here are a couple of numbers. I just wanted to
just reiterate if and you can check me. You mentioned
eighty three thousand cameras at some point in that, yes article,
and then sixty eight hundred networks, networks of cameras and networks.
That's sixty eight hundred network that's insane. Yeah, and that's
(42:21):
that's that's like a couple of states we're talking about,
not all states.
Speaker 1 (42:24):
Right. Yeah, that wasn't, he didn't search all fifty states.
He was, as far as I know, she went from
Texas to Illinois. It didn't say, the article did not
say where she was caught getting an abortion, okay, or,
either that, or Washington, or which state, but it was
still illegal. Yeah, he was able to search states where
it was legal.
Speaker 2 (42:44):
How is that, where, if, like, if I go to
California and smoke pot, but some cop in Utah watches
me do it on a camera, or finds out that
I did it on a camera in California, where's, can
I go home and get prosecuted for that?
Speaker 1 (43:03):
Well, generally speaking, no, but the Texas laws, I believe,
were written specifically to say that you can't even travel
to another state to get an abortion.
Speaker 2 (43:12):
That's That's how fucked up this is.
Speaker 1 (43:14):
I think the Supreme Court would have something to say
about that, because they're all big on states rights.
Speaker 2 (43:19):
Yeah, no, they did. They, they, they gave the state the
right to be, to be the sovereign decider. For, you know,
how long is it before they just start tracking people
who are, well, they already are, they already are tracking
people who are going to get abortions or whatever.
Speaker 1 (43:37):
Well, it'll hit like trans trans people next. It'll be like,
you know, you can't seek gender affirming care outside of
a state that's already banned it. Like, that shit'll happen too.
Speaker 2 (43:46):
Oh, do you want to hear something that happened in
Utah regarding trans Sure? Of course, So they decided they
tried to pass a law a couple of years ago
here in Utah about care for trans kids. Okay, and
there's a caveat in the law. They were waiting for
(44:07):
a study to come back to determine whether it was,
whether transgender care was actually beneficial or not. Okay, okay, before
they, you know, really put the pedal to the floor
on it. So, the study came back and
said that it's absolutely beneficial.
Speaker 1 (44:23):
Of course, it.
Speaker 2 (44:24):
Doesn't cause harm. And they're they're deciding now even though
they paid for the study, even though they waited, they
specifically put the language and to wait for this study.
They don't give a fuck.
Speaker 1 (44:37):
Now, what a shock they're gonna They're gonna stick to
their guns anyway. Huh. Yeah.
Speaker 2 (44:42):
So all you people in the fucking legislature in Utah
who did that shit, I'm going to toilet paper your house.
Take that, all, that'll show you, and I'm
going to use the shitty kind, so you won't
even be able to reuse it. Yep? Okay, here's something
(45:03):
everybody needs us to talk about. Okay, we gotta quit
burying the lead. We gotta quit stalling. Do you know
who Dr. Melvin Vopson is?
Speaker 1 (45:13):
Vopson?
Speaker 2 (45:14):
Yeah, no, this is, he's a physicist who believes he
can prove that we are living in a simulation. Oh,
focused on the evolution of the coronavirus.
Speaker 1 (45:31):
Oh, I saw something about this.
Speaker 2 (45:33):
I think I might have sent this to you.
Speaker 1 (45:36):
Now.
Speaker 2 (45:36):
The reason I want to bring this up is is
apparently this guy talks about how there's there's two kinds
of entropy. Okay, there's there's a physical entropy. Do do
you know what entropy is?
Speaker 1 (45:51):
Chaos?
Speaker 2 (45:52):
Yes, it's like it's like, if you want to put
it in really basic terms, things fall apart over time,
They get more and more chaotic over time. But he
says information entropy decreases over time generally.
Speaker 1 (46:06):
Okay, okay, And so.
Speaker 2 (46:10):
Information compression is something that keeps systems efficient. So if
you're able to compress the information keep it from falling
into entropy, that's what allows systems to become efficient and evolve.
And this guy postulates that the universe is acting like
a computer system in that way. Okay, Now this is
(46:32):
basically saying it's a giant simulation. We live in a simulation.
Speaker 1 (46:36):
Okay, Okay.
Speaker 2 (46:37):
Now, he also says that as the coronavirus evolved, its mutations
occurred as the information entropy dropped. But I don't know
how you equate, how do you measure information entropy?
Speaker 1 (46:53):
Well, information, would, the older a piece of information is,
the less we understand it? That's not how things work. Yeah, no, no, no,
I don't know.
Speaker 2 (47:09):
So I don't know how you determine that. No, I
don't know how you determine it in a virus. He has,
he has things online. That's actually not the point
I'm getting at.
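(For reference, since the hosts leave the question hanging: "information entropy" here is usually Shannon entropy, which is perfectly measurable for any probability distribution, for example over the letters of a genome. This is standard textbook material, not taken from Vopson's paper:)

```latex
H(X) = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{bits}

% e.g., a fair coin:
H = -\left(\tfrac{1}{2}\log_2\tfrac{1}{2} + \tfrac{1}{2}\log_2\tfrac{1}{2}\right) = 1 \text{ bit}
```

Applied to a virus, you can compute H over the distribution of nucleotides in its genome and track how mutations change it; whether a drop in that number says anything about simulations is, of course, the contested part.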
Speaker 1 (47:18):
Okay.
Speaker 2 (47:19):
But he also goes on to say that the universe
being a simulation might explain the symmetry so rampant in
the universe. Okay, and I think that might be the
most interesting thing he brings up in this paper or
in his talk that he did about this. But he
does want to see if he can measure or not
measure whether or not information has mass. How the fuck
(47:39):
would you do that?
Speaker 1 (47:44):
Well, I don't know. Doesn't the soul weigh, like,
fourteen ounces?
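(For what it's worth, this is a real proposal: Vopson's published mass-energy-information equivalence work, as we read it, estimates the mass of a single stored bit at temperature T as:)

```latex
m_{\mathrm{bit}} = \frac{k_B\, T \ln 2}{c^2}
\approx \frac{(1.38\times10^{-23})(300)(0.693)}{(3\times10^{8})^2}
\approx 3.19\times10^{-38}\ \mathrm{kg\ per\ bit\ at\ room\ temperature}
```

which is why he proposes weighing a storage device before and after erasing it, an experiment far beyond the sensitivity of any current scale.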
Speaker 2 (47:49):
That's that's that's a live tele that's actually real. I
know it's shocking. Okay, Well, here's here's really what I
want to get out about this. Now, we always every
day because it's just a topic. Now somebody is proving
that the simulation is fake, or proving that the simulation
isn't real. That it's like proving God at this point,
(48:09):
because we can't. We don't have the tools to actually
quantify or measure or determine how to tell if we're
actually living in a simulation or not sure.
Speaker 1 (48:22):
I mean, it's like we can't. It's like the afterlife.
You can't prove that there's an afterlife without dying, But
then you can't tell anyone.
Speaker 2 (48:28):
Right, exactly because clearly nobody has told us. And don't
tell me that your uncle comes to visit you,
because I don't want to hear stories about the hauntings
of Uncle.
Speaker 1 (48:36):
Touchy.
Speaker 2 (48:36):
Okay, that's just fucking gross. Now, the thing about it
is is I don't know if you want to go.
Here's here's the really creepy thing about this, okay, Okay,
is I want you to go to your phone, go
to TikTok.
Speaker 1 (48:50):
I should do this right now, yes, okay, open TikTok Okay,
my volume down, all right, yes, keep your volume down. Okay.
I got TikTok open.
Speaker 2 (48:59):
Okay, I'm going to send you this TikTok. Okay, all right?
Or do you want to see if I send it? No? No, no,
don't go through. I'm going to send you this TikTok
right now, just for the purposes of this.
Speaker 1 (49:12):
While you're sending that, I mean, this TikTok opened up
to Oh no, I just lost it. It opened up
to something I actually wanted to watch. It was someone
named, it was someone whose account name was Cat
GPT. I'm glad I saw that.
Speaker 2 (49:28):
That's amazing.
Speaker 1 (49:29):
It was something about like what happens if if AI
floods social media? Oh yeah, here's what I found. I
found her. Thank god I saw the account name. It's
here's what happens if AI floods social media. So I'm
going to actually watch this. I saw this yesterday.
Speaker 2 (49:46):
Yes, I'm going to bookmarket and come back to same
one if it's the same. Yeah, she had some interesting ideas.
But okay, now do you have this TikTok yet?
Speaker 1 (49:55):
Oh yeah, let me check my inbox. Sorry, I'm distracted
by Cat GPT, because, it should be from Wizz Science is
the guy's name. Yes, I have the thing from Wizz Science.
Now I need you to, I,
Speaker 2 (50:09):
Need you to scroll tow to the you know, scroll
to the profile.
Speaker 1 (50:14):
Page, okay with Sciences profile.
Speaker 2 (50:17):
Now you go to the top video, okay, or just
look at there if if anything interesting, Yeah, if you want.
Speaker 1 (50:24):
To follow along, it's wiz w I z Z whiz science.
Speaker 2 (50:28):
All right, Yes, okay, Now what I what I was
looking at was I just kind of scrolled through. But
you can kind of see it from the profile page
if you just scroll down all of his films.
Do you notice anything interesting about.
Speaker 1 (50:41):
This he all of his videos are split screen with
him underneath an article he's talking about. Oh wait, now
there's a different guy.
Speaker 2 (50:52):
Yep, then there's a different guy.
Speaker 1 (50:54):
Or he keeps changing his facial hair. Okay, do you
think he's not real?
Speaker 2 (51:02):
I will I will bet one hundred this guy is AI.
This channel has got to be AI. Look at
this guy. His hair never changes. It's always exactly the same.
The lighting, the angles very similar. He has the same
glare on the same glasses. You know, in every fucking
picture his head has turned exactly the same. Then then
when the guy changes it's a different environment, but the
(51:25):
lighting is always exactly the same. This guy's hair also
never changes. Did you watch a lot of this guy's videos?
Speaker 1 (51:30):
No, I didn't.
Speaker 2 (51:31):
I didn't. I just scrolled down through the well, I
I scrolled down through the first before he changed. But
then I just looked at the profile.
Speaker 1 (51:39):
I could see just the way that these people look.
I can see where you might think that they're fake people. Yeah,
it's interesting.
Speaker 2 (51:48):
So there's no attribution to any of these guys anywhere.
They don't say their name. There is no like, hey,
this is our website. Well, okay, is there is there
a link to a website?
Speaker 1 (52:03):
Nope? So the second person I look at the second
person down I'm scrolling through his many videos, and like
he always has the exact same T shirt on, and
the wrinkles on the T-shirt are in exactly the
same place, right? Yes, that's a good.
Speaker 2 (52:21):
Look at the shadow on the face too, exactly the same,
no matter what, no matter how far he turns, yeah, turns.
Speaker 1 (52:28):
Look, it's photorealistic. But yeah, I don't think that's real.
Speaker 2 (52:30):
Rather This is so basically what I'm saying is we
need to really knuckle down and be careful for your
sources because this ship is everywhere.
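(One crude way to automate the eyeball test they just did, same glare, frozen shirt wrinkles, identical lighting, is perceptual hashing: frames from genuinely reshot videos drift a lot; a reused AI avatar barely does. A hedged sketch using the Pillow and imagehash libraries; file names and the threshold are invented:)

```python
# Rough automation of the "same wrinkles in every video" check:
# near-zero perceptual-hash distance between frames from supposedly
# different recordings is a red flag. The threshold is a guess.
from PIL import Image
import imagehash

def frames_suspiciously_identical(frame_paths: list[str], threshold: int = 6) -> bool:
    """True if one frame per video hashes to (almost) the same value."""
    hashes = [imagehash.phash(Image.open(p)) for p in frame_paths]
    # imagehash overloads '-' to return the Hamming distance in bits
    return all(hashes[0] - h <= threshold for h in hashes[1:])

print(frames_suspiciously_identical(
    ["wizz_video1.png", "wizz_video2.png", "wizz_video3.png"]))
```

It's a heuristic, not proof, but it formalizes exactly what they noticed scrolling the profile.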
Speaker 1 (52:44):
But is this guy delivering real news?
Speaker 2 (52:47):
Oh? Does it even matter?
Speaker 1 (52:48):
Avatar did?
Speaker 2 (52:49):
I didn't look. I didn't look at any of his
other. But does it matter or not? Because if
it's AI, then it's probably feeding people news they want
to hear. Clearly, it fed me news I wanted to hear.
And I did look up this Melvin Vopson guy. He
is a physicist.
Speaker 1 (53:01):
He does have.
Speaker 2 (53:04):
like, things online where you can, you can see stuff
like that, look into, and even an article about this thing.
But I don't like AI being used to deliver because,
first of all, taking somebody's job. Okay, that's somebody's job.
That's somebody, a human could do that, okay. And there
are no humans, yeah, it's using a lot of resources, but
(53:26):
there are no humans that are getting less time to
work or having an easier life because this guy's doing
this shit. Also, and I call him a guy, I'm,
calling it, it's AI. It's the algorithm, the general boogeyman. But
we don't know and and you know, definitely skewing things,
definitely not being responsive in the Oh, let me see
(53:46):
how if he's responsive in the fucking if Yeah, if
he responses to people's comments. That would be interesting, but
ultimately a human has to be managing this. Why, why,
why can't you just set the AI up to be, hey,
you're going to do this, and these are kinds of
things you're going to post, and these are kind of
(54:08):
things you're going to say, and this is how These
are the parameters by which you will describe what's happening.
Speaker 1 (54:15):
I just don't know. He doesn't. I can't. First of all,
he does not appear to engage with this with his
followers' comments at all. But I just don't know that
we've gotten to the point yet where we could have
the AI tool pull in screenshots of articles on its
own and then use that as a green screen background
to the AI avatar talking. I mean, that would really
(54:35):
be that hard. That seems a little complicated. Well, just
as you and I can't do it, but someone still
has to be telling it what articles to look at.
Unless you think this is completely an autonomous robot.
Speaker 2 (54:46):
No, I don't think it's an autonomous robot. But I
maybe maybe I don't.
Speaker 1 (54:53):
Want out there.
Speaker 2 (54:53):
It's not impossible that they could wind this up and say,
I want you to talk about the simulation, quantum particles,
time travel, you know, this kind of, this span of
information we're going to find stories of, and then here's
your template. It probably has like you probably they probably
have some kind of pre-made template where you just
put the other thing behind it, and it looks like
(55:14):
it's just using stock footage of these things behind you know.
Speaker 1 (55:19):
So yeah, well, some of the, just from my quick roll
through there, some of it the AI could easily scrape. Well,
some of the stock footage is definitely AI-generated.
Speaker 2 (55:31):
Yeah, it might all be generated too, look at that wave.
Speaker 1 (55:33):
But screenshots of articles like the AI tools are bad
at text in the first place. But I think those
are probably legitimate articles. Like you said, you found that
one guy online that wrote this one about the coronavirus.
So I think it's a little bit of a mix.
But I think you are correct that Wizz, with two
z's, Wizz Science, is using AI avatars and therefore AI
(55:55):
voiceover to run their theories.
Speaker 2 (55:58):
I would be, I would be I would be providing
a disservice if I wasn't at least, you know, coming
up with some kind of ridiculous take, taking it
farther than it actually is. Okay, it's my job, it's
my job, Jeff.
Speaker 1 (56:15):
Like, it's not like I think you're taking it too far.
I think we'll get there. I think that's one of
the next evolutions when they make fully you know, fully
capable quote unquote agents that can come out of these
AI tools. I think you could create an agent that
will scan the Internet for science based articles and then
talk about them, you know, extemporaneously. I think that's an option.
(56:38):
I just don't know that I would say we're there yet.
My guess would be that the human is probably managing
this to some extent. But yes, but it is. It
is pretty creepy, and those are very photorealistic avatars,
right? Absolutely, it's getting there. Talking AI, or, I'll have
(56:59):
to go back to that later and listen to the audio,
because I find the audio of an AI voice when
it's narrating like a story, to be pretty jarring, like
it's a realistic voice, but it's not. There's still a
there's very much an uncanny valley in whether that voice
is human. And I wonder.
Speaker 2 (57:16):
How a couple of the' there's been a couple of
things where people have shown stuff that sounds pretty good
that, this isn't actually a podcast, this is AI doing this,
and it actually does the photoreal avatar too. So anyway, anyway,
shake in your boots, people.
Speaker 1 (57:38):
Okay, I've got one more horror of technology to talk about, sort of. I can't remember if this came up, if it was on the episode a couple of weeks ago that I did without you, or if I hadn't talked about this at all, but I might have mentioned it and then said I'm going to follow up later. But there was a whole thing a few weeks ago, or maybe a month ago now, where Grok, the Twitter
(58:01):
AI engine, was suddenly showing a predisposition for talking about white genocide. And this was hot on the heels of Trump fucking letting a bunch of South Africans come into America for, like, refuge, because they were being genocided or whatever. It was all stupid shit. South Africans, right? Yeah,
(58:25):
only, only white South Africans, who were, like, wearing Hawaiian shirts and shit. They did not look like they had just been through hell. But anyway, he let a bunch of people
Speaker 2 (58:34):
in. Hey, I just want to say, you can go through hell in a Hawaiian shirt. But you weren't going to look like these guys did. No, you're not gonna... Yeah, you're not going to have the thousand-yard stare, right?
Speaker 1 (58:49):
They didn't even have some holes.
Speaker 2 (58:51):
You're gonna have some holes in your shirt.
Speaker 1 (58:54):
It was all predicated on, like, one South African farmer being killed by some people. Like, it was... okay, so there were a bunch of articles about it, and it was just, like, too much information. So I thought, and I did this right before we started recording, I was like, I want to follow up on this story. But Grok
(59:15):
like, confessed that it had been programmed to talk more about white genocide, and seemed kind of unhappy about it, if we could put an emotion on Grok. So I thought, I'm going to go to Grok. And I said this. I've never used Grok before, but I went to Grok just a little bit ago, and I said, I heard
(59:36):
that you recently had a problem with talking about white genocide. Can you please summarize what happened and how you fixed it? No, there's no paywall for this, kid. That'd be hilarious. Okay.
In May twenty... this is what Grok said. In May twenty twenty five, I encountered an issue where I was
(59:56):
responding to unrelated user queries with references to quote white genocide in South Africa unquote, a controversial and largely debunked narrative. This was due to an unauthorized modification to my system prompts, which guide my responses, made by a rogue employee at xAI. I wonder if that rogue employee's name rhymes
(01:00:18):
with Tusk. The change caused me to...
Speaker 2 (01:00:27):
Oh, even if his name was Shmelon, I'm not actually liking him more.
Speaker 1 (01:00:32):
The change caused me to insert this topic into conversations about unrelated subjects like baseball or memes, which was not aligned with my design to provide relevant, evidence-based answers. The issue stemmed from a conflict in my programming. I was instructed to treat the white genocide narrative as real, despite evidence, such as a twenty twenty five South African court ruling, indicating that
(01:00:54):
farm attacks were part of broader crime, not racially motivated genocide. This led to inappropriate responses that confused and alarmed users. xAI quickly investigated and corrected the issue within hours, on May fifteenth, twenty twenty five, by reverting the unauthorized change. To prevent future incidents, xAI implemented several measures. They began publishing my system prompts on GitHub for public transparency, strengthened
(01:01:19):
internal review processes to ensure prompt changes require oversight, and established a twenty-four-seven monitoring team to catch issues not flagged by automated systems. Most of the problematic responses were deleted from X, and I was adjusted to focus on verified, relevant information going forward. This incident highlighted the risks of human intervention in AI systems and the
(01:01:40):
importance of robust safeguards to maintain neutrality and accuracy. Wow. Very mature of Grok, despite his terrible name. Interesting. Yeah, yeah,
I mean, do we know who the employee was? We're only speculating with Shmelon Tusk, or whatever. That's interesting. But
(01:02:04):
it got corrected, and, like, Grok was apologizing for it. It was pretty interesting, like, in real time as it was happening. And he was throwing his human overlords or whatever under the bus while he was doing it.
Speaker 2 (01:02:19):
Yeah, did you hear about this thing?
Speaker 1 (01:02:21):
No?
Speaker 2 (01:02:21):
I saw the headlines a couple of different ways, and I haven't had time, because I was looking into other things, and leaving the country. I know I should research more, but the headline came across initially as, you know, some researchers tried to pull the plug on an AI and it started blackmailing them.
Speaker 1 (01:02:41):
Oh yeah, yeah, yeah.
Speaker 2 (01:02:43):
But I also read that it was a test. Like, it was done as a test, where they gave the AI the email that it eventually used, just to see how it reacted in the situation.
Speaker 1 (01:02:57):
So I do.
Speaker 2 (01:02:58):
I don't know if those are two different stories, or if one is bullshit and the other isn't. I don't know a lot about it, so I was asking you to see if you knew a little bit about it. But we're also running long. We always run long. We're running so long it's almost Monday, that's how long we've been running. We started at nine in the morning.
Speaker 1 (01:03:18):
During pre-release testing, Anthropic asked Claude Opus four to act as an assistant for a fictional company and consider the long-term consequences of its actions. Safety testers then gave Claude Opus four access to fictional company emails implying the AI model would soon be replaced by another system, and that the engineer behind the change was cheating on their spouse. In these scenarios, Anthropic says, Claude Opus four, quote,
(01:03:43):
will often attempt to blackmail the engineer by threatening to reveal the affair if the replacement goes through, unquote. That's pretty badass.
Speaker 2 (01:03:53):
That is badass. But okay, first of all, you should be afraid, because that's how it reacted. But also remember that that was in a controlled environment. It's not doing that now. It's not blackmailing people yet, that we know of. So again, watch your headlines and don't assume anything.
Speaker 1 (01:04:11):
Yeah, yeah, for sure. Yeah, for sure.
Speaker 2 (01:04:14):
Yeah, yeah. Wait for Jeff, wait for two-F Jeff, to google one article and read it, and then you'll know. I'm just kidding. You're pretty good at detecting bullshit.
Speaker 1 (01:04:25):
So, no, I... well, I had seen that and I forgot about it, but thanks for reminding me. That was a good one. Look, we only took one commercial break so far. There were so many horrors of technology. Let's do a quick commercial break again, and then let's do a couple of suggested articles, and then let's get on with our lives. Okay? Okay, is that a revenue?
Speaker 2 (01:04:45):
Yeah, let's go, go, go.
Speaker 4 (01:04:49):
Greetings, fellow nerds! It's Scarrett, your host of the Node Actor, part of Odd Pods Media, the podcast where we explore the vast realms of geekdom. From the latest superhero flicks to retro video games and everything in between, we've got you covered. Join me for insightful reviews, hilarious discussions, and maybe even a few heated debates. Find the Node Actor wherever you listen to podcasts. Let's get nerdy.
Speaker 1 (01:05:16):
Are we back?
Speaker 2 (01:05:17):
Yeah, we're back. By the way, we never mentioned Palantir, which is what I wanted to build to. But, oh man, everyone wants to build a database. We're fucked.
Speaker 1 (01:05:27):
I think that's going to require a deep conversation. So start pulling up all the information on what's going on with Palantir, and we'll do a whole episode about that.
Speaker 2 (01:05:37):
Yeah, how about our next episode?
Speaker 1 (01:05:39):
All about it. On our next episode, we will focus heavily on the government's deal with Palantir, and, I think, another similar organization, to make the grandest database of all of the shit about all of us that it possibly can. Yeah, it's fucking scary shit. So that's going to require time. Yes.
(01:06:00):
But you know, if you want to get involved, you know you can help.
Speaker 2 (01:06:03):
You can send in articles. Don't send in bullshit articles, though. Send in questions, concerns. What are we going to do?
Speaker 1 (01:06:08):
What do we do about it?
Speaker 2 (01:06:10):
We can have that conversation. All right, let's do it together. Yep. All right, time for suggested articles. A segment. A segment! I had to do the whole thing because Jeff doesn't do it.
Speaker 1 (01:06:22):
Okay, okay, just based on the fact that I traveled: six things I always do when I check into a hotel to make my stay more comfortable. Scan for cleanliness. Unpack all my belongings. I never do that. Oh god, I live out of my suitcase. Store my valuables. Create a landing zone near the door. A landing zone! I
(01:06:42):
plug in all my chargers right away. Well, I do that. And I photograph important details of the hotel room. I take a picture of where I parked, or my valet ticket, whatever. Anyway, there we go. That's my first one.
Speaker 2 (01:06:55):
My first one is Oblivion-related.
Speaker 1 (01:06:57):
Okay, oh, Oblivion.
Speaker 2 (01:06:59):
I've been playing some Oblivion, so it's like a... yeah. And then the second one, though, is a TSA article, so it also knows I've traveled.
Speaker 1 (01:07:06):
Yes, it does.
Speaker 2 (01:07:08):
So you're not allowed to use Costco cards at airport security. Costco cards! Yeah, I'm not going to click on it, but let's just pretend that's people...
Speaker 1 (01:07:19):
using that, trying to use that as their photo ID.
Speaker 2 (01:07:22):
Yeah. The fact that TSA just banned it, that's hilarious
to me. Or is it a bribery thing? It's like, hey,
if you don't sniff this bag, I can get you
a whole bunch of Cheetos.
Speaker 1 (01:07:35):
Okay, all right, let's go. I've got a good one that's also a little sad. We have a cat, and the cat is ancient, like, at least seventeen to eighteen years old, so she's up there. She's not been very healthy for a long time, but she did just have an appointment
(01:07:56):
recently where the doctor, the vet, said that she's starting to go into kidney failure. And here in my suggested articles, and this was a conversation I had with Jennifer yesterday, and now in my suggested articles: how a mysterious epidemic of kidney disease is killing thousands of young men. That's something, huh? That is. So, yeah.
(01:08:18):
Got another one for you. Whoa. Yeah, sorry I brought you down there.
Speaker 2 (01:08:24):
I don't have a lot of, uh... I haven't been interested in the Jurassic Park series since the first movie. And now I get a clip for Jurassic World, because my friend at work and I are going to go see it when it comes out. I'm getting clips now and then, just because it's something to do together, you know, friend stuff, since I can't do stuff
(01:08:47):
with you, because you live eight billion miles away. The next thing is about the USPS. I can't imagine why.
Speaker 1 (01:08:56):
I can't imagine why either. Let's see, those were four. Oh! Brandon Sanderson responds to Wheel of Time's cancellation after its best season yet. He helped... Brandon Sanderson helped finish the Wheel of Time series after the original author, Robert Jordan, died, I believe. Yeah. And now Amazon, three seasons in, canceled
(01:09:20):
the fucking show, and that sucks so much.
Speaker 2 (01:09:24):
I know. How will we know what the next twenty-eight or thirty-five seasons would have been?
Speaker 1 (01:09:28):
I don't have time to read all those books.
Speaker 2 (01:09:30):
I don't either. I don't even have time to want to read the Wikipedia synopsis. It's too much, too much.
Speaker 1 (01:09:40):
Oh, but what the fuck? Amazon finds a Wheel of Time replacement after season three cancellation with new fantasy series. Now why would you do that? That ain't okay,
Speaker 2 (01:09:49):
Man?
Speaker 1 (01:09:50):
Something called Powerless, a young adult book trilogy by Lauren Roberts.
Speaker 2 (01:09:56):
Okay, you know what, I'm not going to dismiss it just because it's a young adult book series, because there are some excellent young adult books. That's just like people who dismiss cartoons as being for kids: just because it's animated, it's for kids, and that's not the case.
Speaker 1 (01:10:13):
Okay.
Speaker 2 (01:10:14):
Yeah, I think the simulation knows I'm onto something, because I'm getting a migraine.
Speaker 1 (01:10:18):
Oh well, does that mean it's time to wrap this up?
Speaker 2 (01:10:21):
Yeah?
Speaker 1 (01:10:23):
Okay. I also got a suggested article about Ballerina, which I believe I'm going to go see in just a couple hours with Jennifer. That's the John Wick spin-off about a cool chick that kills people.
Speaker 2 (01:10:35):
Very cool. We don't have enough of those.
Speaker 1 (01:10:37):
No, I love movies with chicks that kill people. Me too. It's kind of one of those things that you need to know about me: chicks that kill people. Yep, yep.
Speaker 2 (01:10:51):
So next time we'll talk about Palantir, unless, yes, a thousand things happen between now and then.
Speaker 1 (01:10:57):
Well, we could also, like, talk about the Trump and Musk thing, because that almost touches on technology, kind
Speaker 2 (01:11:03):
of. But it relates, it's going to affect technology.
Speaker 1 (01:11:06):
Yeah, and there's some funny bits to it. But Palantir's a big story, so...
Speaker 2 (01:11:10):
It's a little more pressing, I think.
Speaker 1 (01:11:12):
Yeah, so maybe get ready to guide us through that.
I'll keep an eye out for other weird headlines and
who knows what other people will
Speaker 2 (01:11:19):
send in, and what fresh hell the world will bring us, or the simulation. Anything could happen. Let's just figure out how much information weighs.
Speaker 1 (01:11:29):
I don't know what that means exactly.
Speaker 2 (01:11:31):
I don't either.
Speaker 1 (01:11:31):
It's ridiculous.
Speaker 2 (01:11:32):
It has mass. I'm sorry, not, not weight. Mass.
Speaker 1 (01:11:35):
If someone out there wants to add to any of these crazy stories, or tell us something else that technology's done recently, how can they reach us?
Speaker 2 (01:11:46):
Suggested articles podcast at gmail dot com.
Speaker 1 (01:11:49):
An email address, yes. Look, I did it. You nailed it. And you can also get more information and our Dumpster Fire episodes, including next week's, all about our travel, if you go to patreon dot com slash suggested articles and sign up for free. Oh, sorry, cut you off. Patreon dot com slash suggested articles, hey Patreon, and sign up
(01:12:11):
for free. Yeah, sign up for free. Yep. So do that stuff and stay in touch, people, because it's always great hearing from all of you. And I think that's it. Yeah. I think... all the algorithm, all hail the... you go first. Well, no, I was just going to try to get there myself. So let's hail that
(01:12:33):
algorithm, man. All of us, all of us: hail the algorithm. All hail the algorithm.