
February 5, 2024 114 mins

In this episode, Whitney is joined by Unlimited Hangout assistant and podcast producer Star to discuss key aspects of the AI "revolution," including its short- and long-term effects and whether it is possible to use AI without succumbing to its negative impacts.

Show notes
Originally published 02/01/24.
Get early access to podcasts by becoming an Unlimited Hangout member.


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
WW (00:19):
Welcome back to Unlimited Hangout. I'm your host, Whitney Webb. The AI revolution has become more mainstream than ever, and the rise of generative AI is having an obvious and pronounced impact on human employment, creativity, socialization, and much more. Often framed as helping and assisting humanity into a utopia of enhancement and increased

(00:40):
equality, the impacts of AI are going far beyond that and stand to transform not just the economy and society, but ourselves. AI is being rolled out at breakneck speed across nearly every sector imaginable, along with other emerging technologies, creating a surveillance grid that logs and analyzes every supply chain, every keystroke, and every transaction. Its proponents say

(01:01):
it will tackle illicit activity and inefficiency. But is it even possible to harness AI for its positive use cases without succumbing to its more negative impacts, especially considering that AI is largely being programmed and maintained by our Silicon Valley overlords and their partners in the military and intelligence communities? Joining me to discuss this and

(01:22):
more is Star, Unlimited Hangout's podcast producer and assistant, who has a lot of interesting perspectives on AI that I definitely think are worth sharing. So hey, Star, how's it going? Hi, good, thanks, how are you? Oh, you know, doing swimmingly, here to talk about one of the topics I get asked about the most. And obviously there's a lot happening with it. And as I said just a second ago, pretty

(01:45):
much every sector is having some sort of disruption, quote unquote, caused by AI. The media, the field where we work, is also one of these sectors that's, you know, in the news. Even mainstream news, especially, is talking a lot about the impacts of AI on media. But it's also affecting alternative media quite a bit, I

(02:07):
think, as well. So, you know, like I said earlier, a lot of the narratives that we're fed about, like, the promise of AI, that it's gonna, you know,
reduce tedious work, that people won't have to do tedious work anymore, sort of frame, you know, the emergence of AI as leading us to this sort of utopia. Well, maybe there's been sort of

(02:32):
an ability for people to, you know, not have to do as much tedious work, I guess, in writing, through, like, ChatGPT, or generative AI maybe producing thumbnail art for media, and whatever. That can all be done in a few seconds with AI now. But, you know, one of the consequences of this that we've seen recently are these, you know, these pretty

(02:54):
big mass firings at legacy media that some people in alternative media are cheering on. But I don't really think it's necessarily something to cheer on. Because essentially, what you have is, you know, if you view mainstream media as essentially stenographers of the state, you have these companies firing these mainstream media people and replacing them with generative AI, meaning that it's a more

(03:16):
effective stenographer, and they can produce more and more content, and they don't necessarily need people to do it. So I don't think that's necessarily a big win for independent media. It's not like these sites or these legacy media institutions are going to be producing less content. But, I mean, yeah, there's people that are cheering it on because they, like, hate

(03:38):
mainstream media and whatever. But unfortunately, the dynamics that a lot of us in independent media used to be really against have been adopted by a pretty decent amount of people in independent media these days, and it's, um, it's pretty unfortunate. But, I mean, I'm sure some people in independent media are using

(03:58):
Well, I know they are, a lot of this generative AI stuff. And, you know, frankly, that kind of concerns me, because as soon as ChatGPT was sort of, like, brought out and popularized, you know, they were saying that generative AI is going to be, like, 90% of all content by 2025. So that's, like, a year from now.

(04:19):
And I don't think that's necessarily a good thing, right? I don't know.
Well, and it's not also just, you know, mainstream media that's going to be using it. There's plenty of shows that independent media doesn't like that still talk about the same type of stuff that they do, you know, like, along those same themes. So it's, like, it's not like it's not going to happen to

(04:41):
you just because you think you're some kind of truth teller or something like that. It's going to happen, like, everywhere. It's just they don't need as many people to do the work.
Well, I also think a lot of this, like, the coming AI dominance in media, is going to have a lot

(05:01):
of impacts on the censorship agenda, which is going to have a huge impact on independent media, obviously. Because the goal, well, I talked about this in a recent interview with Catherine Austin Fitts, and it's unfortunately paywalled, just because of how she runs her site. But, um, you know, in there, a lot of what I talked about was this Henry Kissinger and Schmidt book on AI. And they essentially lay out that the

(05:24):
goal is to have generative AI, you know, produce all the messaging, whether it's about news, or political messaging, or really messaging about anything, just online content, period, and then have that be curated by AI. So, like, AI is censoring out the stuff that doesn't fit, you know. And they

(05:48):
want, you know, essentially, as they, you know, Kissinger and Schmidt, lay it all out, for it to be essentially AI-produced and, like, on top of that, managed by AI. Anything that's, like, written by people, that's not written by AI, is going to, like, stick out to the AI, you know, and be easier to censor, which is not good. So, you know, I understand that

(06:08):
people like, like, some of the utility of it, and there's some convenience, I'm sure, to being able to produce a wall of text in, like, three seconds with this thing. But I think it's a bit complicated, too. And I also wonder a lot about, you know, ChatGPT specifically. You know, I've never used it, but as I understand it, you have to have,

(06:29):
like, an account. And so every question you ask it, it logs, and I'm sure sends back to the Sam Altman mothership, to see what people are asking.

SP (06:39):
And it keeps your history, so that can be subpoenaed. Oh, wow. Yeah. Yeah. I mean, it keeps your history. And, you know, I've heard, I've seen people talking about that. So I always delete everything I ask it. But you can use other ones besides ChatGPT, too, which is something that people, you know, might not think of.

WW (06:57):
Yeah, but I'm just sort of thinking about how, when these things that are novel get rolled out, people use them without thinking about how they're going to use your data against you. So, like, you know, Facebook or something. When that first came out, people were like, oh, yeah, I'm gonna, like, ping my location and tell it exactly where I am, and I'm gonna, like, yes, link it to this and that, and I'm gonna

(07:17):
post, like, all my pictures and all this stuff, you know. And then, oh, it turns out Facebook has all these weird connections to people like Peter Thiel, and DARPA, and whatever. Maybe we, like, shouldn't give them our data, you know.

SP (07:30):
People don't even remember what they've given. I remember when I was young, I mean, I'm a lot older than you are, but I went to college in, like, the 90s, when computers were first coming around. And when I first went to college, I was using the computer all the time. And there was this guy that went to my school who was super, like, he was kind of like the Unabomber. He was really, like, weird and paranoid. And,

(07:51):
you know, it just kind of reminded me of that guy. And I always thought about how this guy was kind of right for being so paranoid, you know. And I kind of was paranoid, too, like, I'm not going to, you know, leave my trails everywhere. And I thought about that from the time I first started using the Internet back in, like, you know, the 90s. And there are some people who, just from the time they started using the computer,

(08:15):
just, you know, would tell everybody everything. Yeah, you know, this is where I live, these are the things that I like, you know. And all that stuff is still out there. And they don't remember what they've told, but they've pretty much told everything. Well,

WW (08:27):
I think, like, you get a computer, you think of it as yours. And it's like, oh, this is where I'm putting all my stuff. You're not thinking about, like, other people accessing it. Even though, well, intelligence agencies have funded a ton of these Silicon Valley companies that dominate everything. They've been, you know, not super open about it, but it's been reported on and documented that they can get into pretty much everything. So

(08:52):
it's not really, you know, yours as much as you might think it is. Anyway, going back to, like, ChatGPT and this other stuff. You know, people are asking it, I'm sure, all sorts of stuff. And then you have this sort of advent of, like, AI, quote unquote, people. You know, AI girlfriends, AI therapists, AI, I don't know, everything in that sort of

(09:13):
strain, I guess. And presumably that's logging all of your interactions, too, and sending it somewhere. Disturbing, anyway. But what really concerns me too about AI, well, so, like, the end goal of this for the powers that be, if you believe people like Henry Kissinger and Eric Schmidt, is to have basically everything we interact with in terms of information

(09:36):
online be produced and curated by AI. And, you know, what's interesting, too, is there's been this narrative seeded about bad people using AI that way, you know, like, AI for disinformation, and ISIS recruiting people with AI and stuff, and all of these narratives. So it seems likely

(10:01):
that, you know, I think we'll see more of that narrative. And as it progresses, they'll try and make it so that, like, you know, only certain people are allowed to have a ChatGPT account and ask it stuff. You know what I mean? Yeah, but

SP (10:12):
there's so many of them. I mean, there's that one, I think I sent you this link, that uncensored.ai. You can't actually get an account on it; I think there's, like, a waiting list or something like that. But, I mean, anybody can make one, you know? I mean, there's so many of them. So I don't think they can prevent people from using them. Because, yeah, I mean, maybe

(10:33):
they can prevent them from using, like, the big ones. But, I mean, even Mark Zuckerberg, like, last week, you know, his language model Llama, he's, like, saying he wants to make it open source, you know, because I think he knows that that's what people want.

WW (10:47):
Yeah, well, some people like to assume open source means, like, yeah, free of bad, nefarious code. And that's not necessarily true. Open source just means it's available, and people have to go in and audit the code. And if you don't audit the code, you know, whatever. You know, but, um, you know, I've talked a lot about the push, the coming push,

(11:08):
to, like, regulate the internet specifically, which is, like, a definite policy goal. And so I think, you know, if they succeed in that, and there's, like, a particular galvanizing event that makes people call for a privacy-free internet and all of that stuff, it's very possible that, you

(11:29):
know, there would be restrictions on who gets to use AI for information, and who doesn't. Because a lot of the stuff in this Kissinger-Schmidt book is, like, basically an outline about how to use AI to take us back to the Dark Ages, in a lot of ways, I feel, specifically about, like,

(11:49):
the flow of information. So, like, I mean, this isn't necessarily how the Dark Ages ended, but, like, with the invention of, like, the printing press, and, like, the democratization of information, and being able to get it, you know, out there. You know, before, information was,

(12:10):
like, very controlled. Like, it was controlled, like, by the church. Yeah, specifically. And, like, only clergy or, like, specific people could have access to that and, like, learn to read, and all of this stuff. And sort of the idea that these guys lay out is basically using AI to take us back to that. Which is very crazy, because it's being

(12:34):
sold as, like, enhancing humanity and all of this stuff, and you won't have to do tedious work. And it's sort of been the justification for pushing for things like UBI, universal basic income, and all of this stuff. But it seems like, you know, the way these guys are actually thinking, and the way this is beginning to manifest, as AI is leading to mass firings

(12:56):
in some sectors, and surely more in the future, you know, I think we're going to be seeing what they really have in store more and more. So

SP (13:05):
I read this book, too. And
I, you know, I've heard how you talk about your interpretation of the book, and I think I agree with you, you know, that they're veiling their true thoughts. But don't you think that they also are truly concerned? I mean, these people do know that there are dangers with AI, and that they have to go about it

(13:29):
the right way. And it seems like the book was warning of some of the dangers. Yeah. But

WW (13:34):
I think the way these people work is that they, like, warn of some dangers, and some of the dangers are true, but their solutions to those dangers are, like, what they wanted the whole time, what benefits them, right? Yeah. And so whatever they're, like, proposing, they're like, we should be afraid of this, and this is the only reasonable solution. But it's not the only reasonable solution, you know? Right? Yeah. I mean, Eric Schmidt is also going around, you know, after he

(13:57):
wrote this, saying, like, we have to link people's social media accounts to their IDs so that we can report them to law enforcement when they, like, post disinformation and stuff.

SP (14:10):
I can't even believe that people still have this idea that, if I'm not doing something wrong, then I don't care. I really can't understand why people think that, and they do. It's

WW (14:24):
like, it's very naive, honestly. Because if you consider how AI is being used right now, for example, like, in facial recognition and stuff, there's been, like, a series of issues in the UK with them trying to roll out, like, real-time facial recognition technology, because the accuracy is, like, super low. Yeah. But they're still,

(14:44):
like, they're not going back and, like, fixing it. They're not, like, changing providers. Like, if you were the state and you were meaningfully trying to, like, make an AI facial recognition system that works, you would go and try and find another, like, provider that has higher accuracy. Yeah, something. Yeah. And they've shown no interest in doing that. And so I think, essentially, what they're trying to do, I mean, it sort of reminds me of the movie

(15:09):
Brazil. I think you said you hadn't seen it, but it's, like, this 1985, like, sci-fi movie. One of the guys from Monty Python made it, but it's not a comedy at all. And basically, it starts off with, you know, it's this big, dystopian bureaucracy, sort of, you know, just like a lot of the other, you know, famous British dystopian, you know, works. And

(15:33):
basically, like, they make a mistake. Like, some guy in the Ministry of Information or something squishes a fly, and it falls into, like, the printer as it's printing out an arrest warrant, and so, like, one letter is changed in this guy's last name. And so they go and they arrest, and they end up interrogating and murdering, like, an innocent guy. And all these other people that try to, like, report the wrongful

(15:55):
arrest, or, like, try and rectify the situation, or let the government know they made a mistake, like, end up being, like, over the course of the movie, like, arrested and, you know, tortured and stuff. And basically, like, the message of the movie, I think, in that sense, is, like, you know, the state, the government, like, a government like this isn't necessarily interested in things being right, you know,

(16:20):
because the system, like, a totalitarian system, just by virtue of, like, the fear of that maybe happening to you, will keep people in line, you know. It just

SP (16:30):
kind of reminds me of, like, predictive programming in movies. And it's kind of like, where, if it's anything less than these horrible things that they show us in these movies, then people are, they're okay with that. As long as it's not those horrible things that they've showed us.

WW (16:46):
Well, I mean, it's a way of, like, normalizing it, I guess, in a way. You're, like, desensitizing people to it, to an extent, maybe. But what I was saying about Brazil, there's, like, this French philosopher whose name I'm awful at pronouncing, because I'm really just bad in general at pronouncing French names, no offense to anyone, but it's, like, Foucault or something like

(17:10):
that. Like, Foucault, was it? Oh, yeah. Yeah, that guy. So anyway, the people at Palantir, you know, Peter Thiel and Alex Karp's company, which, if you're familiar with my work, is the privatized version of Total Information Awareness, they, like, love that guy and have, like, pictures of him in the offices. The New York Times did, like, a big profile on them in, like, 2019

(17:32):
or 2020, somewhere in there, and they, like, posed under his picture. And that guy basically developed, or, like, expanded on, the idea of the panopticon, which is reflected in the movie Brazil, which is the idea that, like, if you know you're being watched, especially if you know you're being watched by, like, something authoritarian, you're more likely to,

(17:56):
like, self-regulate your compliance, you know. So it's not about it being accurate or not. Like, they don't care, you know. What they care about is, like, they surveil you because, you know, it means, like, they're watching, and that you'll regulate your own behavior. You'll, like, self-censor,

(18:17):
in all senses, you know. Not just, like, what you post online, but, like, how you act and behave, because you know that it's, like, watching you. But for that to happen, they don't care about accuracy. It's about, like, inducing that effect at scale. And so, like, if these AI, you know, facial recognition or whatever

(18:39):
algorithms are put in charge of, like, it doesn't, you know, to these guys, it doesn't really matter how accurate. Like, none of them are 100% accurate. Yeah. And they're being rolled out to decide, like, major stuff about, like, law enforcement, and governance, and other things, you know. And, you know, I think that's something that's definitely not talked about enough. I mean, I'm sure part of it is, like, the same, you

(19:02):
know, corporate griftiness, of, like, oh, you know, this is my brother's company, and I'm gonna give them the contract, even though their AI algorithm is crap compared to the other ones. Like, I'm sure there's a degree of that in there, too. But ultimately, like, AI, in terms of, like, being hyper efficient, like, it is at some stuff, but some of the things it's being sold as, like, a solution for, it's not accurate

(19:27):
and has big issues when applied to, like, law enforcement settings, or settings which have the potential to decide who lives and who dies, just like military settings, you know. It becomes a really big issue. Well,

SP (19:40):
and you don't know when it's being accurate. That's the problem. Like, if you could know, if it was obvious when it was hallucinating, that wouldn't be a problem. But, I mean, sometimes, you know, I've experimented. When it first came out, I didn't want to do it at all, but then, you know, something made me kind of look at it in a different way. And so I kind of checked it out a little bit. And some of the things, the

(20:02):
mistakes, it just makes stuff up. It really does. It'll make up court cases, studies. It'll give you names of studies that don't exist, numbers for court cases that don't even exist, you know. So, but it seems like, you know, I've read all these articles about how, in court, somebody will go

(20:23):
to court with some information that they got from AI, and it's not even true. Yeah, insane.

WW (20:28):
Well, this is going back to media, you know. AI, like, taking over mainstream media journalist jobs, it's not a good thing. It's, like, an even more talented bullshitter, you know? Yeah.

SP (20:42):
Because if you're gonna have to verify everything that it says, well, then what's the point of using it? You're supposed to be using it to save time, but you need somebody to come back and check everything it says anyway. Or

WW (20:53):
you just believe everything it says without questioning, because it's sold to you as being superior and more intelligent. You know, that's how I know that the Kissinger-Schmidt AI book is full of shit, you know. Because basically, if they were being honest about, like, their warnings and all of this stuff, and not just using it to, like, you know, sort of

(21:15):
give their veiled plans, you know, leak them out to the public, they would definitely have noted that AI, like, has accuracy problems, that it hallucinates. It's a known phenomenon. And instead, they're, like, AI is our ticket into undiscovered worlds, basically, and it sees all these hidden realities that we cannot see. And so we should trust super-

(21:38):
intelligent AI to be our guide to these undiscovered realms, or whatever. And, like, no. Because of these other documented things that these guys obviously know about, there's no guarantee that that's even real if you can't verify and observe it. Like, AI, like, hallucinates and

(22:00):
produces output that is completely, like, erroneous. And these guys don't acknowledge that once in the book. They frame it as something that, like, we have to just trust, and that it's superior to us. And that's, I think, what the elite want us to think, to put, like, blind faith into the AI. And there's these different groups, too, that, like, want to create,

(22:21):
like, religion around AI, that have come out of Silicon Valley and sort of related fields. Some of them call themselves Dataists. And then there's this one guy in Silicon Valley that's tried to make a church of AI, and AI is writing sermons in some churches and stuff. Like, it's getting a little weird. So they definitely are. I mean, I just

(22:42):
think that whole narrative, that AI is super intelligent, and that its errors aren't really errors at all, but, like, realities that are just hidden to us lowly humans, like, I could not distrust that narrative more. Yeah. I mean, it's basically telling us not to interpret our own reality

(23:04):
anymore, and saying we should let the AI do that for us, which is a major theme in that Kissinger-Schmidt book. And they say that that will happen specifically to the class that isn't involved with programming and maintaining AI, the underclass. But the idea here, and what they overtly lay out in this book, is about how AI is increasingly making our

(23:26):
decisions for us, right? And not just, like, big decisions, necessarily, but also, you know, what music we listen to. Like, the algorithm on YouTube, you know, it's, like, learning our preferences, and then also subtly cultivating our preferences, and all of that

(23:46):
stuff. And then eventually, like, we won't know how to live without it. That's essentially what they say in that book. And they talk a lot about that, and, you know, I guess its broader implications. That, like, without having AI summarize stuff that's long, we won't read the long stuff, you know. And without AI, like, interpreting, you know, this

(24:08):
thing or that thing for us, like, we won't understand it without, like, the AI summary, or whatever it produces that we grow accustomed to, and all of this stuff. And they basically say that this particular class, at a certain point, won't understand AI at all, and won't understand how AI is acting on them. That there will be some anxiety

(24:29):
in this large underclass, because they'll know they're being acted upon and watched by something, but not understand what it's doing to them, is essentially what they say. It's very disturbing. But, again, that book is relatively well veiled, you know. It's like, oh, they're warning about these things, but they cast them as inevitabilities at the same time,

(24:50):
and then, you know, I mean, the people that wrote it, like, Eric Schmidt is, like, one of the people building this vision out through his work with, like, the National Security Commission on AI, and his, like, extreme influence on the Biden administration's science policy. I mean, he basically runs it. And he's funding salaries of, like, Biden administration people. It's totally illegal; he shouldn't be able to do that.

(25:12):
And then also dominating how it's being implemented in the military and the intelligence community. And that's a lot of power for one guy. And so he, you know, has the power to, you know, make anything happen, really, when it comes to, like, AI implementation in the US. And so a lot of the warnings he's talking about, I mean, it's all, um, if you look at his actions

(25:35):
with it, I mean, with the book, it becomes very clear what the book is actually saying, you know. And,

SP (25:41):
at this time right now, where we're at, where everybody's talking about it, and there's no real policies, you know, like, they have interim policies, but there's no real policies, it kind of feels like everybody's trying to get what they want right now, before the regulations get put into place.

WW (25:57):
Yeah, and I think, you know, with AI regulation, it's just what you talked about, how there are these different generative AIs, like, not just ChatGPT and whatever. Yeah, I'm sure when they regulate it, they'll make it so that those little ones, that maybe are a little bit better, whatever, in terms of, like, data harvesting or whatever, probably will not be allowed to go forward. You know,

(26:19):
I mean, regulation in these types of spaces, whether it's, like, you know, like, the coming regulation on crypto, or any of these other, like, emerging technologies or things related to them. I mean, Congress is essentially acting as kingmakers for the companies in this unregulated space, you know. They get to decide who continues, you know. I

(26:41):
mean, obviously, some companies are going to be more favored by regulations than others. And generally, how this works is that the companies that have the most lobbyists and the most pull are there while those regulations are being written by Congress, and then they go through, and those companies win. And then other companies, you know, are essentially boxed out after the regulations are pushed through, you know. And a lot of times,

(27:04):
when this happens in the States, it's like, you know, it sort of started, I guess, maybe in the 70s, under Nixon, with, like, agriculture. It was, like, this whole idea of, like, get big or get out. Like, if you're a small mom-and-pop company, um, you know, government regulation no longer favors you, you know. And so they tend to favor sort of these big, big ones that are, you know, always going to

(27:28):
be part of it. But they take out the little guys once they regulate, you know.

SP (27:34):
At the WEF, there was a talk. I shared it; I can't remember who it was that was talking. But they were talking about how, in the future, they want to have it so, and this was about what they train their AIs on, the content, you know. So they want to make it so that they can serve content from these companies that allow them to use their

(27:58):
content to train AI. You know, so, like, make these deals with all these companies: you let us train on your content, and then we'll feed, you know, feed people to you. Yeah, make it a deal. Yeah. Which sounds horrible. It sounds really horrible. Well,

WW (28:16):
there's a lot of that going on. I mean, in China, they created, like, a stock exchange, but it's not stocks, it's, like, companies' data. And so they, like, trade it, like, on an exchange. And, like, all these state-owned companies, like, all of their data and user data,

SP (28:35):
and then they can use it. Interesting. Yeah. I

WW (28:39):
mean, obviously, a lot of these companies are, like, oh, well, you know, it's been anonymized, and you don't have to worry about your privacy. But, I mean, yeah, right. You know, I mean, maybe they do, but a lot of them, I'm sure, probably don't, or at least don't do it effectively, you know? Yeah. But, I mean, they've been saying for years that data is the new oil and all of this stuff. And so I think

(29:00):
what people don't realize is that it's, like, your data is the new oil, and they're making lots of money off of your data, and you are not making any money. And instead, your money is being hyperinflated away, or trickling up to the billionaire class. But they are, you know, making a lot of money off of you, more than ever

(29:21):
before.

SP (29:23):
Wow, depressing.

WW (29:26):
I'm sorry, I'm, like, a big walking, talking black pill. I mean, I don't really feel like it's black-pilling, in a sense, because I think it's important to be, like, aware of how these guys see this stuff, you know. And, I mean, because otherwise we can't really fight against it, us little people

(29:47):
right at the bottom, the way things are going. I mean, I think people need to start divesting from some of these, you know, specifically, like, Big Tech things, just because, like, they are one of the clearest actors involved in using our data for bad things, and have taken increasing

(30:07):
control of the military and the government in the US. It's honestly pretty insane. And then you add all of that to this new era they're trying to push through of, like, AI weapons and all of that, which is all coming from Silicon Valley people. I mean, the push of that, I mean, Eric Schmidt is a big driver of that, and the other big driver of that is Peter Thiel. And, you know, these are all big Silicon Valley

(30:30):
guys with very deep ties to the worst parts of the US government. And, I don't know, I think them being in charge of, or developing, these, like, autonomous AI drones with guns, I mean, all of that sounds like an awful idea to me, for sure. Well, war

SP (30:48):
is always, like, the reason, it seems. It's always the reason for innovation, right? I mean, since the beginning of time, it's about, like,

WW (30:58):
innovation of killing people. Yeah. Well, and making

SP (31:01):
yourself be the one that survives, or gets more, or gets what you want. I mean, that's always the thing that propels

WW (31:09):
innovation. Well, the last few big conflicts, so you have, like, the Gaza conflict right now, which threatens to spread regionally, and then the Ukraine conflict. Those have been huge testbeds, specifically for, like, US military-linked AI companies. So specifically, the Peter Thiel stuff. Very big; he's funding it, you know, in Ukraine, like, autonomous drones,

(31:32):
and all of that. The frontman for a lot of these companies is
Palmer Luckey, who's the guy that made Oculus Rift, like the
virtual reality stuff that was sold to Facebook, where Peter
Thiel was a big investor and basically, like, helped make
Facebook the company it is today. And his company is Anduril,
which is not just making all

(31:53):
these, like, autonomous drones and stuff, but also making,
like, surveillance towers that are on the US-Mexico border and
all of this stuff. And, you know, it's all interfaced, too,
with this other Peter Thiel-funded thing, Clearview AI, their,
like, facial recognition thing, where they've scraped all your
images from Facebook, including from people that don't have
Facebook accounts but other people have taken their pictures

(32:15):
and, like, uploaded them to Facebook and stuff, you know.
Trying to make this engine for crazy, crazy dystopia? Yee-ha.

SP (32:23):
You just mentioned the border. And, you know, Anduril, I wanted
to mention this, because it always surprises people. Not a lot
of people know this, but, like, 60% of the US population lives
in a constitution-free zone. Yeah. Because when you live within
a certain amount of

(32:45):
distance from a border, that's considered a constitution-free
zone. And about 60% of the population lives in those zones.
That's insane. Yeah.

WW (32:55):
Because if you think of borders, I think they count, like,
coastal areas as borders, right? So that's like California, all
of Florida. The two most populous states, right?
Yep.

SP (33:06):
And I think it's, like, 100 miles in, or something like
that. Something

WW (33:10):
like that. Yeah. Well, it's definitely important to consider
given all, like, the stuff going on right now over the border
in Texas, specifically in this showdown, as it were, between,
like, states and the federal government over border stuff. But
honestly, I think a decent amount of that is pretty
manufactured, because bad things, if people let them, are

(33:33):
likely to play out as a consequence of that. And I've been
saying for a long time, too, that, like, once stuff in the US
gets particularly dicey, or, you know, there's too much
overreach and people get too upset, at the border, you know,
like, specifically the stuff Palmer Luckey has, I mean, it's
there, and it's active. They're just not using

(33:56):
it for people coming in, right? So how much of that stuff is to
also keep people from, like, coming out at a future point? You
know, it's not just, I don't know, I mean, I think the whole
border thing, I mean, it's an election year, too, so there's a
lot of stuff going on. And I think, you know, this is going to
be the year of unprecedented psyops, for sure,

(34:17):
and I think a lot of that is going to be very AI-enabled, you
know.

SP (34:23):
Right. The process of fighting against AI, I mean, you can just
look at YouTube and appealing censorship. It's almost
impossible to appeal an AI decision. Well,

WW (34:36):
sure. And then you have on top of it, like, you know, just on
social media stuff alone, I mean, for the past, like, decade at
least, the US military has, like, put a ton of money into
making, like, social media bot armies, basically. And with
generative AI, which now, like, ChatGPT as an example openly
has, like, a thing with the military now, they can have

(34:58):
the most sophisticated bots, like, ever to, like, influence
opinion and stuff like that. And I just think people don't
really realize that when they're interacting on social media.
So many people think, like, a lot of likes and, like, you know,
people that are boosted by the algorithm and all that stuff, is
organic, because people like it. Just like they think that,
like, the pop songs on, like, radio that get

(35:19):
played over and over again are being played over and over again
because people want that. It's not because people want that.
It's because that's what they want you to hear and what they
want you to see, right? And they manufacture its popularity,
because everyone assumes, oh, it's being played so much, or I'm
seeing so much of this, or this has so many likes, that it must
be popular. Right? People must be liking it. But it's

(35:41):
completely, like, off. I mean, not all the time, but a lot of
the times it's manufactured.

SP (35:46):
Yeah, you have to wonder, like, okay, so why is this person so
popular, and I've never even heard of them? You would think you
would have heard of some of these people if it was real.

WW (35:55):
Yeah, well, you know, speaking of, like, Twitter specifically,
or X or whatever it is now, um, you know, there's been this
whole thing, like, around Elon, and, like, people that promote
Elon get bigger boosts and, like, monetization and what have
you. And, you know, there's this whole effort to, like, co-opt
the quote-unquote dissident right, you know, a lot of the
people that were, like, against COVID

(36:17):
measures and against, you know, digital IDs, CBDCs, sort of
herd them into being, like, you know, pro-Elon, pro Elon brain
chip, Elon who's a contractor for military and intelligence
agencies. You know,

SP (36:29):
I saw RFK praising Elon the other day, saying, thank you for
providing a free speech platform.

WW (36:35):
Oh, I missed that. But it's not a free speech platform. That's
unfortunate. Well, Alex Jones was calling for Texas to secede
and elect Elon Musk as its first president. Wow. Yeah. So, um,
you know, social media, it's definitely a warzone these days.
And it's all about trying to get people to perceive

(36:57):
reality a specific way. And I think what's likely in 2024 is to
basically, you know, through AI-enabled means, among others,
get basically this faction of people on the right that don't
trust the government at all to feel like their guy won, meaning
Trump, and then they'll be, you

(37:22):
know, a lot more acquiescent and compliant to the rollout of
all this stuff. Because, I mean, just like it was with, you
know, COVID, yeah, Trump delivered on all of that for the
elites. And, you know, he's sort of regained his
anti-establishment cred, I guess, with all these court cases
trying to take him off the ballot and whatever. And now

(37:46):
some of his biggest, sort of, like, influencers, like Alex
Jones, I guess, you know, he's been re-platformed and
rehabilitated as, like, a pro-Elon guy, and obviously pro-Trump
once again, despite all the vaccine stuff, to basically, you
know, sell Trump winning as this is what's gonna save America,
yada, yada, yada. Um, I don't know. I mean, people

(38:09):
forget.

SP (38:10):
I can't blame people for thinking that though, because
you have to look at what we havenow. I mean, obviously, they're
not doing anything for people.

WW (38:17):
Well, exactly. But people forget how the left-right paradigm
works, right? So, you know, it's the left hand and the right
hand of the same thing. Yes, yeah. And so one side makes a mess
of things, and the other side comes in and quote-unquote cleans
it up, offering the solution that they wanted the whole time,
but it's cheered on as being the solution to the

(38:42):
problem created by the other hand, you know. Yeah.
All this stuff with the border and a lot of this chaos, it's
obvious that the group that's going to come in and fix that is
going to be the party that's traditionally been tough on terror
and tough on crime, you know. And a lot of those

(39:02):
policies are going to beweaponized against regular
Americans, right? Make nomistake about it. And there's
going to be a push for IDbecause of the migrant issue.
And it's going to be digital ID,but they want people on the
right to cheer it on. Becausepeople on the right have most of
the guns, you know, and canprobably actually resist stuff
to an extent and make it harderfor them. So they have to

(39:25):
basically sign up that segment of the populace more than anyone
else to get, you know, what they want through. And I think a
lot of the stuff that's being set up by Biden, I mean, people
act like it's incompetence. It's not. It's intentionally being
allowed to grow into this insane situation so that they can
come in with very heavy-handed solutions later on. And I think

(39:47):
it's likely they'll want to have Trump deliver those solutions
instead of Biden. Yeah.

SP (39:52):
Because people seem to like him. Well,

WW (39:56):
I mean, the Teflon Don thing, right? I mean, he did Operation
Warp Speed, and an insane amount of his base was so against
that, and now a lot of his base remembers it as being Biden's
mandates and Biden's vaccine, like Trump wasn't involved in it
at all. And that's, I mean, that's again how the left-right
paradigm works. You can, like, offload all of the sins of the

(40:19):
current administration, and the other guy acts like he's going
to be all against it, but they're the same at the end of the
day. I mean, people forget that when Trump came to power, he,
like, made this team of economic advisors that was, like, Larry
Fink from BlackRock and Jamie Dimon and, like, all of these
guys, super tied up with Wall Street. And he had

(40:42):
warmongers in his administration after campaigning on being
against neocons and all of this stuff. And he's one of these
guys that's very good at having rhetoric that's drastically
different than their actions. And that's the rhetoric that
resonates with people. And then they just keep pushing forward
a lot of the same agendas. And, you know, I

(41:02):
mean, when COVID happened, one of the first people Trump went
to was Larry Fink of BlackRock, and got all this money that was
printed by the Fed, and they got to decide where to allocate it
and all this stuff for COVID relief, you know. I mean, he
printed so much money. I mean, he did stuff that was, like, so
against what he campaigned on, and people have just totally
forgotten about all of this. And they act like, oh, he didn't

(41:24):
start any new wars. But he tried to kill Venezuela, and he
tried to start a war with Iran. They, like, murdered Qasem
Soleimani, like, one of the top Iranian generals and stuff,
while he was on a diplomatic mission. Like, they tried to start
wars. They just did it. Like, I don't know, I mean, I just feel
like the way people have come to, like, remember it speaks to
the power of how media can manipulate people, because

(41:48):
that's independent media, supposedly, that's manipulated
Trump's base to feel that way, or at least the sort of, like,
dissident right base, to go back into the Trump fold, you know.
And it's precisely because a lot of those guys are seen as
being against the mainstream media. And if that's been so
effective, imagine how effective it'll be when it's boosted by

(42:08):
all this AI stuff. You know, not good. And I think also there's
the strategy they've had for a long time called, like, the
flood-the-zone strategy, where they can just put out so much
messaging in a particular way to manipulate people, and with
AI, like, oh my gosh, you can flood the zone like never before,
you know? Yeah.

SP (42:30):
I don't understand. I mean, it seems like it should be really
easy to make people understand that Trump is on the side of the
bankers. People hate bankers. I don't understand why it's so
hard to make the connection there. I mean, if you look at his
history in New York and everything that happened with, you
know, all the loans that he got there, and

(42:51):
I mean, he's on their

WW (42:51):
Bankruptcies, yeah. Yeah, the guy that rescued him from
bankruptcy, he made Secretary of Commerce. Wilbur Ross, who
worked for, I think it was N M Rothschild, or maybe one of the
Rothschild family banks, is who rescued Trump from bankruptcy.
So, I mean, I don't know. I mean, I think it is obvious, but
what people point

(43:14):
to by default is, like, oh, but then why are they trying to
stop Trump, "they" meaning, like, the deep state or whatever.
And it's like, if they really wanted to stop Trump, they would
have already, you know. And by these overtures, like they're
going to stop him, but they're not actually stopping him, and
making, like, this media hoopla about it, they're

(43:36):
manufacturing trust. And this whole thing of the World
Economic Forum right now, where people like Larry Fink are on
the board, is how to rebuild trust, right? That's their theme.
It was their theme this year, and it was their theme, I think,
last year and the year before. They're very focused on
rebuilding trust. I mean, why do you think they had someone at
the WEF like Javier Milei? Why did they give him an

(43:59):
audience to come up? And, you know, everyone was like, oh,
yeah, he got up there and he shit on everybody. Like, I don't
think that's what's happening. What I think is happening is
that there's this phase shift where they're going to try and
sell the same agendas the quote-unquote dissident movement in
the US is against, let's say, digital IDs and CBDCs

(44:21):
as an example, but there's a lot of other policies rolled up in
that. They want to sell that, too, but instead of having these
talking points about it being, like, ESG or for climate change
or whatever, they're retooling that to appeal to people that
are right-leaning, I think. And even, I mean, like, with Larry
Fink, who's, like, the point guy for that, like, he

(44:43):
was all about, like, ESG, climate change, all those, like,
left-leaning talking points, and now he's moving to the right
and being like, well, we should do all of the same stuff, but
it's not, you know, for the planet or for the good of society
or inclusivity, talking points that resonate more on the left.
He's saying, oh, well, you can make a lot of money doing this.

(45:04):
And everyone can make a lot of money doing this, you know. And
sort of talking about, like, you know, pushing for deregulation
and stuff like that. And, I mean, that's exactly what Milei is
doing. And, like, Milei came to power sort of in a similar way
to Trump, having sort of, like, this extreme campaign rhetoric
that resonated with people who were very angry at the political

(45:26):
class. And, I mean, it was cathartic with Trump, and it's also
cathartic with Milei, to hear them crap all over the power
establishment that has, like, been bad to people and everyone
hates, you know. But the problem is, you know, Milei gets into
office, and after railing against the political establishment,
he puts the political establishment back in power, not the one
he just replaced, which was the left-leaning one.

(45:48):
He went back to the administration before that, the
center-right party guy, Mauricio Macri, and took a bunch of
people from his administration and put them back in power. And,
like, his finance minister, Milei campaigns all about being,
you know, an anarcho-capitalist and all this stuff, and his top
economics guy and finance minister is, like, a career, you

(46:11):
know, Latin American point man for Deutsche Bank and JP Morgan
and stuff. Like, it's not good, you know. And he's super cozy
with the IMF, and everyone in Argentina hates the IMF, because
they've been trying to privatize all their state assets and
force austerity on them and stuff. And Milei has just, like,

(46:31):
done everything the IMF wanted to do to Argentina and more,
without the whole, like, debt slavery angle of it. It's very
nuts. And so I think the fact that Milei's being invited there
is just indicative that, you know, they're trying to gain the
trust of people that are against policies sought by the WEF.
They

(46:52):
have certain political influencers they want to roll out there
and have people trust those guys. And then those guys will
deliver the policy goals, you know, that the WEF and these guys
have wanted all along. And I think, honestly, the digital ID
thing is going to be sold as, like, a solution to the migration
issue, we have to know who everyone is. Like the long-standing

(47:14):
Republican push for voter ID, which I'm not against. Yeah, this
is, like, I mean, people, like, hear me talk about this stuff
and immediately, like, think I have to be on one side or
another. I'm on neither side, you know. But, um, they'll just,
you know, roll out that talking point and be like, oh, well,
you know, everyone has to have voter ID,

(47:35):
but it has to be digital or whatever. Because, I mean, people
like Ron DeSantis, who, you know, postures as being against
CBDCs, for example, like, digital IDs are already rolling out
in Florida. So he's not against that. I mean, maybe he's
against CBDCs. But, you know, I've done some reports and
interviews recently about how that's, like, just a setup to
have, instead of a CBDC, like, issued by the central bank, the
Fed,

(47:59):
they're going to do it, but it's going to be issued by Wall
Street. And it's not going to be called a CBDC, but it's going
to be the same thing. So anyway, they'll still have that,
you know what I mean? I mean, I feel like I'm kind of rambling
about this, but I honestly feel like there's this intentional
shift here to try and get, like, this energy behind the

(48:21):
dissident right. And, oh yeah, independent media is winning,
and we're free, and all of this stuff, and our guy's gonna come
back into office and, like, save everything and save the world.
It's, I don't know, people just have to remember what happened
last time. And no one does. I

SP (48:38):
was reading something about the digital, well, like, an online
verification, you know, to prevent misinformation or something
like that. But then the problems involved with that being,
like, well, it could be a target for, you know, hackers. And so
then the solution to that they were looking at, and this was
from, like, one of these sites that you follow, like, you

(48:58):
know, government press release type sites, and it was saying
that they were looking at banks, because they're more secure,
you know. So it'd be, yeah, you would authenticate yourself
through your bank. Yep. Online.

WW (49:11):
Sounds about right. Yeah. Yep. Well, I mean, because bankers
are driving a lot of this stuff forward, like the CBDC and
digital ID thing. I mean, if you read stuff like the
Sustainable Development Goals, Agenda 2030 of the UN, that
every country has pretty much signed on to, CBDCs and digital
IDs go together. They must, as it's laid out there. And, I
mean,

(49:36):
most of the stuff at the UN, including all their climate
finance and climate action stuff, and a lot of the other SDG
stuff, it's been written by bankers. It's been written by
bankers. People assume it's written by, like, UN experts who
are somehow, like, you know, neutral and, like, experts in
their field, sort of like the idea that it's an FDR brain

(49:58):
trust style thing. No, it's not that at all. It's literally
written by bankers, about how to, like, screw you and your
children and all generations to come, and basically turn
everybody and everything alive into financial products to be
traded on, like, blockchain exchanges and stuff. I mean, it's
totally insane when you actually read into it and stuff,

(50:21):
and I just can't stand it. But yeah, people really think the UN
is on their side here. But all that CBDC, digital ID stuff was
written by bankers, pretty much. And so, yeah, I mean, a lot of
the stuff I've written about before, about the push for, like,
a regulated internet, it's banks and

(50:44):
intelligence agencies, pretty much. So, like, the UN climate
finance thing is, like, oh, we need to save the planet and do
this stuff, and they put Mark Carney and Mike Bloomberg in
charge of it, who are, like, top bankers, I mean, just, like,
really powerful people who have built their careers by, like,
stepping on people's heads, you know, and climbing

(51:05):
their way to the top. And you're supposed to believe these
people, like, are setting up all these systems because they
care about the planet. It's madness. And what they're really
doing is they're just, like, creating, like, carbon markets
where they can tokenize and, like, turn everything alive into,
like, assets and, you know, financial products. It's insane.
So, yeah,

(51:29):
I mean, these guys don't really care about people at all. But
they've spent a lot of money, you know, basically on
propaganda, on public relations, to convince us they care. But,
I mean, obviously their actions, particularly, like, Wall
Street bankers, make it really clear, you know, what they're
motivated by. And, I mean, a lot of it is more than

(51:51):
money. You know, I think, you know, independent media, people
that talk about these agendas, you know, there's a lot to be
said about how it's really more about control than profit at
this point. But I think, you know, one way of looking at their
interest in control is not so much, like, oh, they love to
control people. Like, you know, I mean, I'm sure there's people
that are in it for that, you know, and enjoy

(52:14):
that, but I think there's some also that see it as, like,
necessary for, I guess, risk management. You know, I think if
the public, if the masses, were free, to them, I think they
would view that as, like, just uncontrollable, unpredictable,

(52:35):
and it makes it harder for them to do what they want to do. You
know what I mean? And I think a lot of these people's
lifestyles are also, like, predicated on them being able to do
whatever they want with the masses, because they, like, use us,
our labor, they use us in other ways, or they steal from us, in
order to, like, maintain their specific lifestyle. And they
obviously

(52:56):
have, like, no intention of changing that, you know. So I
think it sort of comes down to this whole, like, risk thing. I
mean, I'm sure they see it as, like, a risk management thing,
like big parts of the elite do. But I think the problem there,
too, is, what do they see as risk, and what do they see as
chaos? And I think, at the end of the day, just, like,

(53:17):
human creativity, or, like, something that's not completely
controlled, like, by machines and stuff, for them is going to
be viewed as inherently risky. Because unless they can, like,
influence us to extreme degrees, they'll never be able to,
like, manage

(53:37):
away all the risk of there being, like, billions of independent
people on the planet that aren't necessarily going to do what
they want them to do every time. You know, I mean, they put so
much money and so much effort into manipulating us, and AI is
allowing them to do that at scale, you know, in unprecedented
ways. And a lot of the stuff in the Kissinger-Schmidt book is
essentially using AI to, like, suck us into realities

(54:01):
that aren't even necessarily real and stuff. But I think a big
part of that is because, you know, AI can give us the
impression of this creativity and of this consciousness, and of
this stuff that keeps us engaged and interested, but with a lot
less risk for them than if it were, you know, something
happening organically and not, like, a

(54:24):
synthetic thing like AI, you know?

SP (54:26):
Yeah. When we were talking about doing this podcast, we were
just kind of, you know, talking back and forth, and you said
something about how they want predictability. And that was
really kind of, like, mind-blowing for me, because I was
thinking that whole time about the angle of, like, I don't
understand how they think this is going to work, because
they're

(54:48):
building on top of lies. You know, they're, like, training on
the media, and the media has been telling lies. So how are they
expecting to get truth out of these, like, you know, AI models
that they're building and stuff? And you said that they don't
care about truth, they want predictability. And that kind of,
like, changed the way I thought about it, because I care about
truth, so I just

(55:11):
assumed that that's what they would care about. But that's not
what they care about. They care about other things. Yeah.

WW (55:17):
Well, they tell you they care about it, you know. And it's just
like, you know, how a lot of the AI they're using is
inaccurate, like we were talking about earlier, and they act
like it's going to make things more efficient. Like, that's the
selling point. But it doesn't actually do that, because, like,
a lot of the time it's inaccurate, and they don't care. They
just want it to be, like, in a controlled system that they can
manipulate. And then if it has glitches, they'll cover it

(55:37):
up, like, you know, like happens in, like, the Brazil movie and
stuff. They'll just cover it up and, like, eliminate the people
that know about the mistake and just, like, paper over it and
keep going. Because it's not about what they say it's about.
It's, like, not about accuracy. It's not about preventing
misinformation so the truth can endure, right? It's about

(55:59):
creating, essentially manufacturing, realities through AI, and
changing how we perceive reality, and having us be dependent on
AI to perceive reality. Because if you control how people
perceive reality, you can control how they behave, right? And
so this is, like, an unprecedented effort to be

(56:20):
able to push humans into asystem that they don't know how
it operates. And I think a lotof this stuff like more than
that they plan for AI that isn'tnecessarily here yet, like a lot
of it with health care, andlike, you know, wearables and
the Internet of bodies, and theInternet of Things, stuff, like
escalating a lot, and like AIwill, like, I go through your

(56:42):
genome and, and all of thisstuff, I mean, it's all about
just trying to like tweak asystem. So that like there's
nothing unpredictable thatarises in it. And I think that's
why, you know, pretty much not,I don't know, if it's
necessarily every sector AI isbeing rolled out in but a lot of
them have an extreme focus onlike, predictive analytics and

(57:03):
stuff. Like predicting whatpeople are gonna do before they
do it. And it's all about like,anticipating risks before they
happen. And all of this stuff,and I mean, ultimately, at the
end of the day, it's so like,they don't have to worry about
like, uprisings from the littlepeople, you know, they can like
micromanage it all. And I think,you know, AI a big part of it,

(57:26):
too, when you tie in, like thewhole, like eugenics potential
and like healthcare, posturingof a lot of this AI stuff is to
basically, you know, tweakhumanity so that it can only
survive in the system, they'rebuilding with it, like this
dependence on AI, I think theydon't just want it to be
cognitive, like is sort of laidout in that Kissinger Schmidt

(57:47):
stuff. But I think they want itto, you know, at some point in
the future, like be biological,like, create biological
dependencies on this stuff, Ithink that's part of like, the
transhumanism thing, maybe anaspect of it that's not talked
about. So much, just like justhaving us not be able to live
without these. I mean, we'realready so dependent on like,

(58:07):
big tech, and all that for,like, how we conduct our lives,
but we're not necessarily like,dependent on it to live to, like
actually live, you know, like,in theory, we can still walk
away and like unplug and stuff.And I think, well, there's a
couple of different, you know,reasons as to why they may not
want that there's like, the datais religious level of it for

(58:29):
some of these people. And there's also, of course, you know, as
I've talked about before on stuff, a lot of, like, religious
overtones that some people imbue into the whole transhumanist
movement. But I think it's also, like, just people wanting to
be able to create some sort of system that keeps humans, like,
engaged and

(58:50):
trapped, and we're producing all the data that they're using to
run the economy now and moving forward. Like, you know, they
call it, like, the data economy. And there's also talk of,
like, the DNA economy, and how DNA is going to be used to store
data, and, like, all of this stuff. I mean, like, the potential
applications of a lot of stuff happening right now, I mean,

(59:14):
some of these powerful people, Larry Fink included, want to
take all of this stuff to, like, an insane level that I think a
lot of people haven't, like, fully understood. So, like,
there's this thing that I've been writing about lately, and the
article is not out yet but will hopefully be out soon, and
that's about the broader, like, tokenization agenda, where
Larry

(59:34):
Fink talked about the tokenization revolution recently, how
everything's going to be tokenized so that it can be traded on
blockchain. And they want to do it, you know, not just with
things that are financial stuff right now. I mean, they want to
tokenize, like, every living thing, natural assets, all of that
stuff that

(59:55):
I've touched on before on stuff, on the whole natural capital,
natural asset corporation stuff. But also, like, there's people
tokenizing their careers, their projected future profits, like,
from their career, like, trying to tokenize themselves.
There's, like, artists trying to tokenize, like, their
creativity so it can be, like, traded and sold and, like,

(01:00:17):
make them money and stuff. Yeah, it's really crazy. And so,
essentially, where all of this stuff is leading, if these
people get their way, is that, like, essentially everything on
earth will be able to be, like, traded on a blockchain and be a
Wall Street financial product. Oh, yeah. Yeah. Wall Street. But
I mean, it's

(01:00:44):
not all just Wall Street. But Wall Street is, like, a key part
of, you know, the power brokers of the system, because they
control the money, right? Yeah. And they control central
banking in the United States, and they have, you know, a lot of
influence over things that happen in the world. And, I mean, I
think sometimes people point the finger, you know, I mean, I
think what we're meant to do is, you know, point the finger

(01:01:07):
at this politician or that politician. But, I mean, people also
know, and should know by now, that politicians are funded by
people, and their ideas aren't organic a lot of the time, and
they're just, like, you know, doing what they're told to do and
saying what they're told to say. I mean, you have, like, a
politician rolled out, but they have, like, speechwriters and

(01:01:28):
people that, you know, tell them what to say and write their
speeches and, like, coach them on debates and, like, develop
their policies. It's, like, not all this one guy. And those
people work for think tanks funded by these guys and those
guys. You know, people don't look at those power structures a
lot of the time. They just want you focused on the influencer,
you know, and

(01:01:48):
we really shouldn't be doing that. Because I think, you know,
if there's anything we've learned since, you know, the COVID
era, it's that there's a lot more going on than maybe people
assumed. And there's a lot of power grabs happening right now.
And honestly, a lot of this stuff, you know, going on in the
financial space right now is really all about just trying to
literally turn everything you can possibly

(01:02:10):
think of into, like, money, or an asset that they can
fractionalize, meaning, like, cut into little pieces, and then
tokenize, make a token of it, so that they can, like, trade it
and rob you in unprecedented ways, you know. And the way this
was being pitched before was stuff like, oh, we had to do

(01:02:31):
this, like, for the planet. Like, we need to tokenize everything with carbon in it, which is, like, all life forms, carbon-based life, right? Tokenize rainforests and stuff, you know, we were doing it for the planet. And then now you have, like, people like Larry Fink, like I mentioned earlier, going through, like, this big shift in rhetoric where it's not about

(01:02:52):
that kind of stuff anymore. It's about, oh, well, think about how much money you can make by tokenizing your private property, your land holdings, and then you can use it as collateral on loans. Oh, look, you can't pay back your loan, I guess BlackRock owns, you know, three fifths of your land now. And then they'll eventually own all of it, you know, and then

(01:03:15):
because everything, when you, there's, like, this thing, this push also to, like, fractionalize it all, like, fractionalized ownership. That is, like, the whole you'll-own-nothing-and-be-happy thing. You know, everyone's gonna rent everything. And it's being pitched right now as, like, a decentralized, like, right-leaning anarcho-capitalist thing

(01:03:36):
right now, between people like Milei and Fink and all this stuff happening right now. And people, I mean, some people might buy into it thinking, like, they're gonna get rich, or, like, this is, you know, a chance for the little people to claw back some wealth. But I mean, come on, guys. They don't want to share their wealth with you, and they've

(01:03:58):
stolen wealth from you, and they have no intention of giving it back. And if they're going to, like, offer you a carrot to try and get it back, be very wary about that, you know, because that's a way to get you roped in. And they know that, like, their existing talking points of ESG and let's save the planet, let's build a new, better, and more inclusive, diverse society, they know all of that is not working anymore. And now they

(01:04:21):
have all their best minds thinking about how to get people suckered into the same system under different talking points. And it's happening in real time. And I suppose that this podcast is mostly about AI, and maybe it's been a little more about some other stuff too, but I guess AI is touching, you know, essentially every facet of life right now. And there's just a

(01:04:44):
lot going on with it that I feel like doesn't get talked about a lot. So if it's cool with you, Star, unless you wanted to say anything else that's related to that, then maybe we could talk a little bit about some of the AI military and governance stuff, um, going on? Well, we touched on it a little bit earlier, but there's a little more I'd like to say about it. Sure. Cool. So

(01:05:05):
talking about, like, the AI healthcare eugenics stuff. And I think that should be looked at through the lens also of what's going on with, like, AI in the military. So, people, I'm sure you've heard about the IDF's use of AI in Gaza to pick targets, and it's essentially picking tons of civilians, obviously, because of who's getting killed. And that

(01:05:27):
the death toll is just, like, completely insane. And the IDF won't say, like, how the hell the AI, like, chooses its targets, like, what the parameters are, or anything. But essentially, what you're having here is, like, AI developing kill lists for people. So, like, back in the, I mean, I'm sure you remember, Star, like, back in the Obama administration, Obama

(01:05:48):
having a kill list was, like, super controversial. And now I guess it's not, because people are making, like, AI-generated kill lists that are bigger and bigger with, like, no transparency into them at all. And essentially, AI is picking who lives and who dies, and what are the parameters, and what happens when that gets, you know, scaled? I mean, Palestine,

(01:06:12):
Ukraine also is a testbed for a lot of this AI weaponry, and it's going to be weaponized against, you know, countries are going to use it against their own populaces and also against, you know, populations they're at war with. It's not, I mean, once this stuff comes out of the box, it's not just something that is going to just be a wartime thing, necessarily. I mean, a

(01:06:33):
lot of, like, historically, like, the IDF and the Israeli defense industry, they do a lot of, like, testing of products. I mean, I hate to call it that, because, I mean, it's genocide right now. But, you know, from their perspective, this is a way to say that their products are battle-tested, even though, like, they're blowing up kids and stuff. But I mean, in terms of

(01:06:55):
marketing, that's how they say it, you know. I mean, once they do that, they sell this stuff all over the world, and it ends up getting used. I mean, a lot of, like, Israeli spyware, for example, that's framed as, like, helping catch quote-unquote terrorists gets used by, like, I don't know, the United Arab Emirates or Saudi Arabia against their own people, like, as an example, you know. And so, like, the whole idea that AI,

(01:07:19):
you know, I think one of the main things that AI is going to be used for, and why people should be wary about freely giving your data to it, is that it's going to be increasingly used by governments to decide who gets what. And it's not necessarily who lives and who dies, though that is happening. But it could be, you know, in a future situation, let's say, like, you know, more supply chain shocks to the, you know,

(01:07:42):
the food system or whatever. And, like, you know, food stamps have essentially been obliterated in the US at this point. But what happens if they roll out some sort of system, like, the UN right now basically uses the Worldcoin system for, like, food rations, right? Where you have to, like, scan your eyeball and, like, link to your digital ID

(01:08:05):
and your wallet, and it, like, takes out how much money of your wallet automatically when you, like, sign out at the cash register by scanning your eyeball and stuff. That's, like, the World Food Program is doing that to millions of people every day, refugees around the world. And it's very likely that they'll be trying to do that for, like, food assistance and welfare stuff domestically, and all of that. But, you know, if

(01:08:29):
the AI determines, oh, this person's done this or that and shouldn't qualify, I mean, it enables all of this kind of stuff. And to think, you know, the people in power right now won't use it for those ends, honestly, I think is pretty naive. And I think ultimately, you know, there are a lot of people that are sort of eugenics-minded in power. And it seems to me that a lot of them want AI trained on all this personal

(01:08:50):
data of everyone, because they want to decide, you know, certain traits they want to preserve in people, and they want to, you know, favor the success of those people. So those people will get preferential treatment, and then the ones that, you know, have undesirable traits will probably not get selective treatment, you know what I mean? I mean, it has the potential for all of that, you know, if we let this

(01:09:14):
advance enough. And, you know, the way things are going right now, I mean, a lot of people, like, are pushing back in some ways, but I think also people just don't realize, like, what these people plan for AI. It's basically going to be, like, the livestock herder, and, like, we are the livestock, and it decides who to cull and who not to cull, who to feed and who not to feed. You

(01:09:37):
know, and I think, I don't know. I mean, we're just willingly giving it all of this power by feeding it all of the data, and not divesting from these companies that are, you know, saying they want to do that. So I guess maybe that's a good time, then, to circle back to the question I had at the intro of, you know, if the people programming and maintaining AI now

(01:10:01):
and that are poised to set AI regulations, where presumably, after they make those regulations, only the AI they, these groups, you know, program and maintain will be allowed, you know, can we use AI for positive use cases then? Or is the negative too negative? I mean, I guess it would depend ultimately

(01:10:22):
on regulation, and if they would allow any sort of, like, open source or alternative AI models to exist?

SP (01:10:32):
I don't think that they can. What? How do you think that they can stop them from existing? How can they realistically say that you can't have, you know, it's already out of the bottle, the genie's already out. I don't think that they can, like, say

(01:10:52):
that, because, I mean, there's so many language, there's so many AI systems that are already out there. And it's only been a year now. How are they, what are they going to do to say that people can't use it however way they want? I just don't think that that's possible. Yeah, I

WW (01:11:09):
think, I mean, I would normally agree with you. But, um, they're definitely going to try to regulate the internet. And when that happens, it's going to be a completely different internet than it is now. So if the internet as it is now were going to persist, I tend to agree with you that at least some stuff would slip through the cracks or whatever, you know. But I think, you know, it's similar to how they're

(01:11:31):
probably going to regulate, like, cryptocurrency in the US. They're going to decide, like, which, you know, which stablecoins are okay, you know, which dollar-pegged stablecoins are okay, like, which, you know, which companies can produce a digital dollar and which ones can't, you know. They'll make the regulation so that they're kingmakers, basically.

(01:11:52):
And I think they'll probably do that, too, for artificial intelligence. And I think it's, um, you know, with this, like, regulated internet to come, the whole narrative about it is, like, oh, you know, there's hackers, and there's these other people that do bad things online, so to stop illicit activity, we have to end online privacy and know what everyone's doing and saying online. And so I think AI, you know, in that

(01:12:17):
paradigm, like, they'll only want to allow AI that, like, tracks and logs everything you're asking it, and then sends it back to the intelligence agencies. But

SP (01:12:27):
people care about privacy. I don't think they're just gonna
go along with that.

WW (01:12:33):
Yeah, I know. But the problem is, like, the infrastructure of the internet is actually pretty centralized, when you think about it. Like, most of the internet, like, runs basically on, like, 13 or something servers globally. Um, yeah, that's pretty centralized. And a lot of, you know, like, some of the people that run that, like, dominate the domain

(01:12:55):
name system of the internet, like ICANN, for example. They're very tied up in all these efforts to regulate the internet and, you know, in policies like taking down people's websites for thoughtcrime and stuff. So I think there is going to be a push, and I think people may still be able to use it in ways that they don't want. But yeah,

(01:13:16):
but you don't need to use it online. Right. But I think the only people that are going to do that kind of stuff are gonna be people that are, like, technic, technical, like, sophisticated technologically, and I think most people are not.

SP (01:13:28):
I understand what you're saying, but I disagree, because it's not hard to install. You know, I have one installed on my computer right now. It's not hard, you just install it on your computer. It's local. There's tons of local AIs that you can put on your computer. So, yeah, but

WW (01:13:43):
I guess, like, some of the negative impacts, like, I'm trying to talk about, you know, like, the data harvesting and sending it back to them for, like, predictive analytics, and, like, all of the stuff, like, harvesting data about you and whatever. And if they want to, like, go after quote-unquote thoughtcrime and all this stuff, which, like, honestly, they seem to be gearing up to do, like, how safe is it to use

(01:14:03):
that? I mean, ideally, you would look for AIs that don't harvest your data that way and send it to these guys, but, like, but

SP (01:14:10):
they take the data off the internet anyways. I mean, using ChatGPT is no different than using the internet. I don't think there's much of a difference.

WW (01:14:19):
But what I'm saying is, like, the internet is going to be regulated, and then the internet is not going to be safe to use, is my opinion, you know. And, like, how AI is going to be in that paradigm, I think it's also going to be fundamentally, like, very unsafe. Yeah,

SP (01:14:36):
I don't agree with you, but I understand your side, for sure, that, you know, you're giving your data to the AI. I think we're already giving it to them, they're already taking it, regardless of if we give it to them or not. And I think that there's a lot of things that are really powerful. You know, it's a tool like everything else. You know, people don't like people

(01:14:57):
who use Bitcoin. People don't like people who use all kinds of technologies, right? But I think that, I think that the thing is, is you have to know. I mean, this could just be me being idealistic or wanting to be able to use it, so I'm justifying it.

(01:15:18):
And I'm not even really using it very much. I mean, I'm exploring it to see, you know, kind of the things that it can do and stuff. But I don't really think that not using it is really that impactful. I think that you can get something out of it, you know, instead of deciding that you're gonna not use it. Yeah,

WW (01:15:41):
but I, you know, I feel like I've gone over a decent amount of, like, negative impacts of it on, like, people cognitively? Well, I mean, I guess I could have said more on that. But, you know, in terms of, like, a dual-use thing, you know, that was, like, the whole reason for, like, going back to Palantir, right? So, like, the name Palantir derives from, like,

(01:16:04):
Lord of the Rings, and it's, like, an object in The Lord of the Rings that is, like, neither good nor bad. It's, like, a powerful tool, and depending on who holds it, right, you know, determines whether it's good or bad, right? So I think AI is much the same. And I think, you know, once they regulate AI, they'll regulate for the purpose of having AI be as firmly as

(01:16:25):
possible in the hands of the bad people, I guess, is what I'm trying to say.

SP (01:16:30):
But AI is such a broad term. I mean, what are you talking about? Because AI, artificial intelligence, when it seems like people have just started calling it AI in the last year since ChatGPT, you know. So, like, generative models or whatever? Is that what we're talking about? Because AI,

WW (01:16:49):
I'm not talking about generative AI specifically, because that, like, generates text or generates images. I mean, it's very different than, like, some of the other AIs we've been talking about in terms of, like, military targeting, or, like, facial recognition, you know, and some of these other ones. And, I mean, we didn't really get at all into, like, you know, singularity, like, artificial general intelligence stuff.

(01:17:12):
Right. Yeah. But I mean, obviously, there's, like, different AIs, but I think ultimately, you know, whatever regulatory framework is passed through in the coming years is going to be focused on, you know, preventing AI that isn't under their control from being widely adopted. And, you

(01:17:33):
know, what does that, you know, mean for the utility of AI to the masses? I mean, I think maybe now people can get stuff out of it. But I think people also have to, like, be wary of the risks, and that ultimately, AI is, like, risk management from the elites, in the sense of, like, keeping you from doing

(01:17:55):
things they don't want, or, you know, bucking against the system they're trying to create. And, like, you know, it's a novel tool, it's a powerful tool, it definitely has positive use cases. But can we make use of those, given who's programming and maintaining and dominating the space right now? So

SP (01:18:18):
what are people using AI for right now? What people are excited about is using it to, you know, clean up their text, make pictures, you know, write stuff, whatever. I mean, there's all kinds of computer programs that can help you clean up your text, you know. It's kind of like, it's like a, you know, a one-size-fits-all type solution. It's something that

(01:18:41):
can do everything, instead of having to go to, you know, all these different apps and kind of doing it yourself. It's kind of, like, like a, just a better version of all of those things in one. Yeah,

WW (01:18:53):
I mean, I get that. I think what I'm worried about is people getting lulled into a spot where they can't work without it. Because, I mean, obviously, it's still novel, right? But think about, like, three years from now, and everyone is, like, writing with ChatGPT, and kids in school, like, instead of writing essays, are ChatGPT-ing them all, and they never actually

(01:19:15):
learn how to write, and, like, what kind of impact does that have down the line, especially when, like, these bigger thinkers are saying, like, this is what is going to happen, and tacitly saying, this is what we want to happen to the underclass. You know,

SP (01:19:29):
do you think that people said the same thing about
computers? Yeah, I

WW (01:19:32):
mean, I'm sure they did, and, like, television, and all of that stuff, right? And I mean, I don't necessarily think they were wrong about a lot of the risks, but the problem is, people were never really, I think, at a wide level, made aware of the risks. And it ended up, like, having those negative consequences once, like, the novelty sort of wore off. So I

(01:19:55):
guess what I'm saying is, people have to be aware of the, I guess the, where they wanna take this, and, like, sure, it's fine to use for now, but just be aware of where they want this to go, and, like, make sure you have red lines that you won't cross about the stuff, and about, like, what happens when the regulatory hammer comes in and they try and end online privacy

(01:20:18):
entirely. Because, I mean, like you said, now, like, you know, people care about their privacy. But I mean, as I've done, I've done a lot of work on this, you know, over the past few years, about, there's definitely going to be some sort of event where, you know, privacy, online privacy, is the enemy, and the only way to stop these cyber attacks, or whatever they are, is
(01:20:43):
to eliminate privacy online,like we have to D mask everyone,
or unmask everyone, and we haveto know who everyone is. And you
already have people like JordanPeterson has been pushing for
this, Nikki Haley, and a bunchof people and like, you know,
right leaning. And then alsolike, you know, on the left,
there's pushes for it, too. Imean, it's it's a pretty like,

(01:21:03):
talked-about thing. Even Elon Musk, before he bought Twitter, was talking about, like, verify all humans and all of this stuff. I think that is a red line people should definitely have and not cross, is when they start doing the link-your-government-issued-ID-to-your-online-activity thing. If you want to know why I think that, please refer back to all my reporting on the war on domestic terror and the infrastructure for that. And

(01:21:28):
because, honestly, it's targeting people that, like, you know, I mean, what would be viewed as traditional Americans, the domestic terror stuff, but also anyone who's, like, against the state or state policies, or antiwar. I mean, probably people that listen to this podcast. Environmental, yeah, environmentalists are on there, too. Yeah. I mean, people assume

(01:21:49):
that it's, like, all, you know, people on the left think the domestic terrorist stuff is all for, you know, right-wing people who were at January 6, and blah, blah, blah. And, you know, people on the right, you know, will think it's for, I don't know, Hamas supporters, or whatever. I don't know what the rhetoric is at this point, but I'm sure it's dumb. So. But ultimately, it's about anyone that threatens, you

(01:22:14):
know, or isn't willing to comply about certain things. So, you know, don't make it easy for them. Because, you know, here's the thing about the ID stuff on the internet. They already know what you say online, and what sites you visit, and all of that. You linking your ID to that isn't going to give them greater visibility, necessarily, beyond what they already have.

(01:22:36):
The difference is, once they can link your ID to that, they can legally go after you. Because the way they're, like, spying on everyone is technically illegal and unconstitutional, so they can't prosecute you, necessarily, on that stuff that they obtained illegally, you know. Right. And so they can, if they can tie

(01:23:00):
your ID, legally to it,

SP (01:23:03):
if they say this is a law, to use the internet you have to be using it with your, you know, because right now, you could say somebody else used my computer.

WW (01:23:12):
Yeah. There's more of a gray area now. And also, like, you know, the illegal wiretapping and all of that of communications, they can't, I mean, you know, they can use that to get, like, warrants and stuff, maybe, in these, like, FISA courts and stuff, but it's, like, they can't go after most people with that, you know. And it's not like they want to put

(01:23:34):
everyone in jail. But, you know, as an example, under the Trump administration, they almost created this agency called HARPA, that actually Biden ended up making, but he changed it to ARPA-H. But it's, like, health DARPA is the idea of it. And it's the same people that were trying to push it in the Trump administration, too. And the first program they wanted to put out,

(01:23:56):
which was promoted by Jared Kushner and Ivanka Trump, was called SAFE HOME. I've written about it before. It's an acronym for something. And basically, that program was about using AI to go through social media posts and identify social media posts for early neuropsychiatric warning signs of violence, all of this being under the guise of stopping mass shootings before

(01:24:17):
they happen in the US. It wasn't just, like, oh, okay, it gets flagged, and it sends people to prison. It was, like, send them to a court-ordered psychologist and stuff, and, like, medicate them, or, like, put them under house arrest. And there was, like, a whole variant of, there was a whole, like, spectrum of stuff you could do to someone who gets flagged by this thing, right?

(01:24:41):
And the best way to not be flagged is to not be on it at all. Because, again, AI is really inaccurate, or can be in certain situations, and can misunderstand certain things and not be able to parse certain things. It's probably not great at detecting sarcasm, for example. And if these programs come to fruition, it's not going to be

(01:25:06):
very good, you know, because people that don't deserve to be caught up in this mess are going to be caught up in this mess, basically. And it was pitched during Trump. It almost happened. Biden created the agency and a lot of the other infrastructure for domestic terror, but I'm sure that kind of program is going to be here soon, whether it's Biden or

(01:25:28):
Trump. I mean, I don't think it really matters. I mean, during the Trump administration, they legalized pre-crime, which is something that, like, hardly anyone knows about. William Barr created a pre-crime program, that's still Department of Justice policy, called DEEP. You know, they've arrested people and put them in prison for social media posts and stuff.

(01:25:51):
That could escalate. I

SP (01:25:53):
feel like, when you say, though, not be on it at all, you, I don't see a difference between what you're saying about AI, or, you know, whatever you mean by saying AI, and the internet.

WW (01:26:06):
Yeah, I mean, I see what you're saying. But I guess, like, what I mean is, like, social media, like, being, like, I'm talking about, like, once you add the ID thing, you know, or, like, once these programs get rolled out where they're trying to, like, hunt for domestic terrorists, like, don't, the easiest way to make it hard for them is to just not engage with

(01:26:26):
that system. It's weaponizing a system that used to be good against people, you know, in a way that's, like, unconstitutional and completely insane, frankly. And, like, maybe, you know, it was great before to use social media to reach people. And for certain things, it's obviously had, like, some negative consequences, social media, particularly, like, on

(01:26:48):
young people and stuff. Yeah. You know, I'm not saying, like, illegalize social media, but these people are trying to twist and use all of these things that we've gotten used to, or dependent on, for various things. And then, you know, under the Palantir model, this dual-use neutral tool thing, you know, they're trying to turn it to the dark side, right? If you get

(01:27:11):
what I'm saying. So, like, I guess what I'm saying is not, like, necessarily, like, be a Luddite, but once this stuff, these, these things happen, yeah, these regulations and laws and programs come in, you should not engage, or you should engage with something completely parallel that does not interact with that system. Or you should, I mean, the internet is just a

(01:27:33):
bunch of shared servers. I mean, there's nothing really stopping people from making some sort of parallel server system where you can still do some of this stuff, and some of this stuff can get out, you know what I mean? My problem is about, like, the centralization, and how, literally, the worst people in the world are trying to use AI for particular ends, and we

(01:27:54):
definitely can't use their AI going forward. We have to find a way to stop that. And in that sense, it's also true of the internet. Yeah. Because they're, they're, you know, doing similar things to both, I think. Yeah. Yeah,

SP (01:28:09):
I think that, you know, the local AI thing, I think, is, you know, there's, like, this, there's all different kinds of them. You know, and I think right now, everybody's just really excited about the fact, I read somebody say something like, that the fact that a dog is talking, you know. Like, they're not excited so much about the thing, they're excited about just, wow, look

(01:28:31):
what I can do. Yeah. But yeah, we can take away the ability that they have to control our lives by, you know, cultivating what we want, and by, you know, following websites through RSS instead of, you know, going to Twitter to get your news. I mean, there's lots of things that we can do. But I also think, I don't think they're going to make it so that you

(01:28:53):
can't use a language model, you know, on your computer. I don't think that that's going to happen. I think that, what if

WW (01:29:01):
they're like, you have to get digital ID to use it. Do you
think that's likely?

SP (01:29:05):
I don't think that that's possible. I think as long as you can buy a computer, I think that you can install whatever you want on your computer. And there's no way that they're going to be able to stop you from installing that as a program, unless they make the entire thing illegal. Like, are they going to make large language models illegal? No.

WW (01:29:23):
But if it's, if it's, if your access to it is, like, an account-based thing, I mean, I think the big ones could definitely require, most

SP (01:29:31):
of them do, a lot of them. You have to pay money to even use them. But, like, they're not all owned by the big corporations. Well, for

WW (01:29:38):
now. I mean, OpenAI and ChatGPT is basically Microsoft. And I'm sure a lot of the other ones will get swallowed up.

SP (01:29:45):
Yeah, I mean, who knows what's going to happen in the future, but I don't think that it's just going to all, there's plenty of people that are interested in it, and enough to make sure that there's stuff that isn't, like, corporately controlled. I just don't see that it's going to end up like something that we can't use to our benefit. Yeah,

WW (01:30:10):
I'm just, I'm sure some stuff will slip through the cracks. I guess what I'm saying is that they're going to make it so that you have to be really, like, technologically sophisticated in order to do that.

SP (01:30:20):
I also don't agree with that, because it's easy to install a program on your computer. And there's plenty of people, there's so many people online who are, like, trying to help people divest from this sort of stuff, you know. All you have to do is know how to get on Reddit. But to what you say, well, you know, maybe this information is gonna be harder, you say the internet's totally gonna change, maybe people

(01:30:40):
aren't even going to be able toget on the internet unless
they're willing to give up, youknow, their ID, you know, and so
you have to decide, yeah, so,you know, you brought up
awareness, you know, and that'skind of like what I wanted to
talk about to about that bookthat I was reading. So I don't
know if you want to talk aboutthat now or later or what? Sure,

(01:31:02):
go for it. Okay, well, I heard about this book that was written in 1964. It's called Understanding Media: The Extensions of Man, by Marshall McLuhan. This guy is considered, like, the father of modern media studies. He, after this book came out, some ad executives kind of swooped him up and put

(01:31:23):
him on a tour of, like, you know, interviews, and he was on a lot of TV shows and stuff. He kind of reminded me of, like, like a Bernays, or something, somebody who, they were, like, really embracing his ideas, and kind of everybody was talking about him. And so he is the person who coined the

(01:31:43):
phrase, the medium is the message. And so he kind of thinks that we shouldn't necessarily focus on what the content of the medium is, but what the actual medium is, and study that when you're looking at, like, its effect on people. Understanding Media: The Extensions of Man. So what is a

(01:32:05):
medium? He says it's an extension of ourself, or any new technology. So, like, the wheel is a medium that extends our senses. He defines it as, like, something that extends our abilities past ourselves. And so a light bulb is a medium, a fork

(01:32:26):
is a medium, because that is an extension of our fingers. A light bulb is an extension of our eyes. Anything that allows us to sense further than ourselves. And so he talks about how we used to be in the mechanical age, and that would be, like, the wheel and stuff like that. But we're now in the electric age. So when he talks about media being an extension

(01:32:47):
of ourselves, it's basically, like, it's an extension of our central nervous system, because it allows us to know more about things that we can't see right in front of us. And so now that we're in the electric age, we can, everything is instantaneous, you know, we can access all of the information.

(01:33:09):
Right away, whatever we want. There's a quote from the book: what we have to consider is the psychic and social consequences of the designs or patterns as they amplify or accelerate existing processes. So when you're looking at a medium, the message is the ways that it changes society. So, like, that's

(01:33:34):
what I think that we need to do. We need to look at AI and kind of examine it in a way where we can understand, not necessarily, you know, the content that it has, but how it's changing us. And, you know, we talked about that a little bit, but I think that brings, when you, when you think about it like that, it

(01:33:59):
allows you to understand it in a different way. Instead of focusing on what it's doing, like, what it's doing for you, we can think about what it's doing to all of us as a society. And once we understand that, then we can maybe decide if we want to do that, decide if we don't want to do that, you know, that type

(01:34:21):
of stuff. So

WW (01:34:22):
if you, if you look at, you know, AI as, like, the newest iteration of this extension of the central nervous system, then I guess then people coming in to manipulate AI to, like, instead of, you know, being part of the extension of our efforts to sense and understand the world around us and, like, process

(01:34:44):
information, truthful information, and, like, seeking that out, it's, like, a way of sort of hijacking the next iteration of that to, like, lead us in a different direction, you know? Yeah, like, instead of leading us to finding more about our reality and understanding the world, to lead us to sort of,

(01:35:05):
like a closed off system,hurting us into that, and sort
of trapping our central nervoussystems, they're

SP (01:35:16):
totally sort of how I

WW (01:35:17):
see it. So I guess then, in a sense, what we have is a decision: how do we avert that diversion, AI being used in that diversionary way by the powers that be, and keep it as a thing that sort of helps us to expand our access to

(01:35:38):
information? And I think ultimately it comes down to who's, you know, making the rules and, you know, dominating the AI industry, and how we can decentralize that and prevent this extreme, centralized control over all of it.

SP (01:35:55):
I guess I don't really think the control of AI as a medium is going to be able to be centralized.

WW (01:36:03):
Well, I hope not. But I think, again, it comes down to, like, what people are going to do to prevent that. And what I see right now is that the people that are trying to prevent that are, like, much more technologically sophisticated than everyone else that's using it, right?

SP (01:36:18):
And then it's easier to just go along with the way things are than to try and, you know, be conscious about things. Yeah, yeah, totally. Totally. I agree with you. I do agree with you. I mean, I know it sounds like,

WW (01:36:34):
you agree. No, no, but I think it's good, because it just helps me explain it better.

SP (01:36:38):
Like, no, I love it, because this is the first time I've heard you talk about the book. I've told you a lot about the book, but I haven't heard your response to it. Something else interesting that he talks about in the book is how all media shapes our identity, and that all new media contains content from the previous medium. So like, books contain content from

(01:37:03):
the previous medium, which was handwritten text, like short manuscripts and stuff, and then the printing press, and then, like, radio and TV. So, like, TV contains, you know, radio and plays and stuff like that. And so it all kind of shows us the past; the new media shows us the past. And, like, we live in this

(01:37:24):
idea of what the past was, because the new thing is showing us the past. It's kind of interesting, and, like, yeah, it shapes our identity. I found it interesting that artists and writers and stuff like that are the ones that are threatened right now by AI, as they should be, because their

(01:37:45):
jobs, you know, and their ability to make money are going away. But it's just interesting, because AI is actually a threat to their identity as artists, as creators, you know?

WW (01:37:58):
Sure. Yeah. Kind of disturbing to think about it that way. But yeah. No, I feel like I've heard that somewhere before, and I mean, it does make sense. But something I've thought about before, about identity, and I've said this in some interviews maybe a few years ago, um, you know, is in terms of the control of information, why it's so important to the elites, you

(01:38:21):
know. It's because it shapes our identities. So our identities are shaped by who we think we are, where we come from, you know. It's all about our history, and also, you know, human history, the history of our families, of our communities, our societies. And so if these people control how history is not necessarily just written,

(01:38:43):
but how it's remembered, if they have control over memory, then they can control our identities, I think, or how people perceive them. And so I think that's why they want such extreme control over the flow of information right now. And I think they're definitely trying to use AI for those ends, which, again, is why I think it's very important that

(01:39:03):
people have physical books. And if you don't have physical books, have offline copies of other books, and remember to read, you know? Yeah, it's good for you, and it's good for your brain as you age and all of that stuff. And, you know, it's sort of getting phased out at the societal level, it seems like, and I think we should definitely resist that. Because, you know, if they centralize control, automation, and all of this,

(01:39:26):
they'll invariably control all the historical accounts of how we got here, all of that, and, you know, who the winners are, and all of that. So, you know, I think you can't

SP (01:39:39):
learn from the past if they control the story of the past. Yeah.

WW (01:39:43):
Or if the past you're being told about never even happened. And I think a lot of, you know, my work, specifically on history, and also my book and stuff, you know, is sort of trying to find out what really happened and how we really got here, and just trying to answer questions like, how did Epstein happen, you know? I mean, it's obviously a lot more than that, but that's sort of how I got to answering

(01:40:06):
those questions. And there's a lot of history that's intentionally hidden from us by these people, you know, in the historical texts, textbooks and whatever. They give very specific narratives that oftentimes are not accurate, you know, and that's used to shape identity. So like, you know, US public

(01:40:28):
school American history classes: pretty much every textbook I ever encountered in grade school was like, the US government is your father, and it's been on this steady stream of progress from the revolution to now, and it's so great and protects freedom and does all this great stuff. And then you find out the real stuff, and you're like,

(01:40:50):
what? You know, I mean, I'm pretty sure most people listening to this podcast have gone through that to some degree. But what happens when those alternatives for finding out what's going on aren't readily available anymore? And how will that impact how people view and feel about themselves? And I think, you know, that's part of it. But I think also, in

(01:41:11):
terms of our understanding of human history, I mean, there's so little we know about the distant past and all of that, and one of the reasons allegedly for this, you know, is the whole burning of the Library of Alexandria and all of that. But with the internet, you know, if these guys take down the internet and try and relaunch it, you know, they

(01:41:32):
could do something like that again, you know, like a digital version of that. Yeah, because so many people have stored knowledge, and books that aren't in print anymore, and other things, purely online. And so again, that's why I really like to tell people to try and make some sort of offline or physical library, you know, because they're definitely

(01:41:52):
interested in purging, you know, historical accounts that don't favor how they want people to perceive things. And again, this is all about managing perception with the intention of, you know, controlling behavior. And a lot of that is memory, and a lot of that is identity. And ultimately, you know, it comes down to information and data,

(01:42:12):
ultimately, you know. Yeah, yeah. So, anyway, that's my soapbox about identity and information.

SP (01:42:21):
Okay, so I have two more quotes from the book. Well, okay, so I want to read this first quote, and then I'll read another one: "If we understand the revolutionary transformations caused by new media, we can anticipate and control them. But if we continue in our self-induced subliminal trance, we will be their slaves." So this is why we're having this discussion. This is why we need to talk about AI, what it is,

(01:42:44):
what it's doing to us, and everything like that, right? So the self-induced subliminal trance, he talks about this a lot in this book; he calls it Narcissus narcosis. So here's another quote from the book: "The hybrid or the meeting of two media is a moment of truth and revelation from which new form is born. For the parallel between two media holds

(01:43:06):
us on the frontiers between forms that snap us out of the Narcissus narcosis. The moment of the meeting of media is a moment of freedom and release from the ordinary trance and numbness imposed by them on our senses." So that's where we're at right now, you know. We're not numb to it yet, and we're right there at that spot where we can examine it and decide

(01:43:29):
what we're going to do, you know. So we need to be aware of it; that's why we're having this discussion. Again, we need to be aware of how AI changes the way we interact with people, information, and our surroundings. You know, we can't remain ignorant of the environment that we're living in. Yeah,

WW (01:43:48):
I mean, AI is rolled out as a novel tool, and this has happened with other stuff before, and if people aren't wary, like, paying attention to it, it quickly moves from being a tool to empower them to something else. And there's always this phase at the beginning, where they want people to onboard to a particular technology, where it

(01:44:09):
is open and useful like that, and then it starts to change, you know, if people aren't wary about it. So if you're using AI, make sure you're using it in such a way that it's a tool that is helping you, not one that is diminishing you, and not one that is endangering you in the event that this war on domestic terror, predictive policing, dissident stuff, whatever, you know,

(01:44:33):
all that stuff, when that gets rolled out, obviously you should reconsider what you're doing. But even before then, you know, you have to be aware of how you're using it and think that stuff through. Because if you just keep using it because, oh, it's convenient, convenient, convenient, that has been used historically to herd people in a particular direction that isn't good for them or for human society. So I definitely think

(01:44:56):
it ultimately leads to dumbing us down, right? Which, you know, there has been a progressive dumbing down of society, definitely in the West, but obviously it's happened elsewhere too. And, you know, I would argue that there's a lot of intentionality behind that. It hasn't all been technology's fault, but it's definitely been an engineered

(01:45:17):
thing, and technology has been used in part to facilitate that engineering. So if you're going to engage with this kind of technology, you have to be aware that it can be used to do that to you, and there's an intention to have that happen to people who become dependent on it. So don't become dependent on it. You have to keep your relationship with it so it's a tool serving you, and, you know, the

(01:45:40):
tables don't turn one day so that you're serving it, you know? Totally, but

SP (01:45:45):
it's dangerous, you know, because even going into it knowing everything that you said, it's a narcosis, you know. Like, you can totally become immersed and unaware, you know, and all of a sudden the conveniences of it are much too great to even consider anything

(01:46:07):
else. Yeah,

WW (01:46:08):
I mean, if you're one of the people that feels like you can't handle that kind of situation, maybe you shouldn't use it at all. No, yeah, totally. And that's my view about it. But I mean, you know, you've just got to be wary that it's, like, a tool that can be used against you if you're not careful. And, I mean, it's subtle, you know? Yeah, I mean, I go back to the social media stuff, and how this was, like,

(01:46:29):
sold to people as, oh, you can stay connected with everybody, and it's gonna make things so much better socially. And there was no talk about all the data you're giving away to it, and it turned out to be, like, a huge data-harvesting thing that actually made people more depressed, made people feel more disconnected, right? Yeah, it doesn't necessarily always have those consequences long term, and maybe if social

(01:46:54):
media hadn't been so, I don't know, co-opted from the very beginning, maybe it would have been different. I mean, we know also, like, Facebook, for example, experimented with making people more depressed by populating their newsfeeds a certain way. So maybe they've intentionally produced that outcome of making people feel more disconnected and more, you

(01:47:15):
know, more depressed when they use it and all of that. Like, maybe it's intentional, and not necessarily social media that does that to people, I don't know. But it has definitely changed things. I mean, people engage differently with the discourse on social media than they would in the real world, you know, and it's definitely had a lot of consequences that I think, you know, users of it don't

(01:47:38):
necessarily think about. And then, over the years that you're using it, you get acclimated, and that gets normalized, but it wasn't normal, you know. And so I think people need to be worried about that kind of stuff with AI, because that's how they did it before, you know. Like, oh, everyone's using it, look how cool it is, look what I can do. And then you're on it, you give all your data away, and then you end up

(01:47:59):
feeling diminished, you know?

SP (01:48:02):
Yeah. In this book, they talk about numbing our central nervous system. We are so stimulated with all of this stuff that we can get, you know, that we're numb to it, you know, and, you know, it causes a lot of anxiety and stuff like that. I think I'm just gonna say there is a lot in

(01:48:29):
this book that I think is worth considering. And I know we talked about how people should read books, and I agree, I love reading books. You learn so much from them; you learn so much more by reading and understanding an idea and thinking about it than you do from hearing somebody talk about it. But not everybody can read, you know. Not everybody has a lot of time to read everything that gets recommended to them or

(01:48:51):
whatever. So I think I want to put some of this stuff in the show notes, so, you know, people can, at least if they're not going to take the time to read the book, maybe read some of the ideas in the book. Because I think if you can understand the things to consider, it's going to at least help a little bit in deciding where you're going to place AI in your life, you know?

(01:49:15):
Yeah,

WW (01:49:16):
I mean, I think it's really important to, you know, think about this stuff. Are you going to use it? How are you going to use it in such a way that it doesn't negatively impact you? Think about what your red lines are, and pay attention so that you don't cross them. Are you avoiding crossing them? I think that's really the only way we can interact with this stuff, because to do it without being cognizant of the

(01:49:38):
risks, or aware of some of the agendas that AI is meant to serve, and how, you know, the people dominating the field right now are seeking to use it, mainly against us... You know, we have to be aware of that and ensure that it's not going to be used against us that way, at least as

(01:49:58):
much as we can control, obviously. So yeah, I guess that's probably a good place to leave our discussion. So thanks a lot, Star, for being here. And thanks to everyone who listens to this podcast, which is, of course, a production of Star and myself, even if Star isn't, you know, necessarily part of

(01:50:19):
the conversation. She's always part of the podcast. And with that being said, Star is also the person who does the show notes for every episode. So I'm not sure if everyone listening takes time to look at the show notes, but you definitely should. And I'm sure no one is better at telling you why than Star herself.

SP (01:50:38):
Thank you. I just wanted to, you know, remind everybody: so when you get your podcasts in the podcast app, there's the description, and I always put the show notes page in there. So you should go there and check it out, because there's all kinds of stuff on there, anything that Whitney's talked about. You know, there's some extra information sometimes,

(01:50:58):
like she's done an interview or something like that that's related to what she talks about, you know, all kinds of stuff. Everybody knows what show notes are, right? But sometimes there are, like, playlists of clips from the podcasts and all kinds of stuff like that. So definitely, please check out the show notes page. And I wanted to say, while you're there, check out the website, too. You know, a lot of people don't spend

(01:51:18):
a lot of time exploring. You might go to the website when you see a link to an article or something like that, but there's a lot of stuff on there. Like, people are always emailing asking, how can I find out where Whitney's new interviews are, and stuff like that. There's a press and media page; we put all of her interviews on there, so you can find all that. There's an awesome search bar on the website. So if you're

(01:51:40):
interested in something like, you know, CBDCs, just type that in the search bar, and if Whitney's done an interview on it, if she's done an article on it, if it comes up anywhere, it shows up. It's great, really awesome. And then also check out the FAQ on the website, the frequently asked questions page, and it's got all kinds of info, including stuff like how to follow a website with RSS, which is the

(01:52:05):
technology that's used in podcasts. You know, when podcasts publish a new episode and it just shows up in your app, that's RSS. So you can do that for websites too, and then whenever a website publishes something, it'll show up in your app. It's great. And then, along those lines, I wanted to mention that you should listen to your podcasts on a

(01:52:25):
podcasting 2.0 app, which is, you know, more advanced. It's got more advanced features: it's got transcripts and chapters, and you can make clips, you can leave comments, you can send lightning payments to the podcasts that you listen to, all kinds of really cool features. You should totally be listening on something that supports that sort of stuff. So I really love

(01:52:48):
this app called Podverse. You can use the app, or you can use it on your computer at podverse.fm. Super awesome. And yeah, that's about it, and thank you to everybody who supports Whitney, because that also supports me.
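
[Editor's aside: the RSS mechanism Star describes, where an app periodically fetches a feed file and surfaces entries it hasn't seen before, can be sketched with Python's standard library. The feed XML below is a made-up example for illustration, not the actual Unlimited Hangout feed.]

```python
# Minimal sketch of what an RSS reader does: parse a feed and list its items.
# A real app would fetch the XML over HTTP on a schedule and remember which
# items it has already shown the user.
import xml.etree.ElementTree as ET

FEED_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Site</title>
    <item>
      <title>New Post</title>
      <link>https://example.com/new-post</link>
    </item>
  </channel>
</rss>"""

def list_items(feed_xml: str):
    """Return (title, link) for every <item> in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

print(list_items(FEED_XML))  # → [('New Post', 'https://example.com/new-post')]
```

[When a podcast app makes a new episode "just show up," it is doing exactly this on a timer: re-downloading the feed and displaying any items it hasn't seen before.]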

WW (01:53:07):
Yeah, so

SP (01:53:08):
Thank you. Thank you to the people listening to this. I never get to say that. Thank you!

WW (01:53:13):
Yeah, thanks.

SP (01:53:18):
You can cut that out if you want.

WW (01:53:20):
No, it's fine. I liked it. Well, Star is amazing and has kept Unlimited Hangout alive; it probably wouldn't have survived last year, and some other things, if it wasn't for her, so she definitely deserves your support, for sure. Um, and thanks to everyone who's supported, you know, the podcast

(01:53:41):
and my work, up and, you know, through now. I know I haven't been producing as much content as I used to. I've tried to keep members kind of updated, without getting too personal, about how, you know, things with my son are still going on, and, you know, I had to move, and all sorts of other stuff has been happening, and, you know, there's obviously some other stuff going

(01:54:02):
on. But I'm hoping to get back to, like, a normal content production thing pretty soon, hopefully once, you know, the kids are back in school in March, which is a little backward from the US, because remember, I live in the southern hemisphere, so the seasons are backwards; it's summer vacation here now for us. But thanks to everyone who's been, you know, really supportive through all

(01:54:22):
this crazy stuff. I'm sure things are only gonna get crazier, not just for me but for everybody. But I just want to say, you know, thank you all for allowing me to continue to do this work and to support other people who, you know, support me and the site, like Star. I just can't thank you guys enough. Hopefully you enjoyed this podcast.

(01:54:44):
Hopefully I'll get, you know, more back in the groove with having them out, you know, every two weeks like I did before, you know, starting now-ish. Yeah, thanks, everyone, for listening. Hopefully you got something out of the conversation today. If you did, please share this podcast around, feel free, and we'll catch you on the next episode. Thanks so much.