Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:36):
Welcome to Unlimited Hangout.
I'm your host, Whitney Webb. For much of the year this podcast
has been exploring the phenomenon of the PayPal
Presidency. That is the extremely significant influence
of the PayPal Mafia and their allies in Silicon Valley on the
current Trump administration from healthcare to finance and
beyond. Today, we will be taking a look at how big tech
(00:56):
neoconservative think tanks and the Trump administration are
shaping current education policy, including the so called
school choice movement and the promised dismantling of the
Department of Education, which seem at first glance to be about
reducing federal control over education. However, the deeper
reality is anything but. Other education policies we will be
(01:17):
interrogating today include efforts by the current
administration to control anti-war protests on campuses
under the guise of combating quote, unquote anti semitism, as
well as how this effort and other potential future efforts
to control speech in classrooms and on campuses could soon be
aided by the growing importance of the FinTech industry on
(01:37):
education funding. We will also touch on how AI analytics of
students' learning metrics and other data harvested by big tech
are being used to steer and determine children's futures.
Joining me to discuss these topics and more, is John
Klyczek. John has a master's in English and has taught college
rhetoric and research argumentation for over a decade.
(01:57):
He is the author of the book School World Order, and is a
contributor to several publications, including
Unlimited Hangout. Today we will be focusing on John's recent
pieces for Unlimited Hangout on education policy, the first
being School Choice Corporatization, Social
Impact Finance, and the Dismantling of the Department of
(02:17):
Education, and the most recent being From Project 2025 to
School Choice FinTech for a Blockchain Social
Credit Economy. Thanks for joining me today, John, and
welcome back to Unlimited Hangout.
It's great to be here, Whitney. Thanks for having me back.
Of course, my pleasure. So a lot of your recent work on
education for Unlimited Hangout has focused on the groups behind
(02:41):
the so called school choice movement. And school choice
sounds like a good thing just looking at the term
superficially. So why do you believe this movement is hiding
ulterior motives and who benefits?
Well, I mean, you know, the reason why I think that it's not
a great thing goes back many, many years, and it really stems
from the research of my mentor, Charlotte Thomson Iserbyt, who
(03:04):
worked with the Ronald Reagan administration. She was a
Senior Policy Advisor in the Department of Education, and she
blew the whistle on something called Project BEST, that's
Basic Education Skills through Technology. It was a program to
set up public private partnerships between big tech
companies, Department of Education, other local education
(03:27):
agencies, in order to set up, basically, operant conditioning
through computerized education. And I've written a few pieces at
Unlimited Hangout, sort of tracing the history of that and
how it ties in with something called UNESCO Study 11, and how,
basically, how that set the groundwork for all things global
ed tech and fourth industrial revolution. But during her time
(03:49):
at the Department of Education under Reagan, she was also a
liaison with a task force on private sector initiatives, and
during her time there, part of that private sector
initiative was the school choice movement. And as the
liaison, she basically asked the question, isn't this like
corporate fascism? And they were all kind of like, gee,
(04:10):
Charlotte, we didn't think about it like that. And then, you
know, they moved her to another office at that point. But you
know, that is the way that she pointed it out from day one.
Essentially, you know, she wouldn't have used 'stakeholder
capitalism'; that wouldn't even have been the phrase she used,
but it's public-private partnerships. It's the merger of
corporations and the state. That's, you know, sort of
Mussolini's definition of fascism. Yeah, it's probably an
(04:33):
apocryphal quote, but colloquially, we understand that
to be his definition of corporate fascism. And today we would call
that, you know, stakeholder capitalism, right? It's the
mixture of the public and private on a global scale. So
basically, it's stakeholder capitalism for schools. It's
going to subsidize private companies with public tax
dollars, and it's also going to remove any sort of civil
(04:55):
process, right? So you're not going to have a local school
board. Part of this school choice movement, more
broadly, is the charter schooling movement, which is pretty much
locked in. So charter schools are private corporations that
are subsidized with public tax dollars. But then this next
round of school choice is basically these different funding
(05:16):
mechanisms. So, you know, we might just call them all
vouchers colloquially. I think part of the reason why they've
come up with these different terms, like education savings
accounts and scholarship granting organizations or tax-credit
scholarships is largely to kind of muddy the waters and make
it harder to figure out what they're doing. Now there
are some nuanced differences between how the
(05:39):
money is distributed and accessed between, you know, a traditional
voucher, an ESA, and a tax credit. But basically what they
all have in common is you're taking public tax dollars
instead of subsidizing a school or, you know, a charter
school private corporation, you're subsidizing a
basket: it could be a charter school, could be a private
(06:01):
school, could be a religious school, or could just be any
range of ed tech products and services, you know, including
now they're adding in therapy, so, like, mental health
services as well, which, you know, could also be serviced with
different technologies we might talk about. So ultimately,
right, all that together indicates that, you
(06:22):
know, as I mentioned, with the charter
schools, they don't have an elected board,
right? And the same thing if you take those
vouchers, or those ESA monies, and you purchase, you know, your
tech products; I mean, essentially, you're being
serviced again by a company that doesn't have an elected
board. So whether it's the charter schooling system or the
(06:43):
sort of neo-voucher system, the school choice movement, again,
right, it's basically destroying the public sector. It's
privatizing everything, and then it's also cutting off any
routes that we would have had towards any sort of civic
recourse, right, through an elected school board. So those,
those are all the bad things about school choice.
(07:06):
So in that kind of scenario, what happens to the kids that
can't go to these private institutions, who are, like, the
bulk of public school students, right?
That's the thing. I mean, that's like, if you really think
about the situation, right? So the pitch is, oh, fund
students, not schools, right? And the idea is
(07:28):
that, well, how is it fair that you know, if you're stuck in
this zip code, that you should be forced to have to go to that
school? You should have an option if you want. And so the
pitch, you know, like people like Corey DeAngelis, the people
that are really selling this, if you, if you listen to the
buzzwords and the sloganeering, it almost sounds like these,
these students in these impoverished zip codes are going
(07:50):
to be able to take their voucher money and go to, like, some elite
private school.
Okay, let me, that's the sales pitch, yeah.
But they're nowhere nearby. Okay, so the first problem is transportation:
if you're in a very impoverished neighborhood, right? It might be
a food desert, and also, you know, have problems with
transportation. Now you can use some of your voucher money for
transportation, but then you're not going to have as much left
(08:11):
over for the tuition if you could even get to that school.
The other thing is, the amount that they're going to give you
with these various ESAs, vouchers, scholarships, etc.,
wouldn't cover the entire tuition anyway. So where is
all that money going to go for the average
student that actually, you know, really wants to get out of the
stuck position they're in with whatever, you know, with their
(08:33):
public school, with their so-called government
school. You know, the main thing that's going to be their option
is maybe there's a charter school nearby, or maybe they
could go to another public school in a nearby district, but
most likely, most of that's going to go to ed tech products
and services that can be serviced online, right, in their
home, right? So basically turning your home into the
(08:56):
government school, because with those vouchers come the
government regulations, right? So a lot of the other pitch was
that, oh, you're going to get rid of the woke stuff. And, you
know, during lockdowns, everybody, you know, that didn't
want their experimental gene therapy, you know, they exited
the schools thinking that, right? This whole they're going
to build this homeschooling co-op pod movement, and they're
(09:16):
going to sort of create a parallel structure. Well, that
could be the case. But if you take that voucher money, as I've
shown in the first article in this series, at the state
level, and the federal pitches are based on them, all of those come with
some sort of strings attached, right? And some of them include,
like, you have to be up on all your health code regulations,
(09:39):
which would mean, right, you'd have to have your
experimental gene therapy, right. And so all of that to say
that, right, the money that you're going to get is
largely going to be used for bringing
government regulations into a quote, unquote homeschool
environment, and also bringing in big tech to data mine the
(10:02):
students in your home as well. And so, you know, that's
what those children are going to get with those vouchers. Okay,
so a few months ago, Katherine Boyle, who's a general
partner at Andreessen Horowitz, which is a VC firm that's very
much in the PayPal Mafia orbit via Marc Andreessen and other
(10:23):
figures, told an audience at the neocon think tank the American
Enterprise Institute that, quote,
technology is the backbone of the education choice movement,
meaning school choice. So how are Silicon Valley, quote,
unquote, Libertarians and some of the people in Silicon Valley
(10:43):
backing these pandemic pods and kind of this
idea of a parallel system, at least, you know, publicly. And
how are they teaming up with neoconservatives to sort of
shape the role of technology in US schools, whether it's public
schools or some of these schools that, you know, the
(11:04):
Silicon Valley people are sending their kids to,
Yeah, well, it's interesting to note that, I mean, right
there, you have a connection between an executive at
Andreessen Horowitz and also one of the Koch-backed State Policy
Network think tanks known as the American Enterprise Institute,
(11:28):
and one of their representatives is actually one of the authors
of Project 2025. And the Project 2025 education section does call
for transforming a certain portion of Title I
funds into ESAs, right, ESA vouchers. And then they also
wanted to push this ECCA bill that would basically get tax-
(11:53):
credit scholarships authorized. Now they didn't pass
that, but the quote, unquote Big Beautiful Bill did ram
through something that actually kind of combines the worst parts
of both of those bills. So the scholarships for the ECCA
were just supposed to go to tuition for schools, but what
they have now is this tax-credit scholarship program, so
(12:15):
they can use it for a basket of curricular materials,
educational therapies, personal tutors, etc, etc, okay. And so
just, that's just one little example of sort of where, like
the Silicon Valley venture capitalist tech bros, sort of
overlap with sort of the old school neocon Koch brothers
(12:36):
think tank consortium. And so, a couple of ways that sort
of the Silicon Valley arm of this partnership is
pushing sort of Ed technocracy. So one we could look at would be
the FinTech angle. The other one would be the actual educational
(12:57):
technology products that would be funded by the FinTech
product. So in general, what we have is they want to service
these vouchers, scholarships, education savings accounts, et
cetera. They want to service these through these digital
wallets. And so you have a range of various companies to do this.
(13:20):
So one's called ClassWallet. Another is called Odyssey.
Another is Merit International. Another is Student First
Technologies, and then you have another one called SAP Ariba. So
I can sort of break down some of the Silicon Valley venture
capital connections to some of these companies, and then we can
(13:41):
talk about how they, how they are used to facilitate the
funding of the ed tech products through those through those
voucher systems. So ClassWallet's been around for a while.
It's been funded by Lazard partners. And Lazard
Incorporated is a partner with the World Economic Forum. Notable
people that have worked for Lazard include
(14:04):
Nathaniel Rothschild, who's actually a Young Global Leader
of the WEF, and another one would be Vernon Jordan. Vernon
Jordan being a close, sort of an advisor to Bill Clinton, I
believe
He's the one that took Clinton to Bilderberg, didn't
he? Yeah, yeah.
I believe he's the one that brought him to Bilderberg, yeah,
and he's Trilateral as well. And Clinton, again, is World
(14:25):
Economic Forum. So I note all that because I want to keep, I
want to know, these World Economic Forum connections,
because, right, all the base right-wing people
think that Trump is, like, somehow this foil to the WEF.
I mean, only insofar as it's a dialectic, but not an
actual one. Oh, absolutely,
yeah. Mark Goodwin and I were saying that, like, January
(14:48):
of last year, that basically, you know, everything that they
had tried to sell through the public sector during the COVID
era, they were going to squirrel it away and try and market it
through the private sector and from the right instead of. Left,
yeah, after they demolish trust with COVID, the, you know, the
the World Economic Forum's goal after that openly stated, was
rebuild trust. And how do you do that? Well, you you build up the
(15:11):
opposition, or what looks like the opposition, and you sell the
same policies through the opposition, right? Yeah, yeah.
Like, Andreessen Horowitz, I think they're part of the WEF,
like, blockchain digital council. It would not surprise me. I've
got a screenshot of the actual profile up there in the
(15:34):
article. You know, that SAP Ariba company, that's a WEF partner,
or the parent company is, okay, and then that company partners
with IBM and Amazon, both WEF partners. Peter Thiel, WEF,
but yeah. So some of these other FinTech
companies that are facilitating these digital wallets: one
(15:57):
would be Odyssey. Now, Odyssey is funded by Andreessen
Horowitz. Andreessen Horowitz also funds Merit International,
and Odyssey also won the Yass Prize for what they call STOP
education. So it's sustainable, transformative, outstanding, and
permissionless education. And Jeff Yass, another billionaire,
(16:21):
Yass also supports Jeanne Allen's Center for Education
Reform, CER, I believe I got those words right in the
acronym, and she's somebody that's partnered with Bill
Bennett to push vouchers, scholarships, or ESAs, one of
those programs. And Bill Bennett was the guy that set up the first
(16:42):
online charter school; he was the former Secretary of Education under
Ronald Reagan after Charlotte left. And K12 Inc., basically,
you know, brought in sort of this online corporate charter
school industry, which also provided the online platforms
through which you integrate all these other applications that we
(17:03):
might mention. So some of these would be, like, adaptive learning
courseware, like Clever and Knewton; these are both funded by
Peter Thiel. Basically, this is the modern digital version of B.F.
Skinner's teaching machine. So it's using operant conditioning
algorithms that basically train students for performative
compliance. You have something like Course Hero and Symbolab.
(17:25):
So Course Hero is basically a platform; Symbolab
is more adaptive learning stuff. That's been funded by Craft
Ventures, which is Sacks's venture capital firm. Andreessen Horowitz
has funded Udacity and a company called Affectiva, which I've
written about going all the way back to my book; basically, it
(17:46):
specializes in what they call social-emotional learning. So
the adaptive learning courseware is going to data mine students'
thinking algorithms or behavioral algorithms. The
social-emotional learning stuff is going to basically data mine
their affective or emotional algorithms, and that's going to
be done through, like, wearables. So either EEG wearables, brain
(18:09):
waves, or EKG wearables, to data mine heart rates.
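To make the adaptive learning mechanic concrete, here is a minimal sketch of the operant-conditioning loop such courseware implements: respond, get reinforced or penalized, get served the next stimulus. The item pool, levels, and messages are hypothetical placeholders, not any vendor's actual algorithm.

```python
# Minimal sketch of an adaptive-learning (operant conditioning) loop.
# Hypothetical: real products use far more elaborate models, but the
# reinforcement structure (response -> reward/penalty -> next stimulus)
# is the core mechanic described above.

import random

ITEM_POOL = {  # difficulty level -> practice items (placeholders)
    1: ["2 + 2", "3 + 1"],
    2: ["12 x 3", "48 / 6"],
    3: ["7 x 13 - 9", "(84 / 7) x 11"],
}

def run_session(answers_correct: list[bool]) -> None:
    """Step difficulty up on success, down on failure, logging the
    'reinforcement' the learner would see after each response."""
    difficulty = 1
    for correct in answers_correct:
        item = random.choice(ITEM_POOL[difficulty])
        if correct:
            print(f"[level {difficulty}] {item}: correct -> gold star")   # reward branch
            difficulty = min(difficulty + 1, 3)
        else:
            print(f"[level {difficulty}] {item}: wrong -> repeat drill")  # punishment branch
            difficulty = max(difficulty - 1, 1)

run_session([True, True, False, True])
```

Note that every response in a loop like this is also a logged data point; the same mechanism that picks the next item doubles as the behavioral record being mined.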
Well, HHS has recently announced that they're all
behind the mass use of wearables and that it's a major priority
for them. So having it become a priority also for education
policy, I guess that means they're
(18:30):
going to try and make them be everywhere soon.
I wrote a piece once upon a time, and I think Natural News was
the only one that put it up, and it's called Precision
Medicine Initiative. It's a two-part series; I can't
remember the subtitle of the second one. And in the second one,
I looked at a guy by the name of Eric Dishman. And Eric Dishman
(18:50):
was the head of this thing called All of Us. And it
was this project, like, basically. So precision medicine
is where they want to basically personalize your
medicine just like you want to personalize your learning. So
that means AI analytics, but in particular, for medicine, right?
Analytics, extrapolating correlations between your DNA,
(19:12):
your actual genes, genetics, and your environment as well. So
what Dishman was known for before he got on the All of Us
experiment was smart homes for people with Alzheimer's, okay?
And so basically, the project was to come up with
sort of an epigenetic study. So it's to take a look
(19:33):
at, like, what sort of genetic predispositions do you have,
data mine what you're doing all day, what are you eating, how
many steps do you get, right, your activities, and see what
correlations between your activities and the environment
might trigger, right, the pathological expression of a
particular gene, etc., right? And so I said it a long
time ago, and, you know, I speculated. I said, you know, I
(19:55):
feel like, you know, Bobby gave me some hope
during the lockdown years, but when I sort of looked
into some of his investments, and they let Dishman in
there, and then they put Jim O'Neill in behind him, I'm like,
we're gonna see wearables all over people. I bet you we're
gonna see the Dishman program. And I guess I was right.
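As an illustration of the kind of analysis being described, here is a minimal sketch of a gene-environment correlation: stratify records by a genetic variant, then correlate an activity measure with a disease marker within each group. The records are fabricated for the example; an All of Us-style program would be doing this at population scale with real genomic and activity data.

```python
# Minimal sketch of a gene-environment interaction analysis.
# All records are made up; this only illustrates the shape of the
# computation (stratify by genotype, correlate exposure with outcome).

from statistics import correlation  # requires Python 3.10+

# (carries_risk_variant, daily_steps, disease_marker) - fabricated
records = [
    (True, 2000, 0.9), (True, 4000, 0.7), (True, 9000, 0.3),
    (True, 11000, 0.2), (False, 2500, 0.4), (False, 5000, 0.35),
    (False, 9500, 0.3), (False, 12000, 0.25),
]

for variant in (True, False):
    group = [r for r in records if r[0] == variant]
    steps = [r[1] for r in group]
    marker = [r[2] for r in group]
    r = correlation(steps, marker)
    print(f"risk variant={variant}: corr(steps, marker) = {r:+.2f}")

# A strong negative correlation only among carriers would suggest the
# environmental factor modulates expression of that predisposition,
# which is the epigenetic-style inference described above.
```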
(20:16):
Oh yeah, totally. Well, it goes back to what I was sort of
saying about the rebrand of the quote, unquote opposition
during the COVID era. A lot of those people, the big names, were
recruited by, you know, neocons, or neoconservative-adjacent
types, like, during COVID. I mean, you had people like Frank
Gaffney creating websites like Stop Vax Passports and
(20:38):
stuff. And Gaffney used to work under Richard Perle, if I'm not
mistaken. I mean, he's like a well-known neoconservative
figure, and Perle helped create Palantir, among other things,
which is, you know, I mean, the more you look at the Silicon
Valley, quote, unquote, libertarian types that have been
really elevated in the last election cycle or so by figures
(20:59):
like Trump, and also, like, in Argentina, Milei. A lot of
them have a surprising amount of overlap with the
neoconservatives, at least, you know, in
terms of what their policy solutions are for things. And I
think that's, you know, neocons used to be so hated in the US,
and now this very successful rebranding and rebuilding
(21:22):
of trust has allowed them to, basically, you know, continue
pursuing, you know, their policies under, you know
different names, basically. And so I think that's why it's very
interesting that it's also happening in education, among
among other areas. But I think it's been a very successful
(21:43):
rebrand. And it's not exclusive to, you know, where these guys
most often operate, which is, you know, finance or social
media, or some of the things that are most publicly
associated with Silicon Valley. You know, it's really a
partnership that is permeating through everything, and it's very bad,
(22:04):
and a lot of these, I mean, I think the wearable thing is
definitely something to watch. I know in the past, you've covered
it pretty extensively for Unlimited Hangout, where it sort
of fits in with the education stuff. But this push to have it
be in healthcare, and I'm sure there'll be, you know, pushes to
have it, you know, for other reasons too, at some point, you
know, it's gonna get weird. And also, a lot
(22:27):
of these people that are known, you know, Silicon Valley
billionaires that are known for funding a lot of the suspect
stuff in healthcare and also education, like Bill Gates, if
I'm not mistaken, if I remember correctly, he, like, funded a
wearable that was being piloted in New York State for students
where, like, if they weren't paying attention, it would like
give them a light shock. It's like a shock collar, basically,
(22:50):
but like, it's on your wrist or something like that. I mean,
just absolutely, I don't know. And I mean, I have like, a seven
year old in, you know, basically, kind of a public
school here, and she, like, gets distracted. I mean, she's just
an easily distracted kid. I can't imagine her having to go
through that kind of stuff at school. Or, like most kids, you
know, yeah, I know what you're
(23:11):
talking about. It was a galvanic skin response monitor
they called GSR. Like, through skin conductivity, they're
supposed to be able to infer some sort of emotional
intensity that you're experiencing, but they have to
kind of correlate it with other metrics. I don't know
about the shock part, but I do know that Amazon, which I also
wrote about my book, did have something like that. I don't
(23:33):
know if it was a shock, but it's like a very disturbing
vibration, I guess, you know.
Like, that's what it was, it was that, but it basically
came across as being like, you know, a nice version of a shock
collar. I probably shouldn't have said that in the sense that
it's not exactly accurate, but I think that the idea of, like,
(23:55):
giving you a jolt that's non-painful, or whatever, you
know, because the algorithm decided that you're too
distracted from learning. I don't know. We don't need that
kind of stuff for our kids, obviously. I mean, it's just so
bonkers.
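For what it's worth, the inference John describes can be sketched very simply: flag skin-conductance spikes, but only treat them as an "arousal" event when a second channel corroborates them. The signals and thresholds below are invented for illustration; real GSR analytics are proprietary and far more involved.

```python
# Minimal sketch of GSR-style inference: a conductance spike alone is
# ambiguous, so require agreement from a second channel (heart rate).
# All signals and thresholds here are fabricated for illustration.

gsr = [0.31, 0.30, 0.33, 0.62, 0.65, 0.34, 0.58, 0.32]   # microsiemens
hr  = [72,   71,   73,   95,   97,   74,   75,   72]     # beats/min

def zscores(xs):
    mean = sum(xs) / len(xs)
    sd = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    return [(x - mean) / sd for x in xs]

for t, (zg, zh) in enumerate(zip(zscores(gsr), zscores(hr))):
    if zg > 1.0 and zh > 1.0:
        print(f"t={t}: corroborated arousal event")
    elif zg > 1.0:
        print(f"t={t}: GSR spike alone - inconclusive")
```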
It's entirely Skinnerian, right? So, I mean, if you're not
familiar, you know, it's basically a very simple system
(24:18):
of punishments and rewards with four quadrants, so positive and
negative punishment, positive and negative reward;
positive and negative just mean adding or removing a stimulus,
okay? And so the idea is that, you know, Skinner
basically believed, he wrote a book called Beyond Freedom and
Dignity, so for him, there are no bad students. A bad student is
just a student whose environment has not conditioned
them to respond appropriately, right? And so the best mix
(24:41):
of reward and punishment, or stimulus response, is going to
get the student to so-called learn, right? Or, in other
words, to be programmed the way that you want that student
to be programmed. Now, if you believe that humans are
basically just little widgets in a supply chain, I guess that's a
great way to make things efficient. But if you believe
that human beings are agents that have, you know, a soul, or at
(25:02):
least a consciousness, yeah, it's pretty
horrific.
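Since these four quadrants come up again later in the conversation, here is the standard operant-conditioning grid John is summarizing, as a tiny lookup. (The textbook term for "reward" is reinforcement; the classroom examples are placeholders, not any product's actual design.)

```python
# The four quadrants of operant conditioning, as described above:
# "positive/negative" = adding/removing a stimulus,
# "reinforcement/punishment" = making a behavior more/less likely.

QUADRANTS = {
    ("add",    "increase_behavior"): "positive reinforcement (e.g., gold star)",
    ("remove", "increase_behavior"): "negative reinforcement (e.g., skip a drill)",
    ("add",    "decrease_behavior"): "positive punishment (e.g., buzzing wearable)",
    ("remove", "decrease_behavior"): "negative punishment (e.g., tokens withheld)",
}

for (stimulus, goal), name in QUADRANTS.items():
    print(f"{stimulus:6} stimulus to {goal:17} -> {name}")
```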
Yeah, to say the least. Well, where to go from here, I
guess, in talking about all of these things, you know, it comes
to my mind how Trump has been pledging to make the US the
world capital of crypto and artificial intelligence, and
(25:26):
obviously that's going to have an impact on his
administration's education policy, and it arguably already
is. So you sort of already broached the topic of FinTech in
education, but we didn't really talk about, you know, crypto, in
the sense of the recent passage of the GENIUS Act, which is,
according to Treasury Secretary Scott Bessent, going to unleash
(25:49):
billions of stablecoins around the world. And obviously that's
going to affect some of these things you brought up earlier,
like education savings accounts and scholarship granting
organizations, but also AI. So there was this recent executive
order from Trump, I believe, in April, and it was called
(26:11):
Advancing Artificial Intelligence Education for
American Youth, which is part of, according to the order,
quote, ensuring the United States remains a global leader
in this, meaning, AI, technological revolution. So
what are some of the impacts of these policies on education thus
(26:31):
far, and where do you see this ultimately going?
Well, okay, so that AI executive order is largely
focused on training, basically, like, STEM, so basically making
students literate for the artificial intelligence era, more
so than it is about using AI to teach students. Now, obviously,
you know, it's gonna be a recursive sort of
(26:53):
reciprocal relationship there. I didn't mention this in my
article, but the American Federation of Teachers, who I
have written about at unlimited Hangout, and I traced the
history of their long term involvement, not only with
Rockefellers and the Trilateral Commission, but big tech
companies like IBM and the whole ed tech industry. Recently,
(27:14):
Randi Weingarten, who was part of a global union federation
called Education International, with all sorts of ties to the
World Economic Forum, the AFT just announced that
500,000, a big chunk of the AFT's unionized teachers, are going
(27:35):
to be trained to integrate OpenAI; I think there's two others, I
think Anthropic is in there, and then the third one's either
Google or Microsoft. And then she also announced specifically
a partnership with the World Economic Forum to directly
partner with them to set curriculum. So it's interesting
(27:58):
to note that OpenAI and Anthropic were also two of the AI
companies that were supposed to be part of this ai.gov platform,
which, now that it's actually been launched, doesn't seem to be
launching the same things that the archived website had
previously announced, but two of them were the same
(28:20):
companies. We know that OpenAI is also part of the Stargate
Project. So this is sort of indicating to us
that, right, we're basically setting up a centralized
panopticon, while everything else is going to be so-called
decentralized, right? As
much as it looks like OpenAI and Palantir and these different
(28:41):
companies are competing, if you go through my article, you'll
see that the portfolios of Sacks and Thiel and Andreessen Horowitz,
a large portion of their portfolios overlap, like, a lot
of these AI companies. Yeah. So I mean, that, you know,
indicates to me, it's all part of the same sort of ecosystem,
(29:04):
okay? And, you know, they got that recent article about
Palantir's master database. I mean, maybe I'm, you know, maybe
I'm a stickler for being overly precise, but technically it's not
making a master database. It's just the case that it basically
has contracts with essentially every federal agency. And then
there's an executive order that Trump recently issued to
(29:26):
basically remove any barriers to sharing information across
agencies, right? So therefore you can see how all these different
AI companies basically have the green light to share all
this information. This is why, you know,
the so-called decentralization part is super
important, because then they can actually have personalized
(29:49):
profiles on everybody, right? And that personalized social
credit profile is going to aggregate your HHS Dishman
wearable data with your AI school choice voucher FinTech
ed tech data, right? And it's going to give you career
pathways and maybe mental health services based on those
(30:11):
analytics. And so that's why the FinTech is
super important to this process. Now they pitch it as,
well, look, if you have an education savings
account for each individual student, and the education
savings account can be used to purchase, you know, a huge array
of products, right, this is a lot of accounting to make sure
(30:33):
there's no fraud and waste and abuse and stuff like that,
right? So how are we going to solve that problem? Oh, well, if
we can have programmable money, and we can have these third
party digital wallet companies service it. Basically that's
supposed to solve the accounting problems and the waste, fraud,
and abuse problem. But what it also is, basically, is putting
(30:57):
in the building blocks for a tokenized system in which, right,
the monies that you get are programmed only for particular
types of school choice products, services, etc., and
then, right, based on how you perform, you'll either get new
tokens or you'll get fewer tokens,
(31:19):
and those tokens can be used for maybe career
pathways training or mental health services. But all of this
is basically a way to not only track and trace all the data,
right, in terms of learning data, health data, etc., but also to
control how it can be used, and sort of make it, like, you know,
(31:40):
a global Skinner box on steroids, right? Where, like,
the punishments and rewards we just mentioned,
maybe it's a wearable, right, or another one
might, you know, just be peer pressure. Okay, back in the day,
maybe it was a detention or a gold star for punishments and
rewards. Well, you know, probably one of the
(32:01):
strongest incentives is money, right? So, if you take away
people's money, right, you know, that's a pretty good way
to sort of condition people to behave accordingly. So
that's why this FinTech system is integral: to
funnel people, to forcibly funnel them, into these ed tech
(32:22):
products and services, to data mine and data track for social
credit, but then also, right, this is important to streamline
this new stablecoin economy that's also being set up to
basically prop up the dying dollar.
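As a way of picturing the mechanism being described, here is a minimal sketch of a category-restricted, performance-adjusted voucher wallet. The categories, amounts, and rules are invented for illustration and are not drawn from any actual ClassWallet or Odyssey implementation.

```python
# Minimal sketch of "programmable money" for a school-choice wallet:
# funds are spendable only in approved categories, and the next
# disbursement is scaled by a performance score. Entirely hypothetical.

APPROVED = {"charter_tuition", "ed_tech_app", "tutoring", "mental_health"}

class VoucherWallet:
    def __init__(self, balance: float):
        self.balance = balance

    def spend(self, amount: float, category: str) -> bool:
        if category not in APPROVED:
            print(f"DENIED: {category} is not an approved category")
            return False
        if amount > self.balance:
            print("DENIED: insufficient balance")
            return False
        self.balance -= amount
        print(f"OK: {amount:.2f} to {category}, {self.balance:.2f} left")
        return True

    def disburse_next(self, base: float, performance: float) -> None:
        """Scale the next token grant by a 0-1 performance score,
        the reward/penalty lever described above."""
        grant = base * performance
        self.balance += grant
        print(f"Disbursed {grant:.2f} (performance={performance:.2f})")

w = VoucherWallet(500.0)
w.spend(120.0, "ed_tech_app")
w.spend(60.0, "bicycle")          # outside the program: blocked
w.disburse_next(base=500.0, performance=0.4)
```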
Right. So bringing up stablecoins here, so the GENIUS
Act, I mean, you know, since it's passed, Treasury has been
(32:45):
asking for comment about various pillars that are required to
implement the GENIUS Act and enable, you know, the bulk of
Americans to hold digital dollar stablecoins, as opposed to
digital dollars in, I don't know, a Bank of America account
or something like that. And one of those pillars is, surprise,
surprise, digital ID. And so questions are being fielded
(33:10):
about that. So it seems likely that, you know, with stablecoins,
you know, if they move to where these wallets, you
know, these digital wallets for education funding, only
work with stablecoins, at some point, which seems more likely
than not, then, in order to finance an education with these
(33:32):
savings accounts or granting organizations or whatever,
you'll have to have a digital ID. Yeah, well,
and one of the companies, I think I mentioned, Merit
International. This is another one that's funded by Andreessen
Horowitz. It's also funded by another venture capital firm
called Alumni Ventures, which shares common investments with
Andreessen Horowitz and also with Peter Thiel's Founders Fund.
(33:54):
Another company that funds Merit International is
Experian, which is one of the big three credit reporting
agencies, and I remember during lockdowns that in Illinois, they
were supposed to be somehow managing the databases that were
going to have your vaccine passport. And so, you know,
when I see a credit company basically
(34:14):
allowing me to come to work based on my health records, I
see like social credit. I mean, essentially, that's a step in
that direction, right? One other company that funds it, or
rather, I should say it's another venture capital
institution, is Stand Together Ventures Lab. Stand Together is
another one of these SPN Koch-backed think tanks,
(34:35):
which, you know, again, the SPN, State Policy Network, and its
consortium of think tanks being adjacent with and behind the
development of Project 2025.
they're neocons, right? I mean, the Koch brothers and
American Enterprise Institute and the Heritage Foundation of
Project 2025 infamy, right?
(34:57):
Yeah, yes. This is the neocon arm. And I'll tie
that in, because it was interesting to
find that, right, we think of the sort of PayPal Mafia as, like, the
Silicon Valley arm, maybe, sort of, like a so-called
radical libertarian arm that, you know, maybe rubs shoulders with the
neocons; that's why I use 'so-called' and 'radical,' right. But,
(35:21):
right, sort of like the Cato type, right? Which, again,
that's a Koch-funded institute, you know; it
was founded by the Koch brothers before they broke away from
Rothbard. But you know that so-called libertarian arm has
always been in with this sort of, you know, old-school neocon arm;
you know, they would maybe get together, but, you
know, maybe didn't always agree on stuff. But if
(35:42):
you go back far enough, you know, Thiel and David Sacks, you
know, started the Stanford Review as basically this pushback
against Jesse Jackson's Rainbow PUSH Coalition, and then later
came up with this book, The Diversity Myth, which becomes,
like, this, you know, pillar against, you know, so-called all
things woke. Like, they've been sort of, you know, I guess,
(36:04):
rubbing elbows with or also fueling some of the culture wars
that the neocons have propped up just as much as they've been
funding this futurist arm of this, you know, pseudo
libertarian Cato arm of basically the right-wing
establishment. But I set all that up on Merit International
(36:27):
to say that one of the things that they specialize in is a
digital identity ecosystem to verify their purchases. Okay?
Now if you look at another one of these companies called
Student First Technologies, which partners with Mastercard,
they have their platform, which, quote, unquote, empowers
stakeholders, and uses something called Quinn IQ
(36:49):
artificial intelligence, human-in-the-loop machine learning
AI. A third company that I mentioned, SAP Ariba, right?
They do all sorts of FinTech services, but they'll do
blockchain, AI, and then also ESG monitoring. So between those
three companies, what you have is digital wallets, digital
identity, AI, ESG monitoring, and blockchain. So, you know, there's
(37:14):
not one company that does all of it, I guess SAP kind of is,
but it largely is, like, servicing other companies,
so it's up to the company how much of their product they
want to integrate together. Technically, they could give you
all of it. But the point is, this digital wallet industry, which
has been funded by, right, all these PayPal Mafia venture
capital people, backed by these Koch-backed neocon people across
(37:38):
the board, they're all leaning towards either digital identity,
AI, or blockchain platforms. Eventually, all this stuff is
going to converge, and when it does, you're going to have a
system in which, right, the government takes taxes and sends them
to a digital wallet company. The digital wallet company looks at
(38:00):
your AI, your analytics for your ed tech learning. It's going to
give you the money; it's going to tell you what you're eligible
to buy. The AI is going to analyze not just your
learning analytics on the ed tech product, but it's also
going to correlate, like, which types of voucher payouts
correlate with which kinds of outcomes, right? And so there's
(38:21):
going to be predictive analytics on the ed tech, and there's
gonna be predictive analytics on the FinTech, and then the
blockchain part is the one that's going to basically lock
you into a permanent profile ledger. And it's going to be
programmed, right, to restrict how you get to use your
coins until you, you know, comply, until you get your social
credit up to use them in a different way, basically.
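To illustrate the "permanent profile ledger" idea in the simplest terms, here is a sketch of an append-only, hash-chained record whose running score gates what a wallet would let you do next. The scoring rule and threshold are invented for the example; a real blockchain would add signatures, consensus, and so on.

```python
# Minimal sketch of an append-only "profile ledger" gating token use.
# The scoring rule and unlock threshold are invented for illustration.

import hashlib, json

class ProfileLedger:
    def __init__(self):
        self.entries = []           # each entry links to the previous hash

    def append(self, event: dict) -> None:
        prev = self.entries[-1]["hash"] if self.entries else "genesis"
        payload = json.dumps(event, sort_keys=True) + prev
        self.entries.append({"event": event,
                             "hash": hashlib.sha256(payload.encode()).hexdigest()})

    def score(self) -> int:
        """Toy 'social credit': +1 per compliant event, -2 otherwise."""
        return sum(1 if e["event"]["compliant"] else -2 for e in self.entries)

ledger = ProfileLedger()
ledger.append({"event": "completed_module", "compliant": True})
ledger.append({"event": "missed_wearable_checkin", "compliant": False})
ledger.append({"event": "completed_module", "compliant": True})

# The wallet consults the ledger before unlocking a new spending category.
print("score:", ledger.score(), "-> unlocked" if ledger.score() >= 2 else "-> restricted")
```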
(38:44):
This is turning out to be a real mess. Yeah. I mean, it's
crazy. I haven't really heard, well, I haven't really been
looking extensively, but you would think, since digital ID
was such a big issue during the COVID era, with the people that
now form a large part of Trump's base, there'd be a lot of
(39:06):
uproar, perhaps, maybe that's hoping too much, about
this, whether it's coming through, you know, the Treasury
Department or, you know, education policy and these
things, and it just really, really doesn't seem like it's
happening at all. I mean, it's not really surprising, if you
were paying attention and noticed all the Silicon Valley
people and some of the worst people in the crypto industry
(39:29):
line up behind Trump pretty early on. And I don't know, I'm
unsettled by it. And I think, you know, I was thinking the
other day, I don't want to get on too much of a sidebar here,
but I was trying to figure out something. Because along with
the genius act, you know, it was passed during the so called
(39:50):
crypto week, there was another piece of legislation passed that
was called the clarity act. And in there, there were a few
blockchains that they, like, singled out; they gave them
the 'mature' labeling. And they classified all these blockchains
based on their maturity, and only three were in sort of this
top tier. And it was, it was Bitcoin, I think the second
(40:10):
one was Ethereum, which had been around, you know, the longest,
arguably, but then they put another one called Cardano there
when it doesn't belong there, in my opinion, and I think in a lot of
other people's opinion. But it's interesting, because Cardano was
created by Charles Hoskinson, who was a co-founder of
Ethereum, but he's also, you know, the Cardano guy. And
(40:32):
Cardano, if you look into them at all, well, not only have
they teamed up very extensively with the extremely Epstein-funded
scientist Ben Goertzel, who's behind
Sophia the robot and all of this stuff, but they also did a
digital ID for education for all Ethiopian schoolchildren. And
(40:54):
so I wonder, why was Cardano put in the CLARITY Act and given
this boost? Well, they have this whole, you know, digital ID
platform for schoolchildren, and basically for, you know, I guess
the general public, already ready to go. So if you need a digital
ID on a blockchain out the door to make the GENIUS Act stuff
(41:15):
work, why not sort of kingmake, you know, something like
Cardano, which, by the way, Hoskinson was a big donor to
Trump last cycle. Why not sort of single out, you know,
the blockchain that you want to use for digital ID while you're
singling out the ones that you want to use for a lot
of the, you know, stablecoin crap too.
(41:38):
You know, the other thing about, like, so with ClassWallet,
it got a really big boom during lockdowns, because it
was a lot of states. I can't remember the particular
number, it's in the article, but, you know, many states, and I
think some federal, I think maybe some, like, FEMA projects, etc.
If they're not doing FEMA and HHS yet, they are set up to do it.
(42:01):
But they definitely were distributing some of the CARES
money, and, you know, under CARES, there was, like, the GEER
and ESSER funds, I think it's GEER and ESSER, but that was how
it sort of got its foot in the door. I mean, it had been
around, but that's really how it, like, locked itself in as,
like, one of the go-tos for this FinTech stuff. So I mean that
(42:24):
alone, you would think, right, these, you know, the people that
were against lockdowns might look at something like,
well, maybe, all of a sudden, the people that are,
you know, supposed to be against this are basically propping up
the company that, you know, took off during lockdowns. That's maybe
a red flag. But in the long term, and this is kind of
(42:45):
speculative, but I'm pretty sure this is
the case, you know. And shout out to Mark Goodwin, who took
his time to help me make sure that I understand some of the
financial stuff better. And one of the things I was asking about
was, like, you know, what exactly is the definition of a
security, right? Because they were trying to figure out
these digital assets: should they be,
like, a security or not? Well, yeah, right. And so he
(43:08):
basically explained it: a security is basically expected
future profits, and not just, like, the appreciation of a
basic, you know, asset, like a house would have, but it's like
you're buying an increased value that is expected
in the future, that is not just part of its, you know, natural
valuation as an asset. So if you think about it, I think where
(43:31):
they're going to go is basically securitize human beings and,
like, own them with, like, human capital, yeah, human capital
development. And I'll tell you about a company that's already
taking some of my tutoring hours at the end of this thing. It's
called Upswing, and it ties into this. So basically, this
is how I see it. So where's the expected future value?
(43:52):
Well, let's see if we can get student X to reach education
goal Y through career pathway Z, right? That will produce so
much money in the economy, because they'll get this job,
and they'll produce these products, and it'll also save us
the money that we would have had to spend on retraining. And
(44:13):
maybe the student, you know, falls
through the cracks, and they need to have
mental health services, or maybe they become delinquent and go to
jail, right? So, and I've seen metrics at, like, AEI, where they
actually are talking about impact investments in this
manner. They're quantifying education outcomes in terms of,
you know, either monies lost in those ways, right, student falls
(44:35):
through the cracks, goes into, you know, they've got to take care of
them with health care or prison, or, right, what do they produce
if they're successful? So you quantify that algorithmically,
right, with all these AI analytics and supply
chain dynamics, and then basically you send that
information out to these companies. Maybe it's one of the
SGOs, and the SGOs, right, a lot of them are focused on
(44:58):
impact investing, social impact investing. So they look
around and they go, okay, well, I'll sponsor this
student. They've got some decent analytics. I'll give them these
tokens through this digital wallet to buy these products,
because I know these products are the most likely to make
these outcomes. And if they make these outcomes, then I get my
money back. And potentially, if they exceed those outcomes, I
(45:20):
get a profit; if I don't, you know, the way something like Pay
for Success impact investing works is basically a company puts the
money up front, and if, for some reason, the student doesn't meet
any outcomes, then the government doesn't subsidize it,
right? The company eats it. But if they do meet
those, right, then they get the money back with interest,
(45:42):
right? And so basically, what you'll have in that situation is
basically, right, you'll have these human capital bonds.
You'll have companies, quote, unquote, sponsoring students.
Or, in other words, it'll be, like, you know, this will be a
legit techno-feudal system in which, right, you'll have these
feudal big tech overlords that, basically, you know, own these
digital serfs on these blockchain digital wallets, and
(46:04):
they'll be forced to go through their Skinnerian operant
conditioning programs through whatever wearables or screen-based
AI platforms they've got. And then that'll all be geared
towards, sort of, stabilizing or growing the
global supply chain economics.
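The Pay for Success structure John walks through is easy to sketch: the investor fronts the money, and the government repays with interest only if measured outcomes clear a threshold. The threshold, interest rate, and dollar figures below are hypothetical placeholders.

```python
# Minimal sketch of a Pay for Success (social impact bond) payout,
# per the structure described above. All numbers are hypothetical.

def pfs_payout(upfront: float, outcome_score: float,
               threshold: float = 0.7, interest: float = 0.08) -> float:
    """Return what the government pays the investor.

    outcome_score: measured result in [0, 1] (e.g., a graduation metric).
    Below the threshold, the investor eats the loss (payout = 0).
    At or above it, principal plus interest, scaled up if exceeded.
    """
    if outcome_score < threshold:
        return 0.0                                  # investor loss
    bonus = 1.0 + (outcome_score - threshold)       # toy "exceed" premium
    return upfront * (1.0 + interest) * bonus

for score in (0.5, 0.7, 0.9):
    print(f"outcome {score:.1f}: government pays ${pfs_payout(10_000, score):,.2f}")
```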
This reminds me, actually, some of the stuff we're talking
(46:25):
about, of this policy that Javier Milei promoted, I think,
a year ago, and I'm not sure what progress he's had on it
since it was announced, but I think it's kind of fitting to go
over here, because of some of the things you brought
up, and also because of this shift in dialectics from public
to private, and whatever that we were talking about earlier, I
(46:48):
think Javier Milei was absolutely the poster child for
that. He's like, beta testing, kind of, for all these policies,
you know. So, like, he was the first one to be like, I'm going
to eliminate the Department of Education. And now Trump, you
know, has made that part of his policy. But, you know, Milei's
also, you know, been like, public education is not a right, and
(47:09):
all of these things, and has sort of promised to sort of
take the state out of schooling entirely. But what has he
proposed in its place, right? And so the thing that he sort of
proposed as a replacement for public schools was basically
having Facebook, or Meta, you know, their
parent company, come in, and his quote is, so Meta has a
(47:34):
whole system set up for the formation of human capital, for
the formation of the people so that they can set up a career.
And he said that he, you know, planted this idea with his
minister, who's in charge of this stuff, and we're going to
start making the contacts with the people that made us so we
can implement an artificial intelligence plan for the
(47:58):
formation, or, you know, the schooling, of our kids,
basically. And so I'm kind of thinking back to what we sort of
broached earlier about school choice, how a lot of it leaves,
kind of, like, public school kids behind, and sort of
this idea of human capital and where it's going. And, you know,
the idea of, oh, we need more efficiency, and like rightly
(48:21):
identifying that there's a lot broken with the education
system. But also, you know, offering a solution that's
arguably just as bad, if not worse. And in this case, you
know, he's talking about Facebook AI coming in and
deciding basically what you laid out, this idea of career
pathways for kids and having AI, you know, be teaching kids from
(48:46):
the beginning all the way through having, you know,
preparing them for a career that the AI determines, essentially.
And I, you know, I would like that not to happen in the US,
but it seems like that, you know, as you've been pointing
out that kind of is where it's ultimately going. And I'm also
thinking about, you know, so, you know, if Facebook were to do
(49:07):
that in the US, and you have, you know, all of these FinTech
wallets, you know, to fund education or whatever, and a lot
of them are set up by, you know, entities tied to, like, Peter Thiel or
other people in the PayPal Mafia. So what better way to
have, you know, those companies that are tied to people, like
Thiel, manage your funds, and then Facebook, that was put on
(49:28):
the map by Peter Thiel, who was on the board for decades, and
I'm pretty sure he still has stock in it. You know, they're
the ones that are being funded by their FinTech
company. And then, you know, I mean, it's just like passing
the money around the same people, basically at the end of
the day, is kind of what it's starting to look like. And
doesn't really seem to be about, you know, education. It seems to
(49:53):
be a lot more about profit motives and, quote, unquote,
efficiency, but efficiency under the paradigm of, like, you know,
humans are hackable, programmable animals, and this
is how we can, you know, more efficiently milk
them for profit. I don't know if you agree
(50:14):
with all that. I kind of just went on a tangent there.
Yeah, no, I agree 100%. I mean, look, you
know, I had a conversation with my socialist friends not too long
ago, and, you know, they talk about profit and capital. I
think it's a little bit past that; I think we need to go
a little bit further than that. It's really just about supply
chain management systems, dynamic control, and creating a
(50:36):
cybernetic control grid. I mean, because at this point, like, I
mean, I don't see how the very notion of profit holds up when
everything becomes automated, and basically the human beings
become, again, these human capital bonds. Essentially,
you own everything. You know what I mean? Now, it's just
controlling the slaves and keeping the
(50:57):
wheels turning as smoothly as possible. Okay? I think that,
like, if you look at the history of, like, the Macy
cybernetics conferences, these are set up in, like, the '40s and '50s.
Well, this is basically the exact same time that some of the
first venture capital firms are set up. Now, you know, what did
they do at
the Macy cybernetics conferences? With military
involvement. Exactly.
(51:17):
And they laid out, right, what they said, basically,
two things: the theoretical implications of how the feedback
loop, the computerized feedback loop, works. That's right, it's
transmitting data, analyzing the response to
that data, taking that new data, that response data, and then
adjusting the transmission in accordance with, right, the
(51:39):
feedback, right? And you could do this to map, right, the human
psyche: cognitively, what they called the conditioned reflex,
but also the subconscious, right, what they called, I
think, mesmerism or hypnosis; but it's basically the Freudian,
the psychoanalytic, the two parts of, right, the behavioral
school of psychology and the psychoanalytic school. So you
take that: they understand, theoretically, that this is possible.
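The loop described here, transmit, measure the response, adjust the next transmission, is the basic cybernetic feedback pattern, and a toy version fits in a few lines. The simulated learner and the adjustment gain are fabricated for illustration.

```python
# Toy cybernetic feedback loop: adjust the "transmission" (difficulty)
# toward a target response rate using the measured feedback.
# The simulated subject and gain are fabricated for illustration.

import random

random.seed(1)
difficulty, target, gain = 0.5, 0.7, 0.3   # target: 70% success rate

def learner_response(difficulty: float) -> bool:
    """Fake subject: succeeds more often when difficulty is low."""
    return random.random() < (1.0 - difficulty)

for step in range(8):
    # transmit -> observe response -> adjust transmission (the loop)
    success = learner_response(difficulty)
    error = (1.0 if success else 0.0) - target
    difficulty = min(max(difficulty + gain * error, 0.0), 1.0)
    print(f"step {step}: success={success}, next difficulty={difficulty:.2f}")
```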
(52:02):
Then you get these venture capital firms set up,
they get Moore's Law laid out, and they basically know how long
it's going to take for the transistors to
exponentially grow or be improved until they can actually
process all the data, right? And so they look, okay, we
(52:22):
have however many decades, and what do we have to do in the
meantime? Well, we set up these venture capital firms, some of
them, like In-Q-Tel, directly set up by the CIA. Others, like
Sequoia, you know, saturated with CIA
connections. And basically, what do they do?
(52:45):
For that time period, they look around to see which programmers
are coming up with something that might be promising or
otherwise useful. They throw money at it. If it doesn't go
anywhere, no biggie. If it does, they pull it into their control
grid. And so basically, in the meantime, they have to do two
things between the Macy cybernetics conferences and sort
(53:06):
of the climax of the fruition of Moore's Law, and that is:
control the development of the technology through venture
capital, and also manipulate the culture in ways that people
will accept this or otherwise be unaware of it. And
that's your sort of right-wing, left-wing, you know,
(53:26):
culture war stuff. And so, I mean, I think that all of that
together, and, as you know, as you've done, you
know, good research on, you've shown that, you know, companies like
Palantir and Facebook basically were LifeLog and Total
Information Awareness, right, which came out of the post-9/11 era.
(53:47):
And we can argue that that was sort of this inflection point
where, like, we were able to start to sort of integrate some
of these technologies, right? We went from the theoretical and
sort of the pilot phases, and at that point, right,
with the national security threat, we had sort of reasons
to onboard this stuff. So, I mean, if you just look at the
last, you know, 60 years, it looks like it was very
(54:11):
much everything converging, not accidentally, towards
basically creating this control grid. I do want to note
something you mentioned about Venezuela. I just thought it was
really interesting. And that is that, you know, so Charlotte,
before she wrote The Deliberate Dumbing Down of
America, she had written a shorter piece called Back to Basics
(54:32):
Reform or OBE: Skinnerian International Curriculum. It's a
short little thing. She used to rail about how, she said, I
wouldn't have had to write the big book, that's Deliberate
Dumbing Down, if the conservatives hadn't
boycotted me on the little book. And that was a big thing she
used to tell me. She'd say, you know why you don't get as much
traction as you should? She says, because of me and because
(54:53):
of school choice. Those are the two things. And in
this piece, she mentions, and this is according to the
March-April 1981 issue of the Human Intelligence International
Newsletter: critical thinking skills research is taking place
within the United Nations Educational, Scientific and Cultural
Organization, the Organisation for Economic Co-operation and
(55:13):
Development, and the World Bank, which plans on increasing the
bank's international lending for education training to about $900
million a year. The Department of Education, that's the US
Department of Education's National Institute of Education,
possibly in response to a meeting Luis Alberto Machado,
the Venezuelan Minister for Human Intelligence, had with
former Secretary of Education Terrel Bell and various
(55:34):
senators, to update them on the progress of his nation's
work in human intelligence, awarded a three-year
contract totaling approximately $780,000 to
Bolt Beranek and Newman Incorporated, that was a tech
company back in the day and might still exist, to analyze
current programs of instruction on cognitive skills. And then
(55:54):
she also cites in here a bill that had been
proposed; there's a quote from it: the Community Intelligence
Project and the Applied Thinking Skills Project in Santa Barbara,
California, and the nationwide intelligence project in
Venezuela have shown good results with promising social
and educational benefits. So it looks like Venezuela... Terrel
Bell, by the way, that is the Secretary of Education that
Charlotte was under. He was the one that was in charge of
(56:17):
Project BEST. So while they're setting up Project BEST, you
had, sort of, right, all the precursors to this Trump version
of the neocon Silicon Valley, you know, weird amalgamation. You
know, the origins of this school choice revamp go back
to T.H. Bell, and it looks like Venezuela was basically on the
(56:38):
same page as us then as well. And as we know, you know, I
mean, if we look at this administration, like, you
know, I know that for a long time the word fascist got
thrown around way too loosely, but this is corporate fascism.
And where did all the Nazi fascists go, or a lot of them,
after, you know, Project Paperclip? Well, they came
here, and they also went to the Latin American countries. I
(57:01):
don't know if they went to Venezuela,
but, you know, I know
Brazil and Argentina, and they definitely did go
there also, right? So,
you know what I mean? I just, you've got to wonder, you
know, how much of the parallels between
the way that these different Latin American countries
and we are going maybe stems back to some of
(57:22):
the stuff that they both brought back from
Project Paperclip.
Yeah. Well, whether it's from then, I mean, I don't really know, but definitely now there's a lot in sync between, you know, the Trump administration and South America, and it's likely to only increase, particularly with
(57:42):
Chile's upcoming election. I mean, the kind of Milei-Trump-equivalent candidate is probably going to end up winning. So we'll probably see a lot of those policies pop up here too,
in the not-so-distant future. So before we wrap up here, John, I wanted to bring up something in the context of the
(58:02):
FinTech angle here, and the stablecoin angle here. So in
terms of Trump's education policy, you know, a lot of it hasn't gotten that much coverage, and I think, in general, that's because of a lot of the other wild stuff that's been going on over the past, you know, eight-ish months or so. But one thing that I think most people are familiar
(58:26):
with, in terms of what Trump has done with respect to educational institutions, has sort of been this crackdown. Well, it was initially framed as a crackdown on, quote, unquote, woke ideology, right, in schools and universities. But then it sort of became
(58:46):
this clampdown, particularly on the finances of educational institutions where students were protesting the war in Gaza being waged by Israel, and the administration justified these moves under the guise of combating, quote, unquote, anti-semitism. So if we're in sort of this paradigm where education funding, and just basically Americans' money in general, is
(59:10):
going to be moved to these digital dollar stablecoins, which, in the sense of being similar to CBDCs, can be seized, you know, and they're surveillable and they're programmable. You know, as you brought up earlier, the administration could program these education savings accounts and things like that to only be used on certain
(59:33):
things. It seems more likely than not that there's also a precedent being set here where they could just turn institutions' money, or, you know, offending students' money, off at the spigot, basically, with some of these FinTech companies. Not sure how you feel
(59:56):
about that, but, you know, do you see the creep of these technologies as enabling the government's capacity to control speech on campuses, whether it's about Gaza or really anything else, like if we have another sort of COVID scenario and criticism of the narratives or the policies there? You know, what are your thoughts?
(01:00:19):
Yeah, you know. So in, I think it's the third chapter in my book, I looked at what are called, well, there's P-16, P-20 and P-20+, you know, another one of these where they've got to have 12 different acronyms for one thing so you can't have a concise conversation about it, but we'll just call them P-20 councils, okay, to
(01:00:40):
expedite things. And basically, there are these consortiums, or these clearinghouses, to set up public-private partnerships with educational institutions at the state level. Almost, I think, every state does it; if it's not every one, it's very close to all 50. They're between the education department, or a local education agency, and various other, what they call in the
(01:01:03):
community schooling model. The so-called community schooling model would be what they call wraparound services. So these are public-private partnerships between, again, these schools and either healthcare institutions, but also what they call community-oriented policing. So, in other words, right, the point of these P-20 consortiums is
(01:01:23):
not just to, right, help the students with their workforce training skills, but also to help them with their mental health, and also to make sure that they don't become delinquent and fall through the cracks, right? So they have this preemptive criminal justice component as well. You can tie both of these in, actually, with national security, and that is mental health and criminal justice.
(01:01:46):
So now, yes, yeah. And the next part, you know, this is one of the scariest things to me, is this homeless thing that they're doing. Because if you're going to destroy the economy and basically make it so that even people with multiple jobs can't get by, then, right, you know, it's not just going to be, and even then
(01:02:07):
it wouldn't be right, you know, and I'll explain why. But even then, I wouldn't recommend this homeless project. You know, it's not going to be necessarily because you had a substance abuse problem or gambling problems. This is going to be that the economy's pushed you out of your house, and then, you know, if you get angry, or, you know, you
(01:02:27):
have the wrong tone of voice when somebody tells you, you know, get off the street, they just put you in a mental health facility, and you basically become a ward of the state under conservatorship. And, you know, this is part of where this defund-the-police stuff is going to go. The middle ground is going to be, oh, well, we're going to have AI, I've seen it, I've written about it, to make
them more efficient by outsourcing police work to
(01:02:50):
artificial intelligence and predictive policing algorithms.
That's the Palantir bread and butter. And there's a litany of companies that are doing that,
right. And then on the other end, and that's just to catch him, right? Oh, he's mentally ill, which means he might be a national security threat or
(01:03:11):
criminal threat, etc.
They also want AI to determine if you're mentally ill. That is the new CDC director, Susan Monarez. She used to be, well, the number two at ARPA-H, the health DARPA that Biden made, and one of her main priorities there was about
(01:03:31):
having, like, generative AI screen people for mental illness. And Trump, in his first term, promised, or called on, social media companies to stop mass shootings before they happen by going over people's posts and, like, analyzing them. And then he was considering this program for HARPA, which again was made
(01:03:54):
under Biden under the name of ARPA-H, to have people's social media posts analyzed by an algorithm to determine if they're mentally ill and might be violent at some point. And if they do determine that, you know, so don't get angry on social media, guys, but basically you could be ordered to, like, a court-ordered physician or, like, a rehab
(01:04:16):
place, or put under house arrest, by an algorithm. So the homeless stuff, I mean, people need to see it in the appropriate context. And also the move to federalize police: if that happens on a bigger scale, that just means the federal government can be like, well, we're going to have the federalized police outsource their police work to these specific AI algorithm companies, you know.
(01:04:40):
And they're also going to do it for the no-cash bail. They're going to say, oh, you get arrested, so Palantir's predictive analytics say, oh, you're this type of a threat, and you go in. Or, let's say they get you after the crime, doesn't matter. They bring you in there, and they go, well, you know, bail. They're going to determine your bail not by a money amount, but by the same analytics. Oh, well, you know,
(01:05:00):
you've reoffended three times in the past, and you live in this zip code where, you know, everybody's a criminal in your neighborhood, so we're gonna add that to the analytics, whatever, right? And that's gonna determine whether you go in or whether you get out. So, you know, for people that think that sending a mental health person and having no cash bail is better: I will take my chances with the prison.
(01:05:24):
I'm sorry, I'll take my chances, because at least if I am going to the courts, I still have, you know, I can still make a case, right, pro se or with a lawyer, etc. If you're mentally ill and they put you in conservatorship, it doesn't matter what you say. I don't care how lucid you are about it, you don't have agency as a human being anymore. Like, you don't
(01:05:45):
get to challenge it in court. So, I mean, that's terrifying to me. And to your point about AI and mental health, maybe this is one last thing we'll wrap up on. I know you've got to keep going in a little bit, but I come to find out I got a big chunk of my tutoring hours taken from me, because at one of the schools, this is the school where I tutor, right, they're going to contract
(01:06:07):
with a company called Upswing.
And Upswing beat you, John?
Yeah, it's fun. Somebody's gonna have to swing me up, because I'm gonna swing myself up by my bootstraps here when I don't have a job by the end of the year. So they're gonna do half and half now, okay. It's using
(01:06:30):
generative AI to tutor students, but it also gives them mental health checks, and it also will connect them with, like, what they call basic needs, like transportation, food, etc. There's a little tutorial on it when you go onto the company's site, and it's, like, got this really happy music, yeah. And, like,
(01:06:50):
then it just shows a text exchange between, supposedly, the student and the bot. And it starts off with, like, oh, you know, I need some help. And this AI is, like, overly affirmative. It's like, it's okay, a lot of students have that problem. Every response before it gives them the solution is like that. You know, to me, if somebody talked to me like that, I'm like, man, I'm an adult. Please don't, you know, please don't belittle me. But by
(01:07:13):
the end of the thing, I'll cut to the chase. By the end of this, you know, little video they made, the student doesn't say I love you to the bot, but it says, I can't believe you cared so much about me, and you're a total stranger, and it clicks the heart emoji. So, I mean, it might as well say it loved it. So, you know, now this thing is going to be, you know, not just replacing me, but it's
(01:07:37):
basically going to be checking the student's mental health. So it's also going to be collecting the students' education records, which fall under FERPA, and their health records, which is HIPAA, okay. So I don't even know how they're gonna do that. Does it have two different databases that they go into, or is it just allowed to aggregate them?
They'll probably just aggregate them. I mean, like you said way earlier, they want to remove the information silos, right? And
(01:08:00):
here's why I'm gonna lean towards that. I just, you know, I like to steelman my case. Here's why I'm gonna lean towards that: because half the companies that fund it are venture capital firms and impact investing funds, and I had some of them up. Luckily, the director is actually skeptical. I'm gonna go and meet with the director tomorrow, and so I've got some of these tabs up, and it's a little fresh in my mind.
(01:08:23):
But a bunch of them. Social Finance is one of them. I've written about Social Finance: in the previous piece, I mentioned that they were in some partnership with one of the impact investing companies that I wrote about in the first Project 2025 piece. But I also mentioned Social Finance in a video I did on UNESCO's Study 11, a follow-up to the article I
(01:08:44):
did for Unlimited Hangout. And Social Finance partners with, I think it's FutureLearn, which is this online learning company out of the UK that was partnering with Study 11 back in the day. And then it's showing you, like, on the site, it's like, oh, we improve students by this amount, this percent. It had all these different metrics that
(01:09:07):
it's going to give you about how the students are improving. Okay, so it's taking all that data, and it says, in an email they sent us, it's going to keep it not just for the student, but for the institution. So that means that what I see here is that the school is going to be leaning into social impact grants. I think social impact
(01:09:30):
grants are going to take off big, and you're going to have to have the AI, because in order to get the impact grant, you're not just going to do an old-school grant, like, qualitatively, like, hey, I got a good idea for the money you want to hand out? No, no, you're going to have to show very precise metrics, the types of metrics that can only be calculated and crunched with something like generative AI, at
(01:09:51):
least at that speed and scale. Okay. And so, in other words, the extra incentive for the school to get this Upswing isn't just to pay me less, to save money and not have me tutoring, but they're going to use that data to go, oh, look, we contracted with Upswing, and our students' mental health improved by this much, and their education outcomes
(01:10:12):
increased. So then they'll take that data and they'll write a Pay for Success grant proposal to some big tech company or venture capital firm or whatever, and then, you know, that will be how they finance this new education system. And so, I mean, he said we've got one semester, and I'm going to tell him, like, listen, I
(01:10:34):
don't know what we're going to do, but at the end of the semester, they're going to have a bunch of numbers to show that it's either as efficient or more efficient than paying us, and I'm pretty sure that by next year I'm going to be done tutoring at that school. I mean, they're not going to let me anymore. So maybe that's a happy ending.
I just think it's a signpost of where we're going
(01:10:56):
with AI, and how a lot of the hype around it... I mean, ultimately, what's happening is just the outsourcing of way too much stuff to AI. And when you start outsourcing, you know, education so extensively to AI... I mean, historically, you know,
(01:11:16):
student-teacher relationships were really important, you know. And so having that relationship be built instead with an algorithm, maybe they'll put it in, like, a little nice, cute robot casing or whatever, like they're doing with domestic robots. You know, they have similar videos, like the one you mentioned, for those too, where it's like, the kid is like, I love you, robot. It's, like, very creepy. But the idea
(01:11:38):
is to sort of, you know, prepare the future generations to have this sort of very extreme connection to an AI that knows everything about them and can predict what they're going to do. And, you know, it basically leads to a very Orwellian future when you consider, too, the impact of
(01:11:59):
wearables and some of these things that some of these, you know, Silicon Valley house philosophers like Yuval Noah Harari have said about, you know, hackable animals and the end of free will when the algorithm has studied you so extensively. And so even though a lot of these introductions of AI into schools and things like that are sort of being framed as having
(01:12:21):
kids learn how to use AI, it seems increasingly like where it's going to end up is it's going to be about AI learning how to use us most efficiently, not the inverse. And who does that benefit? You know?
I've said it. That was, like, my epiphany moment, why I even wrote the book. Because I was
(01:12:43):
sitting there and I was tutoring for, I'll say the company now, because the non-compete clause is way past, but it was Pearson. I used to work for Pearson. They had a subsidiary called Smarthinking, and guess what they paid me with my master's degree? They paid me $11 an hour. Now, you know, mind you, if I would have gone to five more years of school and got my PhD, I could've got a whole 12 dollars an hour. Now,
(01:13:03):
you know. But they had, one day, sent me an email and said, hey, IBM's Watson, which, you know, people don't know, was basically, like, the premier AI system before generative AI. By the way, it's named after Thomas J. Watson, who did business with Hitler and processed the punch cards for the Nazi concentration camps, okay. But so I'm looking at it, and I go, well, so
(01:13:25):
basically, every day I go to work, this thing's gonna basically data mine me and replace me. And I go, well, okay, so I start thinking, you know, you think down the line, you go, well, if it replaces me, if it can teach better than the teacher, then it can learn better than the student. In other words, right, anything I can teach the student, the AI can learn, right? It can do that just as well as it
(01:13:48):
can teach it. So not only are you making the teacher obsolete, you're making the student obsolete. Like, over time, whatever you might be gaining from the algorithm, or might so-called be learning from it, it's learning more and it's learning faster. So the fact that it's data mining you and you're basically feeding it is not a secondary effect. It is the primary effect. Because if I can think that from just being a
(01:14:12):
tutor that's making $11 an hour and going, like, well, let me see if I just go three steps down the line and know where the ultimate outcome is, then you're telling me these venture capitalists, the PayPal Mafia, these big technocrats, that they don't know that? Of course they do; these are people that write the code. Which goes back to why, you know, why I speculate that, you know, that timeline I gave you, from the Macy cybernetics conferences, through venture capital, through Moore's law, to
(01:14:33):
9/11, to where we're at now. You know, they were basically just doing their scenario planning, you know, RAND Corporation-style scenario planning, for 10 to 15 years at a time, right? The scariest part for me now, though, you know, is something that I never really thought of a whole lot, you know, when I was seeing how, eventually, I'm looking at it, the ed tech,
(01:14:55):
and extrapolating out, well, it's basically just building AI. And then you also look at, well, it's just one part of this broader panopticon. So the ed tech metrics, or the education metrics, and the healthcare metrics and the workforce metrics and the criminal justice metrics, it's all one system in the end. What I didn't anticipate was that people were gonna have emotional attachments to these bots, which is frightening,
(01:15:18):
because we watched a little short piece on, like, people that are dating their AI chatbots. Well, this product that I just mentioned, Upswing, by the end of it, it's advertising to the teachers, as if it's a great thing, that your student is going to have an emotional connection to a bot. Like, it's one thing for it to be pushing you around algorithmically; it's
(01:15:39):
another thing for you to actually bond with it. When that happens, I just, you know, I don't know. I don't know where we go at that point. That is terrifying to me.
Yeah, well, ultimately, you know, I mean, there's been a lot said about, like, the cognitive drain of regular people just using AI for whatever, you know. But I mean, I feel
(01:16:00):
like it's so different having, like, a grown adult use it, and the cognitive drain that causes, versus a kid learning stuff in schools and then having that cognitive drain happen, like, before they've learned very much at all, or, like, learned how to think, any sort of critical thinking. Not that schools are good at teaching critical thinking
(01:16:20):
anymore. But you know what I mean? Like, it's just, I feel like the developmental impact on something that's still in development versus something that's more developed, the type of dependency that'll create on AI, I think, is going to be really different and really significant and very troubling. And I just, the more I think
(01:16:41):
about AI, the more I kind of see it as, like, you're feeding stuff into it, you're, like, basically giving it your brain power, for it to become a big, giant, global brain. And then at the same time, it's, like, also taking all of these things, like the amount of pollution it's creating, it's
(01:17:02):
sucking up all the water, it's using all the power and, like, raising our power bills. Like, why? Why are people, like, still using this? I mean, not that I want to crap on people for using it, you know, but it's like, the more we become dependent on it, the more that trend is going to exacerbate, to the point where, like, we don't have brains left, we don't have
(01:17:24):
an environment left. And it just seems, like, crazy that there's really no talk about it, and there's just all this hype about it online, and also, like, AI slop everywhere. Like, I feel like even people in independent media are leaning so much on the AI. And, like, I don't use that at all, but I have, like, AI bots and, like, AI-using people clip up my
(01:17:48):
interviews, and they get, like, tons of views on YouTube. And it's like they put, like, clickbait titles of, like, things I've never said.
They're everywhere. The way it's narrated, too. It's like that, and like...
And they attribute things to me that I've never said. And so I'll have people be like, I don't like Whitney because she says this. And I'm like, I've never said that. And then they'll, like, link to the video, and I'm like, oh my god, I can't
(01:18:11):
get rid of them. I tried to go on James Corbett, like, a year ago to talk about this problem, but, like, I feel like I have to constantly remind people. And then, like, a lot of them will say, like, Whitney's final warning, or Whitney warns this will happen by June. And then there's another one that comes out two weeks later: this will happen by July. I mean, they, like, constantly come out. And I'm like, I'm not warning about anything with, like, a date.
(01:18:35):
I mean, it's so crazy.
I saw one where it cut you off. Like, you said something, and, like, halfway through the sentence, it literally goes to the narrator, and they finish your sentence. I was like, wait, what? You know, but the cadence of them and the way they, like, splice in these still images. Like, I've seen a lot of videos where I'm like, this is AI.
(01:18:57):
You know they are AI. It sucks. But, I mean, it's ultimately a weapon of perception at the end of the day, you know? And so it's like, people that like my interviews and my points, but I won't say what they want me to say, so they've taken AI to make it sound like I'm saying what they want me to say, even though I'm not saying it. Isn't that crazy?
(01:19:19):
To your point, uh, look, you know, you said, I don't want to poo-poo on people that are using it. But, you know, I think you've also had the black pill label thrown at you, like I have. Here, you want a black pill? I'll give you a black pill: keep using it. You know, I'm sorry, I don't want to, like, you know, poo-poo on people who are using it, because I have lots of friends and, you know, my
(01:19:41):
colleagues, etc. Here's the thing, though. Like, I expect the average consumer to use it, because it's, you know, convenient, etc., and, you know, if you just buy into the way it's commercialized. I expect the administrators and the people that run big companies to use it, right, because it's going to make them profit. I don't expect the people that I've had these types of conversations
(01:20:02):
with, people that are on the same wavelength, people that know about social credit, technocracy, transhumanism, all of this, people that know. Like, I was in a meeting the other day for a conference, right? And all nice people and everything, but somebody was supposed to take notes for the conference. Well, I didn't register it at the time; I'm looking at it, and it didn't mean anything to
(01:20:23):
me at the time. But one of the windows in the meeting was, like, something.ai, and it didn't have a person on it. So later they email out the minutes, you know, because the person thought that would be faster than writing them down. Well, that's fine enough, except that when you read the printout of the notes, it had three other metrics in there. One was, uh,
(01:20:44):
did you get the objective finished, right? So, like, whatever the title of the thing was, did you actually come to the solution? The two other ones: one was engagement, and the other one was, like, interest or something. So engagement was, like, how often were people participating? But the other one was, like, an emotional analytic, like, did we like it? So I don't know if it was just going qualitatively on the
(01:21:06):
things we said, or if it had facial recognition and was, like, looking at the facial expressions. But this is the point of this conversation, right? We're talking about part of it; we discussed my presentation and data mining and social... So if, you know, the few people that are aware of where this goes are just going to
(01:21:26):
lean into it, that's a black pill that you're going to swallow whether you want to or not, because, you know, I feel like I'm going to get dragged into this whether I use it or not.
The slipperiest slippery slope, I feel like. And the more you slip down the slippery slope, the harder it is to get back up.
(01:21:48):
So I know a lot of people, sort of in this space, like you're mentioning, you know, use it for only a few things. But eventually, if you don't check yourself frequently, your reliance on it grows and grows and grows, and then how do you get back to a point where you're not using it at all? You know, it just kind of gets harder for people. The
(01:22:10):
dependency thing is super real, and that's really troubling. And also, I mean, some of the stories about people, like, doing insane stuff and, like, starting new religions with ChatGPT and all this stuff. I mean, it's crazy. And having the idea of that being in, like, elementary schools, and telling elementary school kids to, like, develop relationships with this stuff,
(01:22:32):
and middle schoolers. Like, if adults are having, like, these psychotic episodes where, like, they're creating their own cult with ChatGPT and stuff, I mean, you don't think that's going to be even worse with, like, preteens and teens and kids, you know?
With trans... you know, like, so people that are worried about transhumanism and they're worried about Neuralink in the brain: shit, dude, you don't have to have anything plugged
(01:22:55):
into you to have, uh, technology take over half of your consciousness. All you've got to do is outsource half of your thinking to generative AI. Now, you know, I'm not a math teacher; maybe a large language model could help you with, like, numeracy, quantitative skills. But I'll tell you that, as a language arts instructor, as
(01:23:16):
an instructor of writing and rhetoric, if you're using generative AI, I don't care if it's just a Grammarly to, like, make your sentences more efficient, but definitely if you're doing it to organize your paper, or to make the connection between the thesis and the topic sentence, you know, more clear, if you're doing that, you're outsourcing
(01:23:36):
your thinking skills to generative AI. And if there's one thing that makes you a better writer, a better speaker, a better communicator, it's having an inner monologue and monitoring it. It's having an introspective awareness of whether or not the words in your head, right, the categories of your mind, match the categories of reality, being able to look
(01:23:58):
in and see, did I contradict myself, right? And so when you're writing a piece... I had a moment where it was like, you know, I wonder if I could just do, like, a rough draft, really, like, throw all my citations in there and just have it throw stuff in there. Well, the problem with that is that, like, for me to be able to have this conversation as efficiently as possible, a lot of that comes from the recursive process, not just of, right, throwing information on a page and having
(01:24:20):
something organize it, but, like, the process where I write, make sure that the thesis connects to each topic sentence, that each topic sentence has reasoning, evidence, warrant, that the summary statement connects, right, that there's transitions between each paragraph, that everything is cohesive, that I have a citation for each sentence, that there's no wasted words anywhere. That recursive process is not only what makes me better at
(01:24:40):
speaking and writing, but it also gives me the ability to actually have a cognitive map of the thing that I just researched and wrote about. So, you know, there is no way in which you can have this thing do part of your reading, writing, or speaking and not have the language muscles in your brain atrophy. There's no way you can do it,
(01:25:03):
right? And so at that point...
The studies that have been done on it basically show that when kids do that, they don't retain anything. Like, they're able to execute the task with AI, but they didn't actually, like, learn anything, and the information just doesn't stay. Like, they've learned literally nothing, you know.
And add the emotional attachment to it on top of that, it's
(01:25:25):
getting weird. And now, you know, they'll dress them up as anime girls, or, I guess, like the Grok one. And Grok also has an Alex Jones alter ego. So, you know, a chatbot for every segment of the populace, I guess. It's a mess. Well, John, I
probably have to run here, but it's been a great conversation,
(01:25:48):
I think. Do you have any final thoughts as we wrap up?
No, that was good. We covered pretty much the waterfront.
Okay, super. Well, where can people find your work and follow you and support you?
So all my most recent articles, you can always find at Unlimited Hangout. I'm always kind of slow on updating the website, so that's where you'll find the most recent articles.
(01:26:12):
I'm Taoist Professor on X, but the website is schoolworldorder.info, which has links to all the social media. There's also a link to TrineDay, where you can get my book, School World Order: The Technocratic Globalization of Corporatized Education.
Well,
thanks so much. And thanks to everybody that listens
(01:26:32):
to this podcast and that supports it. And if you found this content interesting, please share it far and wide, especially as social media algorithms become more restrictive and it gets harder to distribute important information like this. And yeah, thanks a lot, John, for being here. And thanks again to everyone who listened, and we'll catch you all in the next episode.