October 3, 2025 65 mins
John's website 
https://www.schoolworldorder.info/

Doors of Perception is available now on Amazon Prime!
https://watch.amazon.com/detail?gti=amzn1.dv.gti.8a60e6c7-678d-4502-b335-adfbb30697b8&ref_=atv_lp_share_mv&r=web

Doors of Perception official trailer
https://youtu.be/F-VJ01kMSII?si=Ee6xwtUONA18HNLZ

Independent Media Token 
https://www.independentmediatoken.com/

Merch
https://fknstore.net/

Start your microdosing journey with Brainsupreme
Get 15% off your order here!!
https://brainsupreme.co/FKN15

Book a free consultation with Jennifer Halcame 
Email
jenniferhalcame@gmail.com
Facebook page
https://www.facebook.com/profile.php?id=61561665957079&mibextid=ZbWKwL

Watch The Forbidden Documentary: Occult Louisiana on Tubi: https://link.tubi.tv/pGXW6chxCJb

C60 PurplePower
https://go.shopc60.com/FORBIDDEN10/
or use coupon code knowledge10

FKN Link Tree
https://linktr.ee/FKNlinks

Forbidden Knowledge Network 
https://forbiddenknowledge.news/ 

Make a Donation to Forbidden Knowledge News 
https://www.paypal.me/forbiddenknowledgene
https://buymeacoffee.com/forbidden

Johnny Larson's artwork
https://www.patreon.com/JohnnyLarson

Sign up on Rokfin!
https://rokfin.com/fknplus

Podcasts
https://www.spreaker.com/show/forbidden
Available on all platforms 

Support FKN on Spreaker 
https://spreaker.page.link/KoPgfbEq8kcsR5oj9

FKN ON Rumble
https://rumble.com/c/FKNp

Get Cory Hughes books!
Lee Harvey Oswald In Black and White 
https://www.amazon.com/dp/B0FJ2PQJRM
A Warning From History 
Audio book
https://buymeacoffee.com/jfkbook/e/392579
https://www.buymeacoffee.com/jfkbook
https://www.amazon.com/Warning-History-Cory-Hughes/dp/B0CL14VQY6/ref=mp_s_a_1_1?crid=72HEFZQA7TAP&keywords=a+warning+from+history+cory+hughes&qid=1698861279&sprefix=a+warning+fro%2Caps%2C121&sr=8-1
https://coryhughes.org/

YouTube 
https://youtube.com/@fknclipsp

Become Self-Sufficient With A Food Forest!!
https://foodforestabundance.com/get-started/?ref=CHRISTOPHERMATH
Use coupon code: FORBIDDEN for discounts

Our Facebook pages
https://www.facebook.com/forbiddenknowledgenewsconspiracy/
https://www.facebook.com/FKNNetwork/

Instagram 
@forbiddenknowledgenews1
@forbiddenknowledgenetwork

X
https://x.com/ForbiddenKnow10?t=uO5AqEtDuHdF9fXYtCUtfw&s=09

Email me
forbiddenknowledgenews@gmail.com

some music thanks to:
https://www.bensound.com/

Become a supporter of this podcast: https://www.spreaker.com/podcast/forbidden-knowledge-news--3589233/support.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:02):
I think it's just going to get weirder and weirder
and weirder, and finally it's going to be so weird
that people are going to have to talk about how
weird it is. Eventually people are going to say, what
the hell is going on. It's not enough to say
it's nuts. You have to explain why it's so nuts.

(00:23):
The invention of artificial life, the cloning of human beings,
possible contact with extraterrestrials. The systems which are in place
to keep the world sane are utterly inadequate to
the forces that have been unleashed.

Speaker 2 (00:46):
Welcome back to Forbidden Knowledge News. I'm your host Chris Matthew.
Today my guest is John Klyczek. First, be sure and
check out my films. October spooky season is the perfect
time to watch those. Doors of Perception is available on
Amazon Prime. Occult Louisiana is available on Tubi, the Roku Channel, Apple,
and more. We are booking guests for December. If you

(01:10):
have suggestions or you'd like to be a guest, email
me at forbiddenknowledgenews at gmail dot com. Today, I want
to welcome back to the show John Klyczek. He is the
author of School World Order: The Technocratic Globalization of Corporatized Education. John,
welcome back.

Speaker 3 (01:28):
How you doing? Okay, Chris. Thanks for having me.

Speaker 2 (01:31):
Thanks so much for coming back on today. We're
going to be sharing information from a recent article you
published on Unlimited Hangout titled From Project 2025
to the PayPal Presidency: School Choice Fintech for a Blockchain
Social Credit Economy. There's a lot to get to with this,

(01:52):
and most people hear things like Project 2025,
these other agendas that are rolling out, and because there's
this brainwashing that's occurred with a lot of the conservative
right right now, that Trump and this administration can't do
any wrong, they are just letting all these agendas

(02:14):
roll out gladly, not questioning anything, not looking any deeper
into some of this stuff, which you have done a
great job doing. We're gonna get to that. Before we do,
it's been a little while since you've been on, so
remind the audience just a little bit about yourself and
let them know how they can find out more.

Speaker 3 (02:32):
Yeah. So, I'm the author of School World Order: The
Technocratic Globalization of Corporatized Education. I published that in twenty
nineteen; it really took off during the lockdown years. The foreword
was written by my mentor, Charlotte Thomson Iserbyt. She also
wrote the famous Deliberate Dumbing Down of America. She worked

(02:53):
in the Department of Education under Ronald Reagan. She blew
the whistle on this technology project known as Project BEST,
that's Basic Education Skills through Technology. And there was
this plot to set up public private partnerships between the
Department of Education and big tech corporations in order to
implement computerized programmed instruction that would use operant conditioning algorithms

(03:15):
to psychologically condition students for workforce training and a planned economy.
And so that's sort of my background. She passed away
in twenty twenty two. And what you see behind me now,
I don't know if you saw this before, Chris. I
think last time I still had my old setup; this
was behind me with a curtain back there. And now,
that's Charlotte's library, most

(03:36):
of it is, let's say about ninety-plus percent of it.
And then I got a bunch of her files and stuff.
And so people can check it out if you want to
subscribe to my database. I upload her files,
you know, about every month, and sometimes I do a bonus.
So that's a good overview. And I'm a
public educator. I've been teaching at the college level for

(03:58):
since about twenty ten, so getting close to
fifteen years now. Right on.

Speaker 2 (04:03):
Thank you for the intro. I'm looking forward to getting
into this because, like I said earlier, people hear things
like Project 2025, the One Big Beautiful Bill, and
they get excited because it's Trump doing this, and Trump
can't do any wrong, and we just have to blindly
allow whatever agenda he's trying to put forward go through

(04:23):
because he's gonna save us all. But in reality, there
is a lot behind what is rolling out right now,
and unfortunately it doesn't seem that education is one of
those big issues that people are looking into at all.
But let's get into your article. Tell us a little
bit about what you focused on.

Speaker 3 (04:46):
So this article is building on a piece that I
wrote on I think it was published in February earlier
this year, and that was just looking at how the
Trump administration was pursuing the school choice provisions that
were put forth in Project 2025, and so
those would be two particular provisions. So one is, back

(05:08):
in the day, we used to call these school choice
provisions vouchers. So they're taking public moneys and they're sending
them to private institutions. So the vouchers were typically for
private schools, religious or parochial. Now they've got these other
mechanisms known as education savings accounts and then tax-credit scholarships.
These are not the same as the education savings accounts

(05:30):
that have been codified under the IRS forever, and these
are not scholarships like you would just earn based on merit.
These are basically what they call neo vouchers. And so
what I saw was that a lot of the people
that were promoting these policies were, well, not just,
in particular, the authors of Project

(05:53):
2025's Department of Education chapter, but the think tanks behind
that project and those authors. So these are a consortium
of what are known as the State Policy Network think tanks.
That's a consortium of think tanks that are funded
by dark money, largely from the Koch brothers, and these
include the American Enterprise Institute, American Legislative Exchange Council, Cato Institute,

(06:17):
EdChoice, and the Heritage Foundation is a good cross
section of all those. So what I focused on in
this most recent piece was, as I was doing the
research for the prior one, I'd noticed that a lot
of the individuals, some of them that are directly involved
in Project 2025, but certainly the think
tanks behind it, were promoting the use of digital wallets

(06:39):
to distribute those school choice funds. And you know, it
really stuck out to me because I'd noticed that we
are all also witnessing the rise of this stablecoin,
sort of blockchain crypto economy. And so what you see
here in this piece is sort of the convergence of
like the old school arm of the Republican establishment, that

(07:00):
being like the neoconservative Beltway libertarian think tanks that
I just named, and then sort of the new arm,
the futurist, Silicon Valley arm of the
Republican establishment. The epicenter of that is basically
the PayPal mafia in particular, people like Peter Thiel,
Elon Musk, David Sacks, and then others in that orbit

(07:20):
such as Marc Andreessen. So that's sort of a
decent overview of the piece, basically looking at the convergence
of those two arms of the Republican Party pushing these
new public private school choice partnerships to bring in blockchain
stablecoins that you could then data-mine social credit
analytics from.

Speaker 2 (07:41):
Right on, Well, let's get deeper into that and reasons
why people really need to dig into this and pay
more attention. You mentioned the social credit aspect. This is
baked into everything. This is a huge part of it, right, Yeah.

Speaker 3 (07:56):
Yeah. So the social credit element here is going to
come in through the feedback loops between the
financial technology, or fintech, inputs and then the education technology,
or ed tech, outcomes. So basically, by hooking up
artificial intelligence to the digital wallets, you can
sort of data-mine, basically, what are the best

(08:23):
voucher programs, or what are the best ways to spend
these stablecoin tokens in terms of education to
get the best outcomes, right, and so basically, based
on how well the student performs on the ed tech products.
So those vouchers, right, the neo vouchers we'll say, the

(08:45):
education savings accounts and the tax-credit scholarships, these can
be used for more than just tuition for a private school, right;
they can also be used for a basket of other
educational products and services. So these include various ed tech
products that I'm sure we've discussed on our
prior episodes, but maybe we'll dive into that later
when we talk about sort of how the PayPal mafia

(09:06):
is invested in those as well. But from those
fintech digital wallets, the student gets the money, they
purchase an ed tech product, the data comes out
to show what they've learned and what career pathway they
should be put on, what curriculums and modules they'll need
to achieve the objectives of that career pathway, and then

(09:27):
based on how well that student performs, right, that's
the feedback, is going to be where the distributors
of the voucher, neo voucher, moneys are going
to look at what's the best way, or what
are the best products and services, that that second set
of school choice funds should be programmed for. So the

(09:48):
stablecoins are basically, you know, they're cryptocurrencies that are
backed one to one with the US dollar, but they
have all the negatives of a CBDC, a central bank
digital currency, so that means programmable, trackable, traceable. So those
ESAs and scholarships, the neo vouchers, with the stablecoins,

(10:10):
you could program them so that they can only be
used to purchase certain ed tech products and services, and
that programming would be based, again, right, on that feedback loop, right.
So, based on how well the student does, that's going
to determine what types of coins they get, what those
coins can be used for, and then, based on how
they perform on those purchases, right, you continue that,

(10:30):
and then you get these predictive analytics that tell the
student: this is the career pathway that you have to
move through.
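
To make the token mechanics he's describing concrete: below is a minimal Python sketch of a "programmable" voucher of this kind, funds that can only be spent on an approved list of products, with that list re-programmed from performance data. Every product name, score, and rule in it is an illustrative assumption, not a detail from the article or the interview.

```python
# Minimal sketch of a "programmable" voucher token: spending is limited
# to an approved list, and the list is re-programmed from outcome data.
# All names, prices, scores, and rules are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class VoucherToken:
    student_id: str
    balance: float
    allowed_products: set = field(default_factory=set)  # spend restriction

    def spend(self, product: str, price: float) -> bool:
        """Reject any purchase outside the programmed allowlist."""
        if product not in self.allowed_products or price > self.balance:
            return False
        self.balance -= price
        return True

def reprogram(token: VoucherToken, performance: dict) -> None:
    """Feedback loop: outcomes on ed tech products decide what the
    next round of funds may be spent on (a hypothetical rule)."""
    token.allowed_products = {
        product for product, score in performance.items() if score >= 0.7
    }

# Usage: a student's scores on two products re-program the token.
token = VoucherToken("student-42", 500.0, {"math_courseware", "reading_app"})
token.spend("math_courseware", 99.0)         # allowed this round
reprogram(token, {"math_courseware": 0.9, "reading_app": 0.4})
print(token.spend("reading_app", 49.0))      # False: programmed out
```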

Speaker 2 (10:36):
Talk a little bit more about fintech and its role
with all of this.

Speaker 3 (10:44):
Yeah, so you know, fintech as a broader industry, I mean,
the origins essentially come straight out of the PayPal mafia.
And so you know, if we go back to the origins,
we could look at something like Fieldlink, which is
more or less the first, well, commercially available platform for
online purchases. PayPal was sort of the first for, like,
peer to peer. But Fieldlink became Confinity, and then

(11:07):
Confinity was merged with X.com. That was Elon
Musk's fintech platform before he basically rebranded all of his
new companies under the X model, including the former Twitter.
So Thiel's Confinity and Musk's X.com, they get
merged together to form PayPal. Later PayPal merges with eBay,
Marc Andreessen being on the board of directors there. And

(11:30):
PayPal sort of was, I guess, the foot in the
door for the digital wallet industry, and so now we
have sort of an array of digital wallet companies that
are sort of expanding on this infrastructure, putting in place
the necessary pieces where we can install the blockchain distributed

(11:50):
ledger technologies. So that means that, instead of just
a digital wallet being a way to transfer funds, right,
the digital ledger, the blockchain ledger, has embedded in it
basically data tracking for all the metadata: where and
when, you know, who took the funds, where it was
transferred to, when it was transferred, how much was transferred,

(12:13):
what product it was. All that data can be logged on
this blockchain ledger, which then can be dual-purposed as
essentially a digital ID, especially in the realm of
social credit, right, but also in education. And so
this digital wallet industry is basically putting in the next
building block that's going to enable the new stablecoins

(12:36):
to be the main medium of currency, so to speak,
to distribute not just education funds, school choice funds. Take
a company like ClassWallet, which is a digital wallet company.
This one started specializing in education funds, but then,
if you go to the website, they're branching out
into emergency relief funds through FEMA, housing vouchers through HUD,

(13:01):
health tokens through HHS. So they're going to be branching
out to essentially facilitate or distribute public funds from all
government agencies. And if you look at another digital
wallet company, known as BenefitWallet, that's sort of an
inverse case where it began focusing on health savings accounts

(13:22):
distributing basically health care monies, and then it branched out
into school choice money. So what we see is that,
if we zoom in on school choice and the
digital wallet sort of infrastructure there, it's just one aspect
of this broader fintech overhaul throughout the entire economy, especially

(13:44):
beginning with and emphasizing the public end of the public
private economy, so to speak.
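
For readers who want to picture what logging "all that metadata" on a blockchain ledger means, here is a bare-bones Python sketch of an append-only, hash-chained log: every transfer records who, where, when, how much, and what product, and the same records double as a profile keyed to one identity. The field names and single-writer design are simplifying assumptions; a real distributed ledger involves far more machinery.

```python
# Bare-bones sketch of an append-only, hash-chained transaction log.
# Field names are illustrative assumptions, not any real chain's schema.

import hashlib, json, time

class Ledger:
    def __init__(self):
        self.entries = []

    def record(self, sender, receiver, amount, product):
        # Each entry commits to the previous one via its hash,
        # making the history tamper-evident (trackable, traceable).
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "from": sender, "to": receiver, "amount": amount,
            "product": product, "timestamp": time.time(),
            "prev_hash": prev_hash,
        }
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)

    def profile(self, person):
        """The 'digital ID' angle: all activity keyed to one identity."""
        return [e for e in self.entries if person in (e["from"], e["to"])]

ledger = Ledger()
ledger.record("treasury", "student-42", 500.0, "ESA disbursement")
ledger.record("student-42", "vendor-7", 99.0, "math_courseware")
print(len(ledger.profile("student-42")))  # 2: every move is on the record
```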

Speaker 2 (13:51):
So they're basically paving the way for this technocratic takeover
of everything. But this what we're talking about is just
a small focus on the education system, but it has
the capability for larger and broader applications. Talk a little
bit about the involvement with companies like Palantir, Peter Thiel,

(14:14):
the integration of AI with not only education systems but
law enforcement things like that. This is all interconnected, right, Yeah.

Speaker 3 (14:26):
So when I first wrote my book and I started
looking at sort of the ways that some of the
education technologies were used, some of them were used for, like,
assistive technologies, so this means for students with disabilities, so
they would basically qualify as, like, healthcare technologies, so that
would be under HIPAA instead of FERPA. And so as
I'm looking at it, I'm going, like, well, look, you
know, there's going to have to be some legislative

(14:48):
overhauls here, right, I mean, because you can't
mix those two baskets. But what I saw was,
where, with just the ed tech industry, once
all those pieces got sort of mainlined, you
would be sort of conflating health and education and
criminal justice together. And so it was when I got
a peek at the sort of Journeyman TV documentary

(15:11):
the showcase the Chinese social credit system, was when I
was like, Oh, that's what it's gonna look like when
it all comes together. And what I'm looking at here
in education is just one It's just basically one part
of that ledger, one part of that ubiquitous comprehensive biocycle
social ledger. And so all these things that I'm discussing

(15:35):
in education that I'm focusing on in my wheelhouse, similar
things are happening in healthcare, and similar happening things are
happening and and and in criminal justice, and all those
things are gonna converge into again, right, All those all
those different data tracks, all those different parts of the
ledger are going to be part of one digital ID,
one one profile. So you asked, how does you know

(16:00):
did the field verse fit into this? Well, Uh, as
far as the fintech and and these and these characters
are all invested in the ed tech as well, Let's
let's focus on the fintech for just a second. If
you look at a company like well Odyssey and Merit
International is a good sort of case study or studies.

(16:20):
So Odyssey, the digital wallet company, is funded by Andreessen Horowitz;
that's Marc Andreessen's venture capital firm. It's
also funded by Bleen Capital and Tusk Venture Partners. Now
Tusk Venture Partners, along with Andreessen Horowitz, they both share
common investments with Peter Thiel's Founders Fund, and Bleen Capital

(16:41):
is invested in Palantir, which is Peter Thiel's AI
analytics company. Then Marc Andreessen's VC, Andreessen Horowitz,
also finances Merit International along with Alumni Ventures, and
Alumni Ventures also has common investments with Peter Thiel's Founders Fund.

(17:01):
So that's just one sort of cross section, kind
of zooming in to see sort of how the venture
capital firms of Thiel and his PayPal technocrat buddies are
invested in these digital wallet companies. But if we look,
there's several ed tech products they're invested in
as well. So they're right on the fintech input end

(17:23):
and on the ed tech outcome end. And so, companies like
Knewton and Clever, these are adaptive learning courseware companies. I'm
pretty sure we talked about the Skinner box and how
that was the analog version of what is
today the digital teaching machine, right, the adaptive learning courseware
that uses those operant conditioning algorithms that I mentioned Charlotte

(17:44):
blew the whistle on in Project BEST. And so Peter
Thiel funds Knewton and Clever. There's another one called Symbolab
that has been acquired by Course Hero, Course Hero
being funded by David Sacks. Then you have the biofeedback
wearable technologies. These still use stimulus-response and operant conditioning,
but instead of cognitive behavioral, or thinking, algorithms like the

(18:07):
adaptive learning courseware data-mines, the biofeedback wearables
are data-mining the emotional algorithms. So they're going to
infer emotions based on data-mining the student's heart rate,
brain waves, the galvanic skin response, which is basically electrical conductivity,
and then also making inferences about emotion based on facial
recognition scans. So a company called Affectiva, which largely

(18:32):
emphasizes the facial recognition, social emotional learning stuff,
but they did have a GSR bracelet at one point,
a galvanic skin response bracelet. I don't know if they still
have it, but that company partners with, makes, co-creates
online content with, an online learning platform called Udacity,

(18:54):
Udacity being funded by Marc Andreessen. And then, sort of
at the apex of the pyramid, if you look at
my book, sort of what I lay out is the
progression of a series of evolving technologies, moving from screen-based
adaptive learning courseware, to wearable social emotional learning biofeedback
devices, eventually to implantables. But in the interim, right,

(19:16):
the cognitive behavioral and social emotional algorithms get data-mined, aggregated,
fed through machine learning and deep learning into a large
language model through neural networks, and you get generative AI.
So we're essentially getting close to the end of the
book here. But then something like OpenAI, which has
its own, basically, tutor bot, and it provides that

(19:37):
service to Khan Academy for its tutor bot known as Khanmigo,
it can do all that stuff. It can data-mine
the brain waves, I'm sorry, the social emotional algorithms, and
the cognitive behavioral algorithms, while also interacting with the student
through natural language. And OpenAI was co-founded by Elon Musk,
and Elon Musk has also funded Khan Academy, which uses

(20:01):
OpenAI. So, sort of, the PayPal mafiosos are
invested in both ends of that feedback loop, the analytics, and
then obviously the AI as well, with somebody like David Sacks.
I mean, so, Thiel's Palantir we noted, Musk's OpenAI
we noted. Something else worth noting is that David Sacks
was appointed under Trump as the White House AI and

(20:24):
crypto czar, and actually his senior policy advisor for AI,
Sriram Krishnan, used to work for Marc Andreessen.
And so, in charge of the AI policy
at the White House, you have PayPal mafia folk as well.
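
As a rough illustration of the inference step attributed to the biofeedback wearables, here is a toy rule-based mapping from heart rate, galvanic skin response, and EEG band power to an inferred emotional state. The thresholds and labels are invented for illustration; no vendor publishes rules this simple, and the ambiguity the example shows is exactly the accuracy problem John raises later in the conversation.

```python
# Toy illustration of emotion inference from biometric signals.
# Thresholds, units, and labels are invented for illustration only.

def infer_emotion(heart_rate_bpm: float,
                  gsr_microsiemens: float,
                  eeg_beta_ratio: float) -> str:
    """Crude rule-based guess at affect from three biometric signals."""
    arousal = (heart_rate_bpm > 95) or (gsr_microsiemens > 8.0)
    focus = eeg_beta_ratio > 1.2   # more beta-band power when concentrating
    if arousal and not focus:
        return "anxious"           # high arousal, low concentration
    if arousal and focus:
        return "engaged"
    if focus:
        return "calm-focused"
    return "disengaged"

# The same arousal reading flips label on one EEG feature,
# which is the ambiguity problem with inferences like these.
print(infer_emotion(98, 9.1, 1.0))   # "anxious", or maybe just excited
print(infer_emotion(98, 9.1, 1.5))   # "engaged"
```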

Speaker 2 (20:44):
Let's break this down for people like myself that aren't
technologically educated and aren't invested a lot with this techno stuff,
but would really benefit from knowing more about it, when
we're talking about things like fintech and ed tech.
Maybe we could break this down and give real world examples of how
this would look being implemented into a parent's life.

Speaker 3 (21:16):
Yeah, you know, I think, uh, I'll give
you an anecdotal example. This is actually for college students,
but this is available for different institutions. So at
one of the schools where I teach, I just got
a big chunk of my hours replaced by an AI
bot known as ANNA, and it comes from a company

(21:37):
called Upswing, and it also does mental health services, by
the way, with the same bot. It's not two bots; you
don't just get an ed tech bot. And this is
what I was talking about when I was like, how
are they gonna do this? Because technically, like for anybody
that's taught, you'll know that, like if a student has disabilities,
you can't even ask them. You put on
the syllabus: if you have a disability, go tell the

(21:58):
disability services and then they will provide me with your accommodations.
Because, right, that's a violation of HIPAA if I
started asking people about their medical records, right. So
can't do that as a teacher, but the bot can, right.
And so, the thing about this is,
basically, what I think they're gonna have is they're
gonna have a bot that will follow you around, because

(22:20):
basically this thing can do everything. You know, as
I mentioned, the adaptive learning, the screen-based stuff, and
the wearable stuff, they're more or less just building blocks.
Now, if you use the AI bot, you kind of
got it following you around. This one
that they have hired is supposed to help
the students with everything from daily planning and organizing
to actual tutoring. It could be any discipline. And the

(22:44):
mental health services as well. But basically, one
of the things that it promotes is that
it keeps track of all the outcomes and the progress.
Well, that's the feedback loop. That's the adaptive learning stuff, right.
And, you know, they boast that they can do it
not just by student; this is part of the selling point
for the school. So it's not just individual analytics for

(23:07):
each student, but they can do it by group, so
like age, maybe freshman, sophomore, et cetera, race, gender, sexuality,
institution-wide, all this stuff, so they can get all
these different sets of data not just for the student
to help, you know, to keep track of the student's improvement,
but for the institution to keep track of the improvement.
And I don't want to digress, but maybe we'll talk

(23:28):
about how that relates to impact investing and sort of
the securitization of human beings. But I think that where
we're going with this is going to be, basically... so,
well, I talked to the guy that cuts
my hair. He mentioned that Mark Zuckerberg just recently showcased

(23:52):
this new set of Meta goggles, and basically, as
far as I understand, you can prompt the AI through
the AR. You can make FaceTime calls through your goggles,
but, apparently, you can just think it. So I
don't know how that works, but it sounds like it
has an EEG built into it. So basically, I would

(24:13):
guess this is how it works. If I verbally command
the glasses, say, hey, call Bob, right, it sees
that, right, and hears the command. Well, if it's got
the EEG implanted in the stem of
the glasses and it can read my brain waves, then
it can pick up on the signal, right, of
the electrical patterns in my brain when I

(24:35):
speak that. So then if I keep my mouth shut
and I say it in my head right, it should
light up essentially those same signals. And so this is
how it can make this inference. Now, my point bringing
this up is, I think we're getting close to the
point where... this is, you know, this is all in
the book. Like I said, we're getting close to the
end of it here. It's all, you know... I started

(24:56):
writing it in twenty sixteen, so it's like, you know,
it was a future forecast. It's like a ten-year
fast-forward, man. But essentially, what I think
you'll have is, I mean, and this is basically
what these glasses have: there's an AI bot in it, right,
it's got basically the social emotional stuff
for ed tech, that's what it will be called. But in

(25:17):
like fitness, it's basically just a Fitbit, right, it's
doing the same thing. It's just, what types of metrics
are you trying to infer from it? And the
thing can keep track of, you know, your daily
activities, so it can track the patterns. And, you know,
if you're trying to help it, that's what you'll have:
you'll have some bot that follows you around all day
on some kind of a wearable, whether it's goggles or

(25:38):
a set of wearables. Maybe you can
help the bot get more data if you have a
ring and a watch and, you know, all the different
things, to get the heart rate, you know, get a more
precise reading of all the different biometrics. But that's where
we're going, and we're pretty close, we're
pretty close to it now. But if you wanted more

(25:58):
practical examples of what that has looked like,
you know, it probably didn't look like much,
especially in the early days, with, like, the adaptive learning
platforms like Khanmigo and things like that that have, you know,
just sort of these online modules, because most of those
things that I'm talking about are happening on the back
end of it. You know, I think to people
it probably seems as innocuous as behavioral advertising

(26:24):
feels to a lot of people in general, right? So
it's like, you know, you're using your Facebook or whatever
social media you're on, you know, and these same
types of algorithms... you know, if you look at a
company like DreamBox, they say that their adaptive learning
algorithms, that basically do the predictive analytics and personalized learning
and all that, are not just based on Skinner's operant

(26:48):
conditioning algorithms; they're effectively the same as
the behavioral advertising algorithms that, like, Netflix and Amazon use.
So, you know, it's effectively the same thing.
It's just, instead of the feedback loop
offering you products that you might like to buy or
videos that you might like to watch, it's going to
offer you the next lesson that it thinks you need

(27:10):
to get to the career pathway that it thinks you
are best suited for, right. And so the wearables,
that isn't something that probably a lot of parents have seen,
like, embedded in, you know, those basic modules.
It might have the adaptive learning algorithm in it, but
where you might have seen it is for helping. Again,

(27:30):
this is where it would be assistive technology, like
at one school where I teach, at the wellness center,
we use HeartMath, and HeartMath is
an EKG wearable. It data-mines the heart rate. But
basically, the way it's pitched for schools... it's used
not just for schools, it's used, like, business meetings and

(27:50):
things like that, like spirituality. There's kind of a New
Age Global Coherence Initiative that's, you know, a whole separate
use for this, but I don't
want to digress into that. But what they do is,
they put it on. It's got a trademarked breathing meditation.
So apparently you got to get trained to be a
HeartMath coach. And you can't...

(28:13):
You can't teach the breathing to somebody because, that's true,
you know, that's owned, patented or whatever, trademarked. But so
you give the student the wearable and they breathe,
and the device is gonna tell the student, as
if you couldn't feel it, that you're less anxious, that
your heart rate has lowered. But, you know, maybe

(28:34):
if you're super anxious, you can't, and maybe it
helps you, like, dial it in. But it's gonna tell you, hey,
your heart rate is level enough now.
Now you're good to take the test, right. So before
you go to take the test, you sit down, you
do this thing, and once it goes into, I
guess, what they call heart coherence, then,
you know, the wellness coordinator would come in, like, okay,

(28:56):
test time, right, and so those are two like more practical,
immediate examples of what some of these technologies look like.
But I think we're getting very close to
the place where, yeah, the student's gonna have a personalized
AI bot that follows them around through a series of devices,
and it's pulling all that data and personalizing what the

(29:20):
student is allowed to learn based on that.
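
The coherence-gate workflow described for the HeartMath-style wearable can be pictured with a short sketch: derive a steadiness score from beat-to-beat heart intervals and only clear the student for the test once it crosses a threshold. The scoring formula and threshold here are crude stand-ins for the proprietary coherence metrics real devices compute.

```python
# Sketch of a "coherence gate": a steadiness score computed from
# beat-to-beat (RR) intervals decides when the student may test.
# The formula and threshold are illustrative stand-ins only.

from statistics import pstdev, mean

def coherence_score(rr_intervals_ms: list) -> float:
    """Higher when the heart rhythm is steady; a crude proxy for
    the spectral 'coherence' metrics real devices compute."""
    variability = pstdev(rr_intervals_ms) / mean(rr_intervals_ms)
    return max(0.0, 1.0 - variability * 10)   # 1.0 = perfectly steady

def ready_for_test(rr_intervals_ms: list, threshold: float = 0.8) -> bool:
    return coherence_score(rr_intervals_ms) >= threshold

anxious = [920, 610, 1040, 700, 980]   # jittery beat-to-beat gaps (ms)
settled = [820, 810, 830, 815, 825]    # steady after breathing exercise
print(ready_for_test(anxious))   # False: "keep breathing"
print(ready_for_test(settled))   # True: "you're good to take the test"
```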

Speaker 2 (29:23):
This sounds like science fiction. It's essentially already here, but
our data seems to be the new currency, and it
seems to be the most important thing the parasitic ruling
class is after with these biofeedback wearables. People don't realize

(29:44):
that this is going to be used against them in
every way possible. It's going to be integrated with the
social credit nonsense, with the new healthcare, which is probably
going to be integrated with AI and all this other
nonsense that we're talking about as well, and there will
be no escape from that system if you want to

(30:04):
participate with certain things.

Speaker 3 (30:08):
Yeah, I mean, to your point about the healthcare,
let's zoom in on that for just
a second, because I've actually written about the Precision Medicine
Initiative that Obama pushed forward, and the emphasis on this
was largely personalized medicine. Now, the emphasis was largely
on, that means, gene-based. So, right, we're gonna give

(30:30):
you a personalized treatment based on your DNA signatures. Now,
part of this, though, is also the AI analytics, and
there was a part of the Precision Medicine Initiative,
like an offshoot project, called All of Us, and that
was headed by a guy by the name
of Eric Dishman. The reason
why he was signed up for the Precision Medicine Initiative

(30:52):
and All of Us was that his background was in
developing smart homes for, like, veterans and Alzheimer's patients. So, right,
for people that can't remember to take their medicine and things
like that, it had all these sensors and things to
remind them. Okay. So he had a background in
sort of creating, you know, an AI environment

(31:13):
that could data-mine people's behaviors, or at least track
them and keep track of it. So what All
of Us wanted to do, this is basically an
epigenetic project, okay. So that means that they wanted to
have people use wearables. Remember, Bobby recently said this.
He recently said that he wants everybody to use
wearables, and that way they can, right. And so,

(31:34):
And when I heard that, I'd known for a while
that he was invested in some, like, AI and
impact investing and things like that, where it was
kind of like, hmm, I mean, let's give
him a chance to see what he does, right,
because I liked his stance on the lockdowns,
you know, and I was, like, you know, keeping my
eye on it. And pretty quickly, early on, I was like,

(31:55):
I think he's gonna go with this Dishman project. And,
you know, if you go back to my interviews, you'll
see that I kind of called it before he
officially announced it. But that's what it is. It's the
same thing. So basically, what they're gonna do is,
they'll take you, they'll scan your genes, right,
because genetic expression is not necessarily causative, right?

(32:16):
Like, if you have, what is it, the BRCA,
whatever, the breast cancer gene, right, it doesn't mean
you're going to have breast cancer. It means you are
more susceptible to it, right. So what they
want to figure out is what types of behaviors, environments, lifestyles, consumption habits,
et cetera, trigger the expression of a particular gene. So

(32:39):
they scan you, they go, oh, you've got these genetic proclivities.
They watch what you do: you put the wearables on,
they see, you know, how much you exercise, what are
you eating, you know, how much sunlight do you get,
all this type of stuff, and then they're gonna go,
based on that, right, they're gonna try to figure out
what are these things that trigger those expressions. And I
think that one of the ways that

(33:01):
they'll get people on board with
this is they'll probably tokenize it. So, you know,
first, to get you to put them on,
like, you get a deal on your insurance rates, right?
It'll be kind of like, have you ever seen,
this was, like, a foot in the door for social credit
a long time ago, the smart or safe driving? Yeah,

(33:24):
so the insurance, yeah, the incentive side. So it's like,
like you know, you only get penalized when you do
bad stuff, but what if you do so good all
the time, shouldn't you earn points for that? And that's
how they'll do medical insurance. They'll be like, hey, we'll
bring your rates down if you show us that you know,
you eat right, you exer So put these wearables on.
You can earn points right and then or tokens what

(33:46):
however they want to monetize and quantify it, right. But again,
this is just gonna be part of your ledger. So
you're gonna have the education data, you're gonna have
the health data. And they have wearables for the workforce stuff too.
I think they call it bossware. So, like, you know,
especially if you're doing online work, they want
to make sure you're not on Facebook or daydreaming. So
you put in earbuds, an EEG wearable, and somehow it's

(34:08):
gonna infer whether or not you're paying attention.
You know, just as a side note, I do
have to say that I'm very skeptical of
the accuracy of the inferences you can make based on
things like heart rate and brain waves and facial expression, because, like,
especially me, especially if I'm reading or
(34:30):
writing like I look, I look angry sometimes like and
people be like he always looks at He's like, oh,
he's the nicest guy to them. They're like, yeah, I'm
just like I'm focused, you know what I mean. So
I don't know, like if people look at me and
think that when I'm thinking hard about a problem where
I'm focused, did I look upset? Like what's the AI
gonna do? Like, you know, we're gonna get some kind
of a red flag goes off because I'm thinking too

(34:51):
hard or something. So, sorry, that's like a side note,
but yeah, this is what we're seeing.
We're seeing this, and then, you know, in the
realm of pre-crime, you know, now that we've got
Operation Gladio season in the United States, right, that'll be
the next part of that. It'll be,
well, let's look at your health records or mental health records, right,

(35:13):
because that's an inference you can
make from it, and they can make it based on,
you know, the ed tech or the
med tech. And then, you know, the result of that, right,
would be a pre-crime thing.
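
The tokenized insurance incentive being speculated about here comes down to simple arithmetic: wearable metrics earn points, and points buy down the premium. A minimal sketch, with every metric, point value, and discount cap invented purely for illustration:

```python
# Sketch of a hypothetical points-for-premiums scheme built on wearables.
# Every metric, point value, and cap below is invented for illustration.

def earn_points(daily_steps: int, sleep_hours: float, workouts: int) -> int:
    points = 0
    points += min(daily_steps // 1000, 10)        # capped step credit
    points += 5 if 7.0 <= sleep_hours <= 9.0 else 0
    points += 3 * workouts
    return points

def adjusted_premium(base_premium: float, points: int) -> float:
    """Carrot instead of stick: each point buys a small discount.
    The flip side: sharing less data means a higher effective rate."""
    discount = min(points * 0.005, 0.25)          # capped at 25% off
    return base_premium * (1 - discount)

# Three days of wearable data feed straight into the monthly bill.
week_points = sum(earn_points(s, h, w) for s, h, w in [
    (9500, 7.5, 1), (12000, 6.0, 0), (8000, 8.0, 1),
])
print(adjusted_premium(400.0, week_points))       # 314.0
```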

Speaker 2 (35:26):
Things have been moving extremely fast, especially when you look
at the development of AI. Lots has changed since just
the last time we spoke. There have been some concerning
stories come out about people's relationships with AI, some of
the interactions and some of the responses that are coming

(35:50):
out of these AI programs: psychosis, now these new mental
disorders from interacting with AI. It's gotten pretty crazy. But
i'd love to get your thoughts on where we're at
with that and how advanced we're really looking with the
AI that we're interacting with versus what maybe they have

(36:14):
in military industrial systems or what we're not aware of.

Speaker 3 (36:22):
Yeah. Well, so as far as the interpersonal... interpersonal, that's
not the right word to use, because that would suggest
that the AI is a person. But the human effects of
human-AI interaction, that's the best way
to say that. That bot that I mentioned, so it's called ANNA,
and by the way, I think the reason why it's

(36:43):
called ANNA is because it's short for analytics. So it's
like, that's its real purpose. But if
you go on the company website for Upswing,
they have a demo for its bot. I
believe it's only text-based; I don't think it
actually has a verbal interface, so you have to type and
then it types back. They showcase what's supposed to be

(37:04):
a student interacting with it, and, you know,
it's hyper-affirmative about everything. It's like, to me,
I mean, the way it's talking, now maybe
if it's a younger student, that might be appropriate,
but I don't necessarily think that's appropriate
for a college-level student. It's like, yo, it's okay...
like, literally, you can't hear it, but from the phrasing

(37:27):
you could almost hear the intonation, well, by
the end of it. And this is
their idea of a
promotional. If you want an insight into, like,
being, you know, not attached, or dissociated from, like, normal people:
by the end of it, the student, I

(37:48):
don't remember if it says I love you explicitly, but
it does heart emojis. It's like, thank you so much,
you care so much about me. So the
student is having an emotional bond with the bot, which
is supposed to be representing the authority figure, so
to speak, or, you know, the teacher or

(38:10):
the tutor. So this is another area where, again, right,
as a teacher, I can't ask students about
their health records. As a teacher, I can't and should
not be bonding like that with my students. Right,
there shouldn't be any emotional attachment like that,
certainly not any, right, heart emojis. So these

(38:31):
lines that teachers should never be crossing, it's totally okay
for the bot to do that. In the meantime, you're
saying that this thing is designed to help
the student with mental health. But what do you think
happens when a human being becomes
emotionally attached to a bot? Like, you don't think that
causes mental problems, especially in, you know, developing populations, right,

(38:55):
you know, whether it's K through twelve or even early college,
right, when a lot of people might not be
fully, let's say, socialized, so to speak, right, socially developed,
meaning, right, their abilities to, like, interact with
people without being uncomfortable or socially awkward, or having social anxiety,
things like that, right. Like, if you haven't

(39:18):
sort of, you know, worked through that yet, but
you got a bot that always affirms everything you say
and always makes you feel good and feels like you
get a lot of what you sort of need...
I mean, human beings, we're social creatures, right, we need
to have some kind of human interaction. But if you
can get sort of the same biochemical effects of that

(39:38):
from a bot, then you're probably gonna spend
less time with human beings. So on top of the
fact that you're bonding with the bot, you're dissociating from
human beings, all under the guise of, you know, helping
and personalizing. And, you know, I don't
have the data to prove this yet,
I don't have the metrics. But I don't

(39:59):
think I need a whole lot, you know, I
don't think I need a whole spreadsheet, to accurately infer
that this is going to have certainly more negative effects
than positives, you know what I mean.

Speaker 2 (40:11):
So there's some that speculate that AI is being utilized
in ways that the general population would never be aware of,
in somewhat nefarious ways, possibly planning high-impact events, things
of this nature. What are your
thoughts about how capable AI is, especially the AI that
we're not allowed to interact with?

Speaker 3 (40:40):
Well, I don't know how much I can speculate on,
like, the power of it. I'm sure that whatever
we have, just like at any point in history, is
not what they have, right, because what we have
needs to be scalable and profitable and all that,
right. So, like, you know,
back in the day when they had the original computers, you

(41:01):
see, like, pictures of Jay W. Forrester, who came up
with system dynamics, sitting in front of, like, an entire
room of, like, magnetic tape reels and stuff, you know,
and, like, our smartphones can do
more than that now. But had that stuff been
pocket-sized back then, it probably would have been in
people's pockets, right. And so, you know, I imagine
that there's systems they have that, right, just, you know,

(41:24):
in terms of scalability and profitability, it's
not feasible to, you know, hand out to
the public. But as far as, like, you know,
let's just say, like, social engineering at
a larger scale: if you hook these systems up

(41:46):
to social media, you can, sort of, in the same
way you can do it with a particular school or
an individual. You can look at things like, well, not just,
you know, broadcast media. Propaganda was a one-way thing,
so the only way you could get a sense of,
like, whether or not your propaganda was working was to

(42:07):
do, like, different polls. And then even that, you know,
how accurate is that, because people are self-reporting, and
you don't know if what they're saying is basically
what they really think. Maybe they're saying something different because
they know it's gonna go on the form, or
they're in front of a camera. But with social media, right,
I think this is why Twitter was designed to be
so short. I mean, it's different now, but remember, you could
speak in only so many characters. What it's trying

(42:29):
to do is elicit immediate responses, right, sort of stimulus-response,
reflexive actions, right. That gets a better sense of the
affective system, meaning the emotions: what
do you have strong feelings about, one way or another,
with, you know, obviously rage and fear being some of
those stronger motivators. So you can see that on an

(42:49):
individual scale, but you can see it at group scale.
You can see it... you can go left wing, right wing,
Republican, Democrat, you can go race, gender, class, sexuality,
and you can go generation. All these
different conflicts, right, are basically the entire ecosystem of
social media. And based on that, you can look at
numbers, and you can go, oh... and then you can go, well,

(43:11):
what are the buzzwords or keywords that
are associated with the values of these groups and the
values of those groups? How can you amplify
that with bots and therefore polarize those contentions,
or how can you attach those values to a candidate
or a movement or a policy, right? And then they

(43:33):
can do this in real time. This is what's known
as action research. This is a concept that basically was
formed by Kurt Lewin. He was one of these Frankfurt
School guys who was also at the Macy cybernetics conferences,
where they laid out basically the theoretical framework for
AI, right, the feedback loops and all that stuff.
So the way action research works, it's basically, you know,

(43:55):
the same way any feedback loop works, except
in a system dynamics, right. And so you do something,
an action, right. So maybe you, you know,
you have some big account, put out a tweet, and
then you see how people react, and then, if they
react the way you want them to, then, right,

(44:16):
you reinforce that, right. And so there's
two steps to it: it's action, something, something, and then back
to action again. And so, you know, with social media,
I think you can do that very much in real time.
And if you look at something like, you know,
the RAND Corporation developed something called scenario planning. It was
Herman Kahn, I can't remember the exact decade, somewhere between

(44:38):
the fifties and sixties, I want to say. But
basically, what it does is it takes your system dynamics.
System dynamics started with, like, supply
chain dynamics, right, and so it's looking at everything, from,
like, extraction of the raw materials, to manufacturing, to putting
it into the stores, distributing, all the way to the consumer,
right, and looking at how all those pieces work in

(45:00):
a system, right. They've done this with, like, Limits to Growth;
they did this with environmental dynamics in the context
of population dynamics, but you could do this in terms
of sociopolitical dynamics. So the fact is, right,
they have systems for doing this. One of the most
famous examples, well, one of the most well-known

(45:22):
examples, would be, you and your audience probably remember
the scenario, uh, the Lockstep document, that's sort of what
it was called colloquially, but the title was something like
Scenarios for the Future of Technology and International Development,
and it's a scenario planning document. If you
open it up, you'll see that all of
them usually go on a quadrant

(45:45):
like this. So you have, like, an ideal...
like, the most ideal outcome, the least ideal outcome,
then you got these two that are, like, kind of intermediates, right,
and so what you do is, right, you want your
systems to evolve in a particular direction on that quadrant.
And then what you do is through the social media
and the AI analytics and the action research, right, you

(46:06):
put out stimuli and you see what the response is
and you see are we getting closer to our ideal
scenario or are we moving towards one of these other scenarios.
Now, back to Rockefeller: that scenario planning document was
done in twenty ten. In twenty twenty, we basically ran
the script of one of those quadrants. So we have
empirical evidence that yes, they use scenario planning to manage

(46:30):
systems dynamics, and they can plan things ahead in as
much as ten years. Okay, So if you know that
they've laid out the feedback loop schematics at Macy cybernetics
conferences in the early forties, and you know that the rise of
venture capital comes shortly thereafter, and you realize, what's the
venture capital? Well, a lot of it is coming out
of, basically, you know, like, In-Q-Tel, CIA. But it's monies

(46:53):
that get thrown at startups, and basically, again, these
are companies that for the most part are not readily
scalable or readily distributable to the broader public, right? So
you've got to prop them up with monies that you
may or may not get a
return on, which is why a lot of that money

(47:14):
basically comes from the CIA. But the idea is they've had
seventy years knowing Moore's law, knowing, right,
what is the exponential rate at which the transistors become
more efficient. So, like, basically,
they knew how many decades it would take;
the theory was already laid out.

(47:36):
It was just a matter of when the
hardware is going to get to the point where we can
not only process all the data but, you know,
track and trace it as well. So you've got all
those things. They've got a clear timeline, they've got methods
for planning across those timelines, they've got methods for managing
large systems across those timelines. And then they've got AI

(47:56):
systems to track their progress towards the outcomes. And so
I think that when you see something like, you know,
nine-eleven and the timing of it, I mean,
that created the sort of flash point where
we had crises that sort of hurried people along, hurried

(48:17):
people along and gave them reasons to adopt some of
these dystopic technologies that, right, if they weren't scared,
again, right, being motivated by rage and fear, they wouldn't
have adopted. And so, you know, in the meantime,
all the rest, other than sort of laying out the scenario
planning and the action research, is largely just keeping people

(48:38):
at each other's throats and otherwise distracted through the culture wars.
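
The action-research loop described above, put out a stimulus, observe the reaction by group, and reinforce whatever moves the numbers, can be sketched in a few lines. The group names, the stubbed "engagement" numbers, and the reinforcement rule are all invented for illustration:

```python
# Bare-bones sketch of an action-research feedback loop on social media:
# act, observe per-group response, reinforce, act again.
# Groups, metrics, and the rule are invented purely for illustration.

import random

GROUPS = ["group_a", "group_b"]

def act(message: str) -> dict:
    """Action step: publish a stimulus and observe engagement per group
    (stubbed here with random numbers in place of real analytics)."""
    return {g: random.uniform(0, 1) for g in GROUPS}

def action_research(messages: list, target_group: str, rounds: int = 20):
    """Feedback loop: keep re-weighting toward whichever message
    most engages the target group, then act on the winner again."""
    weights = {m: 1.0 for m in messages}
    for _ in range(rounds):
        message = max(weights, key=weights.get)           # choose action
        response = act(message)                           # observe
        weights[message] += response[target_group] - 0.5  # reinforce/decay
    return max(weights, key=weights.get)

print(action_research(["outrage framing", "fear framing"], "group_a"))
```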

Speaker 2 (48:42):
In my perspective, there are two essential realities that a person
can focus on. One is online, what they're seeing in
social media and on the TV, and one is the
real world, interacting with real people. That's the one I'm interested in.
The one online is fueled by theatrical propaganda, psychological operations,

(49:03):
and a whole realm of fantasies. But I really enjoy
getting perspectives on what's happening in real people's lives, what
people are noticing about the collective reaction to things, interactions
on a real-world basis, physically talking with someone one
on one, engaging where people are at with everything that's unfolding.

(49:29):
You're a teacher, you interact with young people. If you
listen to what the media is telling us, they'll have
you believe that everyone is just becoming insane and flipping out.
They have so many different perspectives of where we're at
with the mental health of the collective. And I look
around and I still see a somewhat normal world when

(49:50):
I go out and talk and interact with people. But
I'd love to get your perspective on where we're at
in reality versus what they're trying to feed us.

Speaker 3 (50:01):
Yeah, I'll say that. You know, after lockdowns, I finally
started going back into in-person tutoring, largely
because of the bot. So it was like,
you know, the main thing we, I guess,
have to offer, because we still do
human-to-human virtual, but, I guess, you know, with the
administration that contracted with Upswing, I guess at

(50:24):
least what we have now is, what the bot cannot
do is be a physical human being in physical space.
But for a lot, you know, for the last
five years, largely because of lockdowns and the mandatory health
treatments, you know, I stayed online even after they said
it was okay to come back, because I didn't know,
you know, what they were gonna do. I didn't know
if they were going to roll everything back, and then

(50:46):
I was gonna have to play musical chairs between my
house and the school and not know, you know... it
was just, I'd rather just stick to one medium; that
way, whatever they do, I don't have to make adjustments.
But, you know, being back... no, I
don't sense any... you know, it's been

(51:08):
five years. I don't know that you would notice a
huge difference, although, you know, that technically is getting
close to almost a new generation. But no,
I don't sense anything different as far as
the average student, you know what I mean. And,
you know, the
other sense you get, it's like, all these kids
are, you know, woke and all this. That's

(51:29):
not the case, you know what I mean. And
even, you know, the students that do
come in and identify differently, we'll just say,
it's not really a thing. You know,
you just treat them like a human being, and
it's not a thing, you know what I mean. The
interactions are just, you know, the same as
they've been. So yeah, I'll say that. I'll say

(51:54):
also, though, that, you know, to your point, I remember
when I first visited Charlotte Iserbyt in twenty twenty,
so just, like, at the peak of lockdowns. And
when I drove there, it was, like, you know, ghost
town after ghost town. But, you know, I was watching
the news, and it's like, riots here and riots

(52:14):
there, and CHAZ. And, like, seriously, I didn't know.
I never really traveled a lot, I mean,
never really traveled at all, you know. I
had family in Missouri that we'd go see every once
in a while. That's about the furthest I'd been. So
you know, I didn't know what kind of detours I
was gonna have to take through towns. And I was
nervous thinking, like, oh, when I come
into, you know, I go through New York, what if

(52:35):
I got to go through the city, and am I
gonna run into something? Dude, when I drove across the country, it
was like, you know, if you didn't turn your TV on,
you'd have
no indication that everything was so chaotic. You'd
have no indication, you know. And one other thing I'll say,
you know, it's just, this is more about faculty, which

(52:56):
is, I don't know, hopefully this isn't
just a blip on the radar. But I've noticed, I
walked past an office, I heard some
people complaining about AI a little bit. I've seen a
lot of people embracing it, like, oh, we gotta
find a way to make it work for us,
which, you know, in some ways, I mean,
I'm of the opposite mind. I'd

(53:18):
like to just abstain from it. But I'm not...
I've said to a lot of people, like, you know,
I just said it today: we're getting close to
the end of the book. I don't know if you
can put that toothpaste back in the bag. So I'm
not gonna say it's a bad idea to, well, let's
at least try to shape it so that it reflects,
you know, our values. Maybe, you know,

(53:41):
if there's no chance that it's going away,
I can't say that that's a bad idea. But
at least hearing people... you know, I've tried to warn,
I've been talking about this for most of ten years now,
you know, and now that
it's on their doorstep, now people
are, like, freaking out, you know. And it's

(54:01):
kind of like, dude... you know, to me, it's
the equivalent of, you know, I'm a martial artist,
somebody that doesn't put in the hours in the gym,
and you think you're gonna, like, on game day, you're
gonna show up in there and perform? It's not gonna happen,
you know what I mean. I'm not saying... yeah,
I mean, I guess you got to do what you
can do. But I will say, yeah, people have

(54:24):
been skeptical, and I won't say any more than that. There are a few people who, if I say too much more, I wouldn't want to put out there publicly like that. But there are people I've talked to who I can see are doing things to at least try to hold the line in ways that I hadn't seen before as far

(54:46):
as faculty and colleagues go. So that's a positive thing. And, I don't know, maybe; we'll see where that goes.

Speaker 2 (55:01):
Well, you've mentioned a few times that we're almost at the end of your book, in the sense that everything unfolding now is what you warned about a while back. How should we navigate this, and are there any other important aspects we haven't touched on that you'd like to leave the audience with?

Speaker 3 (55:24):
Well, there's one point I think is worth making, and then we can talk about solutions. This is a good way to put bookends on the whole social credit and fintech picture, all the pieces of this puzzle, and the way the AI and the impact investing I mentioned work together. What we're moving towards is essentially the securitization of human beings. Right?

(55:48):
I'm going to explain this the way I always do, which, if you've heard me explain it before, is going to be the long version of the story. But I like to tell it this way because when I do, I can see from the look on people's faces that it lands in ways it hadn't when I articulated it differently. So, you know, I'm not a fintech guy;

(56:10):
money is not my focus. So to make sure I was understanding a lot of the fintech and financial jargon accurately for this article, I contacted Mark Goodwin, the former editor of Bitcoin Magazine. Everybody should check out his series on the chain; I can't remember the exact titles, but they all have something

(56:33):
like "the chain" in them. It's a historical analysis of the blockchain and crypto industry and how it's moving towards this stablecoin economy, among other features. So when I talked to him, we were talking about whether stablecoins are going to be classified as either a commodity or a

(56:57):
security. And me being not super familiar, I'm like, so clarify for me: what's the difference between a commodity and a security? What makes a security a security? And he says the best explanation is what they call the Howey test, the Howey orchards test. Back in the forties, I believe, this guy Howey, he

(57:19):
had these orchards, and instead of selling land, he was selling futures, securities, on the fruit that would grow and the profits you would make from the fruit grown on it. So a security is something whose future expected profits are not baked into the asset itself. The land obviously can appreciate on its own,

(57:39):
but that's not a security; that's just normal appreciation. The security is added value in some capacity. So the way they're going to securitize human beings is to extrapolate, monetize, quantify the financial gains associated with a student's lifetime education and workforce outcomes. Okay.
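To make that distinction concrete: the test being described is the Howey test, from SEC v. W. J. Howey Co. (1946), which asks four questions of any offering. Here is a minimal sketch encoding those four prongs; the prongs are the real legal criteria, but the data structure and example values are purely illustrative.

```python
# A minimal sketch of the four-prong Howey test (SEC v. W. J. Howey Co., 1946).
# The prongs are the real legal criteria; the class and examples are illustrative.

from dataclasses import dataclass

@dataclass
class Offering:
    investment_of_money: bool      # buyers put up capital
    common_enterprise: bool        # pooled fortunes / shared promoter
    expectation_of_profit: bool    # buyers expect gains, not use-value
    from_efforts_of_others: bool   # profits come from the promoter's work

def is_security(o: Offering) -> bool:
    """An offering is an 'investment contract' (a security) only if all four prongs hold."""
    return all([o.investment_of_money, o.common_enterprise,
                o.expectation_of_profit, o.from_efforts_of_others])

# Howey's orchard deal: buyers funded plots and were paid from fruit profits
# produced by Howey's service company, so all four prongs are satisfied.
orchard_futures = Offering(True, True, True, True)

# Plain land appreciating on its own: no promised profit from anyone's efforts.
raw_land = Offering(investment_of_money=True, common_enterprise=False,
                    expectation_of_profit=False, from_efforts_of_others=False)

print(is_security(orchard_futures))  # True  -> a security
print(is_security(raw_land))         # False -> ordinary asset appreciation
```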

(58:02):
So basically what they do is this, and if you look at think tanks like the American Enterprise Institute, they quantify it like this. So did Kamala Harris when she was putting parents in jail for truancy. The rationale was: if students are truant, eventually those students end up in jail. So instead of them ending up in jail, let's just put mom in jail, and then we'll get in front of it,

(58:24):
and maybe that student won't go to jail. Well, you can monetize that, or quantify it in terms of dollars spent, because it costs money to run prisons. And other factors could be: students that don't do well in school develop mental health issues, or they need to be retrained for a job. All of these put a drain on public monies. And then conversely, if a student does

(58:47):
really well in their career path, then they're going to put money into the account. So what they do is basically securitize those future expected earnings or savings based on the student's learning outcomes: securitize that, put it in a bond, attach the bond to the student, and then you have these impact investment firms basically sponsor,

(59:11):
in other words, gamble on the student outcomes and then get a return on that bet. For the impact investment firms, impact investing is basically putting money up front to help achieve certain social outcomes. It could fall on the ESG scale, environmental, social, governance; it could be workforce, healthcare, or education outcomes; it could be for particular populations, marginalized communities. And in

(59:35):
education they're called pay-for-success deals. So you put money up front to achieve these educational outcomes. If those outcomes are achieved, to the extent that they are, the firm can earn money back from the government, and if they exceed the outcomes, they can even earn a profit.
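A minimal sketch of the pay-for-success mechanics just described: projected public savings get securitized into a bond, an investor funds the intervention up front, outcomes are measured against contracted targets, and the government repays, with a premium for exceeding targets. The shape follows the interview; every number, name, and formula here is hypothetical, invented for illustration.

```python
# Sketch of a pay-for-success contract under hypothetical terms; all numbers made up.

def quantify_savings(avoided_prison_costs: float,
                     avoided_retraining_costs: float,
                     added_tax_revenue: float) -> float:
    """Projected public savings/earnings attributed to a student's outcomes,
    the 'future expected profits' that get securitized into the bond."""
    return avoided_prison_costs + avoided_retraining_costs + added_tax_revenue

def government_repayment(upfront: float, outcomes_met: float,
                         target: float = 1.0, premium_rate: float = 0.10) -> float:
    """What the government owes the impact investor once outcomes are measured.
    outcomes_met is the fraction of contracted targets achieved (1.2 = 120%)."""
    if outcomes_met < target:
        # Miss the targets and the investor eats a proportional loss.
        return upfront * (outcomes_met / target)
    # Hit the targets: principal back, plus a premium for exceeding them.
    return upfront + upfront * premium_rate * (outcomes_met - target)

bond = quantify_savings(35_000, 10_000, 55_000)      # 100,000 "attached" to the student
print(government_repayment(bond, outcomes_met=0.8))  # 80000.0  -> investor takes a loss
print(government_repayment(bond, outcomes_met=1.0))  # 100000.0 -> principal returned
print(government_repayment(bond, outcomes_met=1.5))  # 105000.0 -> principal plus profit
```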

(59:56):
And that company I mentioned, Upsling: I looked it up, and it's funded by multiple firms, not just venture capital firms but impact investment firms, one of them being Social Finance, which I wrote about in my first piece on Trump 2025. I also wrote about, and I think we talked about it on your show last time, UNESCO Study Eleven; I think I showcased some of the images from that. And so, on Social Finance,

(01:00:20):
I eventually did a video kind of breaking that down, and I showed that Social Finance partners with FutureLearn. FutureLearn is part of the UK's Open University; it's an online open university. Back in the day, it was audiovisual:

(01:00:42):
they didn't have internet, so they'd send you books and tapes and DVDs and stuff. So Social Finance is a social impact investment company. So for Anna at the school, and this is my guess as to the reason why they want it: it's not just that it's cheaper than paying hourly teachers, it's that

(01:01:03):
it will enable them to gather the data to get the impact investments. Right? That's why it's billed as analytics: it boasts about all the different metrics it shows you, and it tells you it's not just for the individual, it's for the institution. So the institution saves money, and then they get this data. You need the AI to get that data. You're not going to do it with spreadsheets, just

(01:01:25):
human beings crunching numbers. To scale all that data, at the individual level, at the group level, from the fintech inputs on down, you need AI making all those calculations in real time. So you need the AI to get the data. You need the data to get the impact investment. The impact investment firm needs the data to get the money

(01:01:45):
back from the government. So what you're seeing is that all these pieces of the puzzle are part of a new supply chain ecosystem. And that's essentially what we're looking at in terms of the endgame with the fintech.
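The dependency chain being drawn here, AI produces the data, the data unlocks the investment, and the same data triggers the government repayment, can be summarized in a few lines. This is only a schematic sketch; the function names, metrics, and contract terms are all hypothetical.

```python
# Schematic of the data supply chain described above; all names and numbers
# are hypothetical. AI-scale analytics -> outcome metrics -> investment ->
# government repayment, with the same metrics feeding every link.

def ai_analytics(records: list[dict]) -> dict:
    """Real-time aggregation across a cohort, the step described as beyond
    spreadsheets and human number-crunchers at scale."""
    n = len(records)
    return {"completion_rate": sum(r["completed"] for r in records) / n,
            "employment_rate": sum(r["employed"] for r in records) / n}

def impact_investment_unlocked(metrics: dict, threshold: float = 0.7) -> bool:
    """No measurable outcomes, no deal: the metrics are what attract funding."""
    return metrics["completion_rate"] >= threshold

def government_repayment(metrics: dict, funded: bool,
                         contract: float = 1_000_000) -> float:
    """The same metrics serve as the evidence that triggers repayment."""
    return contract * metrics["employment_rate"] if funded else 0.0

cohort = [{"completed": True,  "employed": True},
          {"completed": True,  "employed": True},
          {"completed": True,  "employed": False},
          {"completed": False, "employed": False}]
m = ai_analytics(cohort)                # {'completion_rate': 0.75, 'employment_rate': 0.5}
funded = impact_investment_unlocked(m)  # True
print(government_repayment(m, funded))  # 500000.0
```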

Speaker 2 (01:02:04):
So I was just going to say: how do we navigate this, and what solutions could we offer?

Speaker 3 (01:02:10):
Yeah, my thing is this. Like you say, we're way down the pipe here. But I think this is a simple way to put it: if you approach the AI and the fintech and the blockchain and all this stuff, and this doesn't just have to be in education, it can just be

(01:02:30):
in your everyday life. If you approach that stuff, you treat it like, well, especially for your audience, people who listen to alternative media, who have alternative understandings of the propaganda we live in: you probably didn't get your

(01:02:52):
mandatory treatment. And if you didn't, that was a challenging thing to do. Obviously it wasn't the easier thing to do in a lot of ways, especially if you got fired, especially if you lost friends and family, all these things. But you did it because it was the right thing, and it was the better thing. So,

(01:03:13):
if you treat the voucher monies, the ESAs (education savings accounts), the scholarship monies, if you treat the blockchain, the digital wallets, if you treat the AI like that needle, just think about it like that needle: whatever you're going to get from it, however it might mean a little less friction in your daily life, you didn't do

(01:03:35):
it for that. So you know this is going to have the same repercussions. And by the way, there were obviously the biological consequences of what happens if you take that. But part of what a lot of people were also worried about was the health passport, that that thing was going to be the first part of this ledger we're talking about, and that they were going to

(01:03:57):
add to it all this other stuff, and then that was going to lead to your biopsychosocial digital ID, your social credit technocracy. So that's why this is going to come with similar dangers, and it will take us to the same destination. So you can work out the particulars of how to hold

(01:04:20):
that line on your own. I don't think I need to itemize all that. But I think if you orient yourself in that way, you will find the spiritual fortitude to do the thing you need to do.

Speaker 2 (01:04:32):
Right on, John, this was fantastic information. Thank you so
much again. Before I let you go, remind the audience
how they can find out more about you.

Speaker 3 (01:04:41):
SchoolWorldOrder.info is my website; that's got links to all the social media. I'm talis professor on X. The book you can find at TrineDay.com; there's a link to that on the homepage of my website. Otherwise, most recent stuff goes up on Unlimited Hangout.

Speaker 2 (01:05:00):
Excellent. I'll have those links easily accessible for everyone. John, we'll definitely have to do this again sooner than last time. And until next time, everyone, have an excellent evening. We will talk again tomorrow.

Speaker 3 (01:05:13):
We'll see ya, then.