Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
You're here because you know something. What you know
you can't explain, but you feel it. You felt it
your entire life. Do you know what I'm talking about?
Speaker 2 (00:14):
The matrix?
Speaker 3 (00:28):
I had dreams that weren't just dreams.
Speaker 2 (00:35):
We accept the reality of the world with which we're presented.
It's as simple as that. Billions of people just living
out their lives, oblivious. Do
you believe their world?
Speaker 1 (00:52):
You can deny all the things I've seen, all the
things I've discovered, but not for much longer, because too many
others know what's happening out there, and no one, no
government agency, has jurisdiction over the truth.
Speaker 3 (01:08):
Hello, and welcome to Beyond the Paradigm. I'm your host,
Paul Breckel. Well, for those of you that listened to
the episode last week, I said that I would
have a guest on this week. Unfortunately, due to a
mix-up over time zones yet again, it actually turns
out it is my
Speaker 2 (01:25):
Fault because I live over here in the UK.
Speaker 3 (01:29):
Many of my guests are over in the United States,
or they're in other countries, let's put it that way.
And I always tell my guests that I'm on GMT,
so Greenwich Mean Time, which is true, but only for part
of the year, because I always forget that we're now
on British Summer Time, so obviously the clocks have moved one hour.
(01:51):
So when I tell my guests eight pm GMT, they
then go and convert that to their time, for example,
and it turns out it's actually an hour off.
Speaker 2 (02:04):
Because of what I've told them.
Speaker 3 (02:06):
So unfortunately, I wasn't able to record the interview due
to this error on my part. So that will be
an interview coming in the future, and
hopefully soon, and I do have that
guest and another guest lined up to reschedule for two interviews.
Speaker 2 (02:23):
That are going to be fascinating.
Speaker 3 (02:26):
So just for those of you that are new to
this podcast, if this is the first time you've listened
to this podcast, then you can definitely expect to have
your worldview challenged. I've had my worldview challenged when interviewing
people and also doing research for these monologues, and this
is going to be a monologue today which is relevant
(02:46):
for every single.
Speaker 2 (02:47):
Person on Earth.
Speaker 3 (02:49):
And I also want to apologize if the sound on
this episode is not as good quality as some of
the other episodes, and the reason for that is, well,
I don't really understand what's going on. My microphone seems
to have become ultra-sensitive, and I've been doing some
sound tests, even basically whispering, and the
Speaker 2 (03:12):
Volume has been off the chart.
Speaker 3 (03:14):
So during the editing process, I've had to obviously reduce
the noise levels and maybe the quality might not be
as good as usual. But I hope that this doesn't
spoil your listening to this episode or put you off.
I just hope it's of good enough quality so you
can listen and sort of get all the information without
(03:35):
being too distracted with the poorer quality, let's say, of
the audio. So it looks like I'm going to have
to invest in a new microphone. And this is what
I say about podcasts, and I do mention it regarding support.
It costs money to run a podcast. There are monthly costs,
but then you have things like this happen where a
(03:55):
piece of equipment goes faulty and you have to buy
a new one. So I am very thankful to all
the members of my Patreon who do support me with
a monthly amount.
Speaker 2 (04:07):
Which is a very minuscule amount.
Speaker 3 (04:10):
It's actually in US dollars one dollar, and in the
UK that's less than a pound. It's probably like eighty pence.
It's nothing. You can't even go to a cafe and
buy a cup of tea or coffee for that price.
So I'm very thankful to all of you that do
support me that way, And you can also go and
(04:31):
buy me a coffee, and people do that and they
make one off donations, and there's been some very generous
one off donations by some extraordinarily kind people. So I
thank you for everyone that has done that. But the
number one way, and I always say this, to support
the show is follow the show.
Speaker 2 (04:50):
And then leave a rating.
Speaker 3 (04:52):
And if you do think that I have earned a
five star rating, please leave the five star rating. I
look all the time to see how many people follow the
show and how many people have rated the show, and
the amount of people that follow the show does not
correlate to the amount of ratings. It's nowhere near. So
if you do follow the show and haven't left a rating,
(05:14):
please do that, as it helps with the algorithms and
makes this show more visible.
Speaker 2 (05:20):
Also, you can follow me on Instagram. I am active
on Instagram.
Speaker 3 (05:24):
Post quite a few short videos, and they're linked to
Facebook, so the videos are then published on Facebook automatically.
So if you want to follow me on Facebook, you
can and you'll see the.
Speaker 2 (05:33):
Videos as well. And you can also.
Speaker 3 (05:37):
Follow me on Twitter, but Twitter seems to completely shadow
or ban me. Don't seem to get much engagement on there,
and never really have. There are big accounts on there,
but they're probably funded and the algorithms are pushing them out.
Speaker 2 (05:52):
And what I've.
Speaker 3 (05:53):
Noticed recently is since doing some of the episodes with
Johnny Serucci regarding the Vatican and the Roman Catholic Church,
nearly every single thing in my newsfeed now is pro-Roman
Catholic Twitter accounts attacking Protestantism and promoting Roman Catholicism,
and these are the types of things that happen.
Speaker 2 (06:13):
They target you that.
Speaker 3 (06:14):
Way because I've obviously posted on Twitter regarding these episodes,
so there are ways you can sort of connect with
me and support me. But now we're going to get
into this episode, and this is relevant and up to date,
and this is relevant for everybody on Earth. And obviously
(06:36):
you've seen the title of the episode, the Digital Prison,
So let's get into this now. So I want to
look at some of the sort of foundations first of all,
of financial surveillance.
Speaker 2 (06:49):
So we're going to be moving, aren't we.
Speaker 3 (06:50):
Into a cashless society. There's this transition happening, and it
is going to happen, because cash transactions are becoming less
and less. And it's estimated that by twenty twenty eight,
ninety nine percent of all transactions are expected to be digital.
(07:11):
Ninety nine percent. That's the expectation. We're not far away
from that. And digital transactions create metadata trails, building
behavioral profiles accessible to corporations and governments. And I've seen
a video this morning, so I'm recording this episode and
(07:33):
looking for some information regarding this cashless transition. And anyway,
there was this video and it was on Twitter and
this guy had filmed it and he's gone into an
Aldi in Greenwich, which is near London, and he was
a tourist. He wanted to buy some food and he
(07:54):
walks into the shop. So Aldi is a smaller
type of supermarket, not as big as, you know,
your Walmarts in America, and we have the likes of Tesco
and Asda over here. If you live in the UK,
you know what Aldi is: a sort of cheaper
place to shop. So he's gone in, and what he's
(08:17):
noticed straight away is there's these gates, like when you
go into a train station, for example, and you have
these gates where you've got to scan your ticket or
put your ticket in and then the barriers open up.
There were these gates, these barriers similar to that, and
all on the walls there was information regarding downloading this
(08:39):
particular app. And you've got to scan a QR code to download
this app, and then you scan your QR code from
the app to gain access to this Aldi. It was
completely cashless. You couldn't purchase anything with cash in that store.
And this gentleman obviously was showing that is this the start,
(09:01):
you know, of this digital prison being sort of constructed
around us, and this is happening more and more, and
these obviously these things are there to sort of get
us used to moving over to a complete cashless society,
because you couldn't just do it overnight, could you. So
these things are in place now, so you couldn't go
(09:22):
in this Aldi and buy anything with cash. You
couldn't even access the place unless you had this app. Now,
one of the things that I've talked about previously
is central bank digital currencies, and apparently the aim of
CBDCs is to modernize money for faster payments, lower fees,
(09:44):
financial inclusion. But they also enable real-time surveillance. So
the Federal Reserve Chair Jerome Powell noted that they
could require all payment data to be tracked. The Cato Institute warns
CBDCs threaten privacy buffers: third-party banks become obsolete, giving
(10:06):
governments direct ledger access. The IMF and DHS studies echo
these concerns: CBDCs centralize personal transaction records, risk hacking, and
allow programmable behavior. That's the key thing: programmable. Now, the
World Economic Forum notes cryptographic methods such as zero knowledge
(10:31):
proofs could preserve privacy, but only if policymakers enforce them early.
Speaker 2 (10:36):
Now, for those of you that don't know what zero.
Speaker 3 (10:38):
Knowledge proofs are, they're a cryptographic method that allows one party,
the prover, to prove to another party, the verifier,
that they know a certain piece of information without revealing
the information itself. So a simple analogy is a person claims
(11:02):
they know the password to a locked door, and instead
of telling you the password, they go through the door
and come out the other side, proving that they know
it without ever saying it aloud. And that's the essence
of a zero-knowledge proof. And how it works is
zero-knowledge proofs must meet three criteria. Completeness: if the
(11:24):
statement is true, an honest verifier will be convinced by
an honest prover. Soundness: if the statement is false, no
cheating prover can convince the verifier otherwise. Zero-knowledge: the
verifier learns nothing about how the prover knows the information,
only that it's true. And in real-world applications, it's
(11:47):
privacy-preserving. Cryptocurrencies, Zcash for example, use zero-knowledge
proofs to allow anonymous transactions that are still publicly verifiable.
You can prove you own coins and made a transaction
without revealing amounts, addresses, or balances. And then you've got
authentication without passwords: prove you know the password or biometric
(12:12):
key without actually sending it, protecting against leaks or interception.
Secure voting: prove your vote was counted correctly without revealing who
you voted for. And confidential blockchain smart contracts: execute contract
logic and prove outcomes without revealing sensitive business data. And
(12:32):
obviously this is something that's going to be needed. And
the point is, are they even going to enforce these
types of things early on, like zero knowledge proofs?
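For listeners who want to see the mechanics, here is a minimal Python sketch of a Schnorr-style zero-knowledge proof of knowledge, a standard textbook construction rather than anything from the episode, made non-interactive with the Fiat-Shamir heuristic. The parameters and code are demo-sized assumptions, not production cryptography:

```python
import hashlib
import secrets

# Demo parameters: a real prime, but treat this as a toy, not production crypto.
P = 2**255 - 19   # prime modulus
G = 2             # generator element of Z_P*

def prove(secret_x, public_y):
    """Prover: show knowledge of secret_x (where public_y = G**secret_x mod P)
    without revealing secret_x. Fiat-Shamir makes the challenge a hash."""
    r = secrets.randbelow(P - 1)                      # random nonce
    t = pow(G, r, P)                                  # commitment
    c = int.from_bytes(hashlib.sha256(f"{t}:{public_y}".encode()).digest(), "big")
    s = (r + c * secret_x) % (P - 1)                  # response; x stays hidden
    return t, s

def verify(public_y, t, s):
    """Verifier: check G**s == t * y**c (mod P), learning nothing about x."""
    c = int.from_bytes(hashlib.sha256(f"{t}:{public_y}".encode()).digest(), "big")
    return pow(G, s, P) == (t * pow(public_y, c, P)) % P

x = secrets.randbelow(P - 1)      # the secret: "the password behind the door"
y = pow(G, x, P)                  # the public statement
t, s = prove(x, y)
print(verify(y, t, s))        # True: completeness
print(verify(y, t, s + 1))    # False: a forged response fails (soundness)
```

The verifier only ever sees the commitment and the response, which matches the locked-door analogy: you watch the prover come out the other side without ever hearing the password.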
Speaker 2 (12:43):
Probably not, But why does it even matter.
Speaker 3 (12:46):
Well, in a world of increasing surveillance, data mining, and
identity theft, these zero-knowledge proofs enable verification without exposure,
and it's a powerful tool for privacy, security and trustless systems.
But obviously our privacy is not on the agenda, is it.
(13:09):
And part of this sort of foundation of this financial
control that's going to come in is this digital wallet.
And the European Union, for example, has proposed a digital wallet,
and the European Commission has advanced plans for a unified digital
(13:30):
ID wallet and basically it would store your ID, so
driver's licenses, bank information, medical data. But critics obviously see
it as a stepping stone to total access control and
that's exactly what I see it as. So money is
no longer anonymous. A single mistake and the ledger knows
and can lock you out. And other things that have
(13:53):
formed a foundation leading up to bringing us into a
digital prison are things like the Patriot Act in the United States,
which was obviously enacted after the September the eleventh attacks.
It was passed six weeks after nine
Speaker 2 (14:10):
To eleven, and overnight.
Speaker 3 (14:12):
It gave the US government sort of power to be
able to vacuum up phone logs, emails, library records without
a traditional warrant. Well, what about the social graph that
was awakened? Officially, Mark Zuckerberg launched a dorm-room novelty
(14:35):
called the Facebook, and within a decade it morphed into
a behavioral crystal ball, tracking things like scroll speed, even dwell time
on a photo. So now stay with me on this.
So we're talking about Facebook, but let's.
Speaker 2 (14:51):
Link in DARPA.
Speaker 3 (14:54):
If you live in the United States,
you will know what DARPA is, but if you don't live in the
United States, DARPA is the Defense Advanced Research Projects Agency,
and it is an agency of the United States Department
of Defense responsible for developing emerging technologies for military use.
DARPA funds and manages experimental research programs across multiple fields,
(15:17):
including artificial intelligence, autonomous weapons and robots, cybersecurity, advanced aerospace
for example, hypersonics and stealth tech, biotechnology and bioengineering, brain
computer interfaces for example, neural engineering, system design and next
gen communications, and quantum tech. But they also had a
(15:39):
project called Lifelog, and it was the government's ultimate diary project.
And Lifelog's purpose was basically to track and record every
aspect of a person's life in real time: emails,
phone calls, websites visited, books and media consumed, physical location
(16:03):
via GPS. Is it starting to sound familiar? Biometric data,
photos and video of daily activities, interactions with people, social
network mapping basically, but Lifelog was officially canceled on the
fourth of February two thousand and four. Its ultimate aim
(16:28):
was to create this comprehensive digital record of a person's
entire life, essentially the ultimate digital diary. So it was
canceled officially on the fourth of February two thousand and four. When
was the Facebook launched? The fourth of February two thousand
and four. Do we think it's a coincidence? Well, let's
(16:54):
look a little.
Speaker 2 (16:54):
Bit more at lifelog.
Speaker 3 (16:56):
So it was basically DARPA's attempt to create this total
digital diary, and it would ingest everything you did: every click, call, purchase,
interaction and movement. All this data would be stored, categorized
and used to improve AI understanding of human behavior. DARPA
(17:16):
said it would have military applications, helping soldiers recall critical events,
or letting analysts study how decisions are made over time.
But to privacy advocates, it was a dystopian horror story.
So think about Black Mirror now, but in two thousand
and three: your memories, your choices, your habits stored in
(17:39):
a government server farm and undergoing public scrutiny. The project
was quote unquote officially canceled on the fourth of February
two thousand and four, but on the very same day,
to the hour, according to some online sleuths, Facebook went live.
Speaker 2 (18:01):
What was Facebook?
Speaker 3 (18:02):
Well, apparently it was Mark Zuckerberg's idea, and it was simple.
It was a digital version of the printed facebooks Harvard
used to distribute to incoming freshmen, adding some social features: friends, relationships, statuses,
interests. And suddenly you have a profile-driven network that
(18:24):
mapped relationships and personalities. At first it was just at Harvard,
then Yale, Columbia, Stanford. Then it opened to all colleges,
and by two thousand and six to the entire world.
And what was everyone doing there? Uploading photos, logging locations,
sharing interests, tagging friends, expressing political leanings, joining groups, voluntarily
(18:48):
doing what Lifelog had once dreamed of collecting by force.
The platform became a mirror of the user, a data
stream as revealing as a diary. Now is where the
conspiracy deepens. What if lifelog didn't die. What if it
was merely privatized, transferring surveillance projects from public institutions into
(19:14):
the hand of private tech companies.
Speaker 2 (19:17):
And it's a pattern seen before.
Speaker 3 (19:18):
So think of Palantir, Google Earth's CIA roots, or In-Q-Tel
investments. By letting a cool platform take over the
data collection process, the government sidesteps constitutional challenges. People opt in,
They want to be seen; they crave the validation; they
(19:39):
update willingly. So Facebook becomes the trojan horse, the smiling
face of the surveillance state.
Speaker 2 (19:46):
Now there is no smoking gun that.
Speaker 3 (19:49):
Links Lifelog and Facebook directly, and obviously Zuckerberg's never acknowledged
any connection, and neither has DARPA.
Speaker 2 (19:58):
And they've not.
Speaker 3 (19:59):
Released any documents suggesting a handoff, but why would they?
But there are some curious overlaps. Facebook's data structure in its
early years mirrored Lifelog's proposed schema: social graph, media tagging,
timeline-based recall. DARPA researchers have gone on to work
with Silicon Valley companies including Meta and Google, and the
(20:22):
sheer scale of the data Facebook collects, such as biometric scans,
emotional sentiment, location logs, device data, political views, is more
expansive than even Lifelog had imagined. So is it just
a coincidence, or was Lifelog never canceled and just rebranded?
(20:44):
Whether it was a coincidence or it was a replacement,
or just a perfect storm of timing, Facebook has become,
by many measures, the largest behavioral mapping system in history,
and today it owns Instagram and WhatsApp. It shapes elections,
it censors content, it sells data. It even builds virtual
(21:07):
worlds where your avatar can live in an alternate reality.
Maybe Lifelog failed because it was too obvious, too
top-down. But Facebook? Facebook taught us to build the
cage ourselves, one photo, one like, one status update at
a time, and we didn't even ask for a key.
(21:32):
Then in twenty thirteen, thanks to Edward Snowden, the curtain
was pulled back. Edward Snowden's NSA leaks confirmed that private
firms and spy agencies share a bed, in a program named PRISM. Your
cloud backups, your contact list, your Skype calls, your Zoom calls,
all mirrored into a Utah data center. In the digital
(21:56):
prison model, this is the moment the guards found their
watchtower keys. Surveillance once meant being tailed by men in
trench coats, but post-Snowden it's baked into servers that
never sleep. Identity, biometric tracking, and behavior control are all
(22:46):
key parts, key elements to this digital prison. And you've
got the digital ID system. Now India has
Aadhaar, which links biometric data to every citizen, linking
transit and welfare, and critics warn of trackability and unauthorized sharing,
and India mandates biometric ID, so fingerprints and iris scans for example,
(23:12):
to over one billion people. Then you've got ID twenty twenty,
which is backed by Bill Gates and Microsoft, which pushes
digital identity as a human right, obviously, which raises the
question of who holds the key. Then you've got the
social credit system avenue, which China has rolled out, and
basically the pilot that they used in this credit system
encourages compliant behavior and flags nonconformist behavior with
travel bans, job limitations, etc. There was a documentary I
travel bands, job limitations, etc. There was a documentary I
watched on artificial intelligence and it talks about this social
credit system. And as a test of the system, this
(24:00):
guy decided he was going to light a cigarette on
a bus and he did that, which is obviously prohibited.
He lit the cigarette and had a smoke, and by
the time he got home, he'd already been fined, and
the fine had been taken directly from his bank account.
So this is the social credit system. However, there are
(24:24):
prototype programs in Europe and the United States which referenced
China's model, and it's a step towards global behavioral scoring.
And obviously this is not widely known. Your face,
your fingerprints, even how you walk is digitized; it's watched
and it's scored. And digital identities lock just as much
(24:48):
as they open. What about access to health as a
means of control? We had the contact tracing apps, didn't
we, and the vaccine passports, during the lockdowns for the
COVID nineteen pandemic. As we know, it wasn't a pandemic,
it was a planned scandemic, but there was the normalization
(25:12):
of health-based access: the World Economic Forum Digital Health Network
twenty twenty three, EU Digital COVID Certificates, airline mandates. In
twenty twenty five, the system tested gated societies: digital health,
digital access. And don't forget that this was a test.
(25:34):
The COVID scandemic was a test. It was nothing more
than a test to see how people are going to
react and how they can bring these systems in and.
Speaker 2 (25:45):
How they will work.
Speaker 3 (25:46):
So to board a plane or attend a conference, you may
have to wave your phone instead of an actual ID.
But do you trust the code that they've scanned? That's
the thing, because it's going to be a code, isn't
it, a QR code, to gain access to these things?
And then you've got things like financial clamps such as
(26:09):
the Canadian clampdown precedent. So in twenty twenty two,
Ottawa froze thousands of Freedom Convoy protesters' bank accounts, proof
that digital finance is instantly politicized.
Speaker 2 (26:24):
Don't forget that.
Speaker 3 (26:25):
What they did: these peaceful protesters had their bank accounts
frozen, and they did not have access to their money.
And this is what I keep saying to people: if
it is not in your hands, it does not belong
to you. This is how easily they can turn off
your access to your finances. In those banks, it's not yours.
(26:49):
And I've said it before and I'll say it again.
Under fractional reserve banking, banks only hold around ten percent
of the money that's on the ledgers, on their books.
When you look at your bank account, that's not money
that's yours. It's money the bank owes you. And you'd
better hope that eleven percent of people don't want their
(27:11):
money out, because if that's the case, the banks will crash.
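The arithmetic behind that claim can be sketched in a few lines of Python. The ten percent reserve figure is the host's, and the deposit and withdrawal numbers are made up purely for illustration:

```python
# Toy fractional-reserve arithmetic: with ~10% reserves, withdrawal demand
# above that level cannot be met from cash on hand. Illustrative figures only.
def can_meet_withdrawals(total_deposits, reserve_ratio, withdrawal_ratio):
    reserves = total_deposits * reserve_ratio      # cash the bank actually holds
    demanded = total_deposits * withdrawal_ratio   # cash depositors want out
    return reserves >= demanded

print(can_meet_withdrawals(1_000_000, 0.10, 0.09))  # True: 9% demand is covered
print(can_meet_withdrawals(1_000_000, 0.10, 0.11))  # False: 11% demand, a run
```

Which is exactly the eleven-percent point: the shortfall is structural, not a matter of any one bank's balance sheet.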
So programmable money: it's behavioral economics or behavioral control, isn't it?
That's what it is. It's behavioral control. CBDCs allow for
money that expires or is conditional: no political fundraising, no
(27:34):
non-approved meats.
Speaker 2 (27:35):
For example, you might buy.
Speaker 3 (27:38):
A steak. You go back a few days later
Speaker 2 (27:42):
And you fancy another piece of red meat.
Speaker 3 (27:46):
But your programmable money now doesn't work, because you're only
allowed one piece of red meat a week, because it's
bad for the environment otherwise. And there are many different countries
going to be bringing in central bank digital currencies. By
(28:06):
March twenty twenty four, over one hundred and thirty countries
were actively engaged in CBDC research.
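As an illustration of what "programmable" could mean in practice, here is a hedged Python sketch of the kind of conditional rules described earlier, an expiring balance plus a weekly category limit. The class, rule names, and figures are hypothetical, not any announced CBDC design:

```python
from datetime import date

# Hypothetical programmable-money rules, for illustration only:
# a balance that expires, and a weekly per-category purchase limit.
class ProgrammableWallet:
    def __init__(self, balance, expiry, weekly_limits):
        self.balance = balance
        self.expiry = expiry                # date after which funds are unusable
        self.weekly_limits = weekly_limits  # e.g. {"red_meat": 1}
        self.spent_this_week = {}

    def pay(self, category, amount, today):
        if today > self.expiry:
            return "DECLINED: balance expired"
        used = self.spent_this_week.get(category, 0)
        if used >= self.weekly_limits.get(category, float("inf")):
            return "DECLINED: weekly limit reached for " + category
        if amount > self.balance:
            return "DECLINED: insufficient funds"
        self.balance -= amount
        self.spent_this_week[category] = used + 1
        return "APPROVED"

wallet = ProgrammableWallet(100.0, expiry=date(2030, 1, 1),
                            weekly_limits={"red_meat": 1})
print(wallet.pay("red_meat", 12.0, date(2029, 6, 2)))  # APPROVED
print(wallet.pay("red_meat", 12.0, date(2029, 6, 5)))  # DECLINED: weekly limit
```

The point of the sketch is only that such conditions are a few lines of code once money is a ledger entry; whether any issuer would deploy them is exactly the policy question the episode raises.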
Speaker 2 (28:14):
And as of the first quarter of.
Speaker 3 (28:16):
Twenty twenty five, eleven countries have fully launched a CBDC,
including China, Nigeria, and the Bahamas. Retail CBDC pilots are active,
like I said, in thirty six countries, and twelve of
them are targeting cross-border functionality. Sixty two percent of
central banks cite financial inclusion as a primary motivation for
(28:40):
CBDC development in twenty twenty five. And I know there's
going to be a digital pound; they're currently talking about that
and looking into development, and I think there might be
a further announcement on a digital pound at the end
of this year or the beginning of next year. And
I know there's a digital euro further down the pipeline
(29:00):
as well. And another resource that I
read said that even your savings could have a timer
on them, so they could expire. So people go around,
don't they, working hard to pay the bills, feed their
families and to save some money. Well, apparently even this
(29:20):
digital money, these CBDCs, your money will expire at a
certain date. I don't know how long it will last, but
that's apparently what could happen, and possibly will happen. It
wouldn't surprise me, to be honest with you. And then
what about climate surveillance, sort of carbon rationing and eco policing,
(29:41):
carbon footprint tracking. So Mastercard, plus Doconomy, rolled carbon calculators
into apps and banking. They track every purchase, and all
the purchases that are tracked, which is every one, like I've said,
they track the purchase's CO2, obviously the carbon
footprint of this purchase that you've made. So you've
(30:03):
purchased meat, whatever, or you're purchasing fuel. So in the future,
once these limits are hit, they're muted, aren't they, these programmable funds.
So you've put too much fuel in your car this week,
you've used your card to pay, or, you know,
you've used your CBDC to pay for too much fuel.
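A transaction-level carbon score of the kind described can be sketched like this. The emission factors below are invented for illustration; they are not Mastercard's, Doconomy's, or anyone's real figures:

```python
# Invented per-category emission factors (kg CO2 per currency unit spent).
EMISSION_FACTORS = {"fuel": 2.3, "meat": 1.8, "groceries": 0.4}
DEFAULT_FACTOR = 0.5  # fallback for uncategorized spending

def carbon_score(transactions):
    """Estimate total kg CO2 for a list of (category, amount) purchases."""
    return sum(EMISSION_FACTORS.get(cat, DEFAULT_FACTOR) * amt
               for cat, amt in transactions)

month = [("fuel", 60.0), ("meat", 25.0), ("groceries", 120.0)]
total = carbon_score(month)   # 60*2.3 + 25*1.8 + 120*0.4 = 231.0 kg CO2
over_cap = total > 200        # hypothetical 200 kg monthly cap
```

The scoring itself is trivial; the significance is in the input, since it only works if every purchase is visible as a categorized digital transaction, which is the host's larger point about cash.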
Speaker 2 (30:24):
Well, what are you gonna do now?
Speaker 3 (30:26):
You need to get to work, but you can't, because
your card's run out of fuel, or you've used too
much, or they say you've used too much. It's basically
a form of rationing; this is
what it amounts to. They're gonna put you on rations.
If you go into your banking app, you've got online banking,
which I'm certain most people do have, so go on your
(30:47):
banking app and see if they're carbon scoring you.
I would imagine
Speaker 2 (30:51):
That they definitely are.
Speaker 3 (30:53):
But they can't carbon score you if you pay cash,
although they see the cash transactions coming out, how much you've
taken out. And then they don't know what you've done;
they don't know what you've spent your money on. It's private,
and they don't like private. And then we've got AI
censorship and pre-crime surveillance, and real-time content moderation.
Speaker 2 (31:16):
Social platforms auto.
Speaker 3 (31:17):
Flag what they have decided is misinformation. Algorithmic censors with
opaque rules. Governments gain backdoor access to these systems,
which is a pipeline into speech control. And here's an example,
a real-world example, of this speech control: how
these governments can shut down speech that they don't want
(31:42):
to hear. So the UK has what's called the Online
Safety Bill and it was passed on the twenty sixth
of October twenty twenty three, with full provisions rolling out
in mid twenty twenty five. And the scope is that
it applies to any user-to-user or search services
with significant UK reach, and global platforms are included,
(32:05):
and the key duties apparently are age verification for under-eighteens,
especially for adult, self-harm or extremist content, rapid removal
of illegal
Speaker 2 (32:15):
Content, which is obviously a good thing.
Speaker 3 (32:19):
Mitigating legal-but-harmful content seen by children, and transparency
on content policies.
Speaker 2 (32:25):
Ofcom can impose fines of up to
Speaker 3 (32:28):
Eighteen million pounds or ten percent of the global turnover,
and even block sites.
Speaker 2 (32:34):
Now here's the thing: BitChute.
Speaker 3 (32:38):
If you're not familiar with it, it's a similar platform
to YouTube, but you can say more on BitChute,
let's put it that way. It's now banned in the UK.
I can now no longer go on BitChute and
watch any videos at all. I have a BitChute
channel which no one in the UK can now watch
(32:59):
because of the Online Safety Bill. And the reason why
you can't access BitChute is
Speaker 2 (33:04):
Because the government over here deemed that.
Speaker 3 (33:09):
It hosted extremist, conspiracy and hate content. And in May
twenty twenty five, BitChute announced it would withdraw UK access,
citing harassment from Ofcom and refusal to comply with the
Online Safety Act. The platform said it could not meet
obligations to proactively scan every upload and user comment for
(33:34):
extremist content as required by Ofcom guidelines. Now just take
note, right: extremist content. Me saying that homosexuality is a
sin would be classed as that sort of extremist content,
even though this is biblical. It's to shut down freedom
(33:54):
of speech, isn't it?
Speaker 2 (33:55):
That's the reality. So, admittedly, there was probably
Speaker 3 (33:59):
Extremist content on there as well, I would imagine. But
the site's completely shut down now to people in the UK,
and it's an attack on freedom of speech. And Ofcom
is empowered to block non-compliant services under section ninety
two of this Act, and BitChute became the first platform
(34:20):
targeted under this power. So what does this mean? Well,
there's been a precedent set now: the UK now routinely blocks
or bans platforms that refuse to comply. There's privacy trade-offs:
the law pressures providers to weaken end-to-end encryption
and deeply analyze encrypted traffic, and there could be
(34:41):
global impacts: the United States or the European Union
Speaker 2 (34:45):
Users may now feel.
Speaker 3 (34:46):
The effects, because platforms may tailor their content by country
to stay compliant. What's the bottom line? Well, the Online
Safety Act is the UK's most powerful online regulation yet,
spanning child protection, content moderation and encryption. It forces platforms
(35:07):
to censor or risk UK site blocking, and
BitChute's self-exclusion is the first major example, and
this sets a global benchmark for digital safety policy and
sparks the debate about privacy, encryption and free speech online.
Speaker 2 (36:13):
Let's now talk.
Speaker 3 (36:14):
About the invisible bars of the digital prison, the kind
that exists not in code or in metal, but actually
in your mind. So it's psychological and behavioral conditioning. Because
if the digital prison is real, it's not just built
with devices and apps, it's built with behavioral science, neuroscience,
(36:37):
and then the manipulation of human psychology. And this isn't speculation,
it's documented history and there's a science of behavior manipulation.
Behavioral psychology has long been weaponized by advertisers, but today
it's turbo charged by data and AI. So let's explain
(36:59):
this now. Platforms like Facebook, TikTok, and YouTube don't just
show you content. They shape how long you stay, what
you think, and what you feel. Every scroll, every like
or pause on a video trains a neural
Speaker 2 (37:17):
Net not just about you, but for you.
Speaker 3 (37:21):
That neural net then manipulates what you see next to
reinforce behavior, just like a rat in a Skinner box:
reward, punishment, reward, distraction. And this is what researchers
call persuasive technology. It's tech designed to subtly alter your
choices without you even realizing it, so you're not just
(37:45):
using the app, the app is using you.
Speaker 2 (37:49):
There's the dopamine loop.
Speaker 3 (37:51):
Behavioral engineers have replicated the dopamine reward loop found in
gambling addiction. Notifications act as variable rewards. You never know
what you're going to get, likes, messages, outrage. You don't
know which one, and the uncertainty keeps you hooked. It
(38:11):
activates the same neural pathways as slot machines. This creates
digital dependency, not just on the platform, but on validation itself.
Now ask yourself, how free can you be when your
sense of self worth is being chemically engineered by someone
else's algorithm.
Speaker 2 (38:34):
Many of the.
Speaker 3 (38:35):
Design decisions you take for granted were intentionally built to
control behavior. So, for example, the infinite scroll, which equals
endless engagement. If you go on TikTok, you go on Instagram,
you can watch the videos and you can just keep
scrolling and scrolling and scrolling, and it never ends. There's
(38:55):
just more videos after more videos. Stories that disappear, which
equals urgency to keep checking. There are badges and alerts,
visual cues that trigger anxiety, and AI-suggested content. These aren't
neutral tools, they're deliberate triggers, and over time, they don't just
(39:17):
alter your behavior, they reshape your attention span, your decision making,
and even your beliefs. And that is a thing, isn't
it the attention span? Because a lot of these videos
on Instagram, TikTok, they are short videos, and you will
find yourself, if the video's maybe three or four minutes long,
you might not even want to watch it because you've
(39:38):
no attention span now. Everything's like a minute, minute and
a half, even less. Some of them are even less
than that. And the mind is the first cell in
the digital prison. It's decorated beautifully, but you are still
locked inside of it. So that's sort of the conditioning
part of it. But we're going to move deeper now,
looking into prediction, because after tracking you and shaping you,
(40:01):
the next logical step is to predict you and then
control you even before you act. So what are predictive algorithms?
These are systems that analyze your behavior and predict what
you will do next. They use historical data, clickstreams, purchase patterns,
emotional responses, typing speed, voice tone, and even eye movement.
Speaker 2 (40:27):
And what's the result.
Speaker 3 (40:28):
It's a behavioral profile so accurate it can forecast your
actions seconds, days, or even years in advance.
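A behavioral profile of this kind can be pictured as a scoring function from tracked signals to action probabilities. The sketch below is entirely hypothetical: the signal names, weights, and the simple logistic model are invented for illustration, not drawn from any real system:

```python
# Hypothetical behavioural profile: a few tracked signals reduced to
# action probabilities via a logistic model. Feature names and weights
# are invented; real profiles use thousands of signals.
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

# Invented per-action weights over a few tracked signals.
WEIGHTS = {
    "will_click_ad":   {"late_night_scrolling": 1.2, "pause_on_ads": 2.0, "bias": -1.5},
    "will_share_post": {"late_night_scrolling": 0.4, "outrage_engagement": 1.8, "bias": -1.0},
}

def profile(signals):
    """Turn raw behavioural signals into a cluster of probabilities."""
    out = {}
    for action, w in WEIGHTS.items():
        score = w["bias"] + sum(w.get(k, 0.0) * v for k, v in signals.items())
        out[action] = round(sigmoid(score), 2)
    return out

user = {"late_night_scrolling": 0.9, "pause_on_ads": 0.7, "outrage_engagement": 0.3}
print(profile(user))  # {'will_click_ad': 0.73, 'will_share_post': 0.48}
```

The output is the "cluster of probabilities" the episode goes on to describe: the model never needs to understand the person, only to score them.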
Speaker 2 (40:38):
And here's the kicker.
Speaker 3 (40:40):
Once prediction becomes precise enough, it enables preemptive control. So
then you're into predictive policing. AI watch lists, cameras, and
phone data enable pre-crime models. Machine learning predicts who
is at risk and watch-lists them before they act. Algorithms
(41:02):
decide who speaks and who gets stopped before we even know why.
You become a pattern and not a person. And the
real threat isn't just that they predict you. It's that
they no longer need to understand you as a human being.
Speaker 2 (41:18):
You become a.
Speaker 3 (41:19):
Cluster of probabilities: seventy three percent likely to engage with
this, fifty six percent chance you'll share that, eighty eight
percent risk that you will be a political dissenter. Law enforcement uses
this in predictive policing. Advertisers use this to trigger purchases
(41:44):
before desire even consciously arises. Employers use this in hiring
algorithms that never tell you why you weren't selected. In
the digital prison, the wardens are algorithms, and your sentence
is calculated in silence. Now let's get to the real
heart of this actual digital prison that's been created. The
(42:07):
bars may be built from data, the locks may be
made of cold, but the true control mechanism is fusion,
surveillance and prediction working together to quietly reshape society. Individually
they're powerful, but together they form a system of invisible governance,
a regime of influence where you don't need to be
(42:29):
told what to do because you're already being steered toward it.
So let's break it down a little bit. First, surveillance:
watching everything and forgetting nothing. Modern surveillance isn't just about
cameras or wiretaps. It's about total digital awareness. Every click,
(42:50):
every search, every GPS ping, every voice note. Every smart fridge, smartwatch,
smart car feeds data into a system designed to know
everything about you. But the most chilling part, it's not
just about what you do, it's how you do it.
Speaker 2 (43:10):
How long you pause on a headline, the rhythm.
Speaker 3 (43:13):
Of your typing, the tremor in your voice on a
customer service call, the heart rate while you're watching a video.
These are behavioral signatures that are always on. It's not
surveillance like Orwell imagined, with telescreens and constant monitoring.
It's softer, it's smarter, and it's voluntary because we carry
(43:37):
the sensors in our pockets and call them convenience. You
don't need your home wiretapped anymore because the wiretap's
already there. And people speak to the wiretap, don't they?
They say, hey Google, or hey Alexa. Make no bones
about it. Those things are monitoring you. So are smart televisions,
(44:00):
So, prediction: turning you into a forecast. Once enough data is collected,
it's not stored in a folder with your name on it.
It's processed, correlated, and modeled. The goal is not just
to record what you do, but to predict what you
will do. And this is where Big Tech and big
(44:21):
government meet. Predictive algorithms. They can now estimate when you're
most vulnerable to ads, what political idea will trigger you,
whether you'll default on a loan, or even if you're
a risk to national security before you've ever done anything.
Sounds dystopian, doesn't it? Well, it's already in place. Predictive policing,
(44:45):
social credit scoring, risk assessments in job hiring, content moderation
based on behavioral patterns, not violations. And here's the key.
Once your future is statistically mapped, you can be managed
in the present. What about the silent system? This social
(45:06):
control brings us to control without coercion. The new model
of power doesn't jail you. It guides you, it nudges you,
it invisibly restricts you. So let's say your online activity
makes you high risk.
Speaker 2 (45:22):
That would be me.
Speaker 3 (45:24):
I would be, one hundred percent, because of this podcast, the
videos I've put on YouTube, Rumble, Instagram. I'm high risk. I'm
already completely banned off TikTok. I've tried to open another
account just to test it, different email and everything, can't
do it and banned.
Speaker 2 (45:41):
So it must be an IP address ban.
Speaker 3 (45:44):
That must be what's happened. So you might find your
posts quietly shadowbanned. Yep, that's happened to me. For example,
you apply for a job and the job application mysteriously
goes nowhere. Your bank freezes your account pending verification. Your
name ends up on a travel watch list. No arrests,
(46:07):
no warnings, no confrontation, just digital friction, just your freedom
slowly thinning until you comply or you just disappear into irrelevance.
And this thing to do with being on a travel
watch list. There's people it's happened to. There is a
(46:29):
guy over here called David Icke. Some of you may
be familiar with him. He does say some very good things.
I disagree with his views on obviously God and everything
because he's not a Christian. But he's on a travel
watch list. I think he's banned from a number of
countries now and I think one time he had arrived
in Australia for a speaking engagement. I've heard him talk
(46:51):
about it, and they just wouldn't let him in and
they had to go home and all the speaking engagements
were canceled. I think it cost them thousands of pounds
to cancel these. They lost money because they'd all been
booked and everything. And that's it, he's banned, and this
is what can happen, you see. And they're doing it
because they want this man to comply. And it's what
(47:14):
the Korean-born philosopher Byung-Chul Han called the psychopolitical
regime: control by data, power through transparency.
Speaker 2 (47:24):
You're no longer ruled by fear, but ruled
Speaker 3 (47:27):
By optimization. And AI is the new bureaucrat. What's more
dangerous than a government bureaucrat watching your entire life?
A neural net with no empathy, no context, and
no accountability. AI systems are already replacing humans in decision
(47:49):
making processes across hiring, lending, housing, law enforcement, and immigration.
And these systems are trained on surveillance data and used
to make predictions that shape your fate. You can't appeal
to them. You can't argue with a model. You can't
ask a machine to understand your nuance. You are a
(48:11):
statistical risk vector. You're not a human being. And this
is the dehumanizing effect of predictive surveillance. So imagine now
a world where everyone knows they're being watched or scored,
a world where they're aware of it. You don't need
(48:32):
a dictatorship, do you, in a world like that. People
will self-censor, they will self-align with whatever behavior
avoids friction. They'll stop sharing controversial opinions, tone down their
criticism of the state. They won't attend that protest because
they know they're being profiled. This is social control through
(48:55):
internalized surveillance. Michel Foucault called this the panopticon effect: a
prison where inmates can't see
Speaker 2 (49:06):
The guard, but they know they're always being watched.
Speaker 3 (49:10):
Today, the panopticon isn't a tower, it's a cloud server.
And Christians who are listening to this now understand that
behind this is the Devil, because ultimately what he wants
to do is so control society, and control it to
(49:32):
the point where any dissent at all, for example, preaching
the Gospel, will result in you being pushed out of
society completely. And what does it talk about in Revelation?
By the mark of the beast, you won't be able
to buy or sell anything. This is where it's going. This
is the endgame, complete domination, and it's ultimately aimed.
Speaker 2 (49:56):
At stopping the Gospel.
Speaker 3 (49:59):
That's the ultimate aim of the devil, to stop the
Gospel going forth, to put this digital prison in place
so it so controls the people that they won't have
any of what are called controversial opinions, and controversial would be
anything that's obviously criticizing the state, or for example, you
(50:24):
speak out against homosexuality, you speak out against abortion, all
these things will be considered controversial and then obviously you'll
be shut out from society.
Speaker 2 (50:39):
Your accounts will be blocked.
Speaker 3 (50:40):
You won't be able to get a particular job. You
won't be able to go on a plane, you won't
be able to travel on a train, you won't be
able to buy certain things.
Speaker 2 (50:48):
And slowly, but.
Speaker 3 (50:49):
Surely, the digital prison bars will just close in around
you until your cell becomes even smaller and smaller. The
most obvious example is obviously again China's social credit system,
real time behavioral scoring that determines your access to housing, education,
or travel. In the West, it's soft control: credit scores,
(51:12):
social media shadowbans, algorithmic throttling, cancel culture. They don't
need to arrest you to stop you. They just reduce
your reach, slow your influence, deny your opportunities behind the scenes.
And prediction is becoming a tool of pre crime, the
silent mechanism that nudges society towards compliance without a gun
(51:34):
and without a trial. And we've been promised, haven't we,
a future of digital freedom, of open expression, of knowledge
without borders?
Speaker 2 (51:44):
And instead, what have we
Speaker 3 (51:45):
Got? A system that monitors us, predicts us, manages us
into obedience. Surveillance alone is oppressive, prediction alone is manipulative.
Together they create a system where control becomes ambient invisible,
and worst of all, accepted. The question now isn't whether we're already
(52:10):
being watched, it's what kind of future are we being
predicted into? And who benefits from it? But what makes
this even more insidious is consent. You agree to the terms,
you check the box, you downloaded the app. But were
you ever told that your data would train AI models
(52:33):
more powerful than any surveillance system in history? That your
identity would be reduced to predictive equations? That your freedom
of thought would gradually be conditioned out of you by algorithms?
We're not just living in a surveillance state. We're training it,
(52:53):
feeding it, and letting it learn who we are before we.
Speaker 2 (52:57):
Know ourselves up.
Speaker 3 (53:01):
And I always like to try and end with hope,
because there is.
Speaker 2 (53:05):
Hope for those who are born again of the Spirit
of God.
Speaker 3 (53:11):
What it can't do is get into your
mind when you're kneeling praying to the Almighty.
They can't hear your prayers. They can't hear your prayers. When
you're sat in the comfort of your own home reading
your Bible, they can't see what you're reading. If you've
(53:32):
got a physical copy of the Bible, they can't see
it. When you're in your car traveling to work and,
in your head, you're thinking upon
the greatness of the Lord and meditating upon him and
the great things He's done, and even praying silently, they
can't get to you that way. They can't take your salvation.
(53:54):
Your salvation cannot be taken from you and cannot be
lost for those that are born again. Christ said no one
can take them from his hands. No one can take
you from the Father's hands, nobody at all. So yes,
this digital prison is being set up, and it is
being constructed around us. Even as I'm speaking now, even
(54:16):
as you're listening, things are going on behind the scenes
that we're unaware of. The military have AI that is
probably thirty years more advanced than what we have in
the public domain. But like I said, they
can't take your salvation off you. So if you are
(54:36):
a Christian, don't despair. There is hope. Christ is going
to return. He is going to return, and he's going
to return in great power and great glory. And if
you're not a Christian, I suggest you get your hands on
a Bible, begin to read it, and seek the Lord
with your whole heart. Repent of your sins and believe
(54:58):
in Jesus Christ, because gruesome times are coming on
this earth.
Speaker 2 (55:03):
There is a great tribulation.
Speaker 3 (55:04):
Coming on this earth. The digital prison continues to be constructed,
but Jesus has come to set the prisoners free, and
those that the Son sets free are free indeed. Hopefully, guys,
this has been helpful to you. Some of the stuff
you may have already known about, but I just felt
that it was important to talk about it due to
(55:27):
some of the things I've been reading, and a lot
of this information is not commonly known.
Speaker 2 (55:32):
So hopefully it has been helpful and God willing as
long as I don't.
Speaker 3 (55:36):
Mess up the timings again, I will be back next
week with a guest for an interview about another fascinating topic.
Speaker 2 (55:45):
So that's it for me this week.
Speaker 3 (55:47):
I'm Paul, and this is Beyond the Paradigm.
Speaker 2 (55:55):
My crazy. We don't use that word in here.