
May 1, 2025 90 mins

The Hoover Institution Program on the US, China, and the World hosted Digital Authoritarianism and Strategies to Promote a Democratic Digital Future, on Monday, April 28, 2025 from 4:00 – 5:30 PM PT in the Shultz Auditorium, George P. Shultz Building. 

The People's Republic of China is collecting and analyzing unprecedented volumes of data from both public and private sources, within and beyond its borders, for social control. It is leveraging advanced data-centric technologies such as artificial intelligence, neuro and immersive technologies, quantum computing, and digital currencies to enhance and export its authoritarian governance model. This has led to an erosion of privacy, personal freedoms, and a climate of fear and self-censorship within the PRC. As the PRC exports its technologies to other countries, these authoritarian practices may spread globally. What are the most effective strategies for democratic societies to prevent the misuse of emerging technologies for surveillance and control by authoritarian regimes? How can we effectively track and monitor the global spread of data-centric authoritarian practices? What approaches can democratic governments and civil society adopt to develop and promote privacy-preserving solutions that offer viable alternatives to authoritarian methods, while ensuring accountability, transparency, and the protection of human rights? How can we engineer democratic values into the architectures of our technology platforms? In this event, our panel will examine the unique aspects of the PRC’s approach to digital authoritarianism and the opportunities for a democratic response. 


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
[MUSIC]

>> Larry Diamond (00:10):
Okay, good afternoon,
everyone who is physically here in the Shultz Auditorium and
everyone who is joining us virtually for this program that's going to
in part be discussing virtual technologies of various kinds.
I'm Larry Diamond.
I'm a senior fellow hereat the Hoover Institution,

(00:31):
working with Glenn Tiffert and Elizabeth Economy and
Frances Hisgen on the US, China, and the World Project,
which is one of the co-sponsors of this event with the National Endowment for
Democracy on Digital Authoritarianism and
Strategies to Promote a Democratic Digital Future.

(00:55):
I want to say this event is inspired by a paper authored by
one of our speakers that I highly recommend to you.
The title of the paper is a little bit different.
It's called Data-Centric Authoritarianism:
How China's Development of Frontier Technologies Could Globalize Repression.

(01:18):
And if you just type into Google Data-Centric Authoritarianism and
the author, Valentin Weber, who you'll be hearing from soon, who's written,
I wanna stress, this remarkable and invaluable paper, or
type in Data-Centric Authoritarianism and

(01:38):
National Endowment for Democracy, or any combination like that,
you will get this remarkable paper that you can read for free.
We're very pleased not only that Valentin is with us, but also the Vice President for
Studies and Analysis of the National Endowment for Democracy,

(02:00):
Chris Walker, who will speak after me briefly and welcome you.
And Beth Kerley, who's the Senior Program Officer for
Technology and Democracy at the National Endowment for
Democracy, who is going to join us on this podium and
has helped to conceptualize and contribute to this report.

(02:23):
I want to just make a few substantivecomments before I turn it over to Chris.
This event engages three broad themes inthe relationship between technology and
democracy.
First, the progress of technology.
What is the pace of it?
Second, the competition in technological development between autocracies and
democracies, and more specifically between the US and China.

(02:47):
And third, the diffusion of technologies, and therefore potentially
the diffusion of technologies of control and repression.
On the technology itself, there are interesting debates about the paces at which these are moving.
Valentin's paper talks about different technologies:
digital forms of surveillance, digital and biometric,

(03:11):
AI enhancement of digital surveillance, which may be moving at a different pace
perhaps than some of the further-out technologies like quantum computing,
and neuro and immersive technologies.
But we'll hear, and then you'll make your own judgments.
The second point: obviously, we're at a kind of different moment

(03:33):
now than perhaps when this project was conceptualized.
We've long since entered an era whenleadership in the development of critical
technologies has become contested,not only between different democracies,
but between, again,democracies and autocracies, and
more especially between the People'sRepublic of China and the United States.

(03:57):
And the rapidly emerging context, which I think really we need to remain,
you know, deeply mindful of, both from a technological standpoint and
from a democracy standpoint, a geopolitical standpoint,
is that China is surging into the lead in some areas of

(04:19):
technological innovation, and it has the potential at least
to achieve broader dominance if US policy shifts
(this is my opinion now; I won't attribute it to anybody else for the moment)
result in diminished funding for
research and development in the US in science and engineering.

(04:43):
And then, two, the emerging environment, not only of diminished funding for
R&D, but of restricted visas for foreign talent that we heavily depend on
in the development of these technologies and basic discovery,
risks undermining US leadership in many of these realms and,

(05:04):
frankly, further advantaging authoritarian competitors,
especially the People's Republic of China.
There are very few things thatworry me more than this right now.
All of this is happening, my fifth and penultimate point, at a moment when,
as our director of the Freeman Spogli Institute, Mike McFaul,

(05:28):
who's just finished a book on this, has walked into the room, and
who is our colleague as well at the Hoover Institution,
when the competition between democracies and autocracies is accentuating, and
when the global recession of democracy that's been underway for

(05:50):
nearly two decades is deepening.
And the final point, which we're going to hear discussed here, is
that the transfer of digital technologies of surveillance and
control from the world's most powerful autocracy,
which I call a neo-totalitarian system (in my view,

(06:12):
that is not a misplaced characterization of the People's Republic
of China), to less technologically sophisticated countries,
including some that are formally, and in other cases formerly, democratic,
that digital transfer of technology is accelerating at the same time.

(06:33):
And this is the final point: as the balance of power shifts in the world, and
if that shift continues,
democracies will have less leverage to monitor and
restrain this transfer, and they may wind up having less will to
aid democrats in these countries who are struggling to push back

(06:56):
against these technologicalintrusions to defend their rights.
So with that, I introduce you to the vice president of the most important
organization in the United States that helps people around the world to defend
their rights, Chris Walker of the National Endowment for Democracy.

>> Chris Walker (07:21):
So thank you very much, Larry, for that.
Let me just say a few more words to build on some of the ideas that
Larry shared. I'd suggest that this event is dedicated
to discussing a truly important digital challenge to democracy.
And the discussion is an outgrowth of work we've been doing with wonderful partners
here at Stanford, including a workshop we held last year.

(07:44):
I wanna take a moment to recognize Larry, Frances Hisgen,
Glenn Tiffert, and Elizabeth Economy, who's a member of our board.
I also would like to thank Eileen Donahoe for her leadership.
She's a member of our board, our vice chairman, who's here
with us as well.
And she's done so much on these issues over the years.

(08:04):
It's really helped us to situate our thinking at NED.
Right now, this work continues to help us situate the current
dynamics around the proliferation of China's authoritarian technologies,
and to look at how emerging technologies, from digital currencies

(08:25):
to quantum computing, may transform the way authoritarians collect,
process, and make use of digital data.
Today, the world confronts diametrically opposed visions of the future of freedom.
15 years ago,
many commentators assumed that liberation technology would lead to a world where
the free flow of information broke down walls of authoritarian censorship.

(08:49):
And that people facing repression would find new opportunities to connect and
organize, and that autocrats would find themselves on the back foot.
China's trajectory in the intervening years shows us otherwise.
In a system that closely entangles homegrown tech champions and
the party-state, a resurgent Communist Party is leveraging the power of

(09:12):
data-driven technologies to subject citizens to pervasive surveillance,
maintain a closed digital ecosystem that's censored and
suffused with state propaganda,
and identify new population-level methods to reward favored and
penalize disfavored behavior.
Rather than empowering people and fostering open debate,

(09:36):
technology functions as a lever to keep the powerful on top,
establish narrative control, and close down ever more space, online and
off, for manifestations of independent civic life.
While techno-totalitarianism as practiced in the PRC may for now be unique,
China's global economic, technological, and political acumen creates vectors for

(10:00):
authoritarian practices to spread globally.
Companies like Huawei provide the pipes, such as 5G networks, through which
information flows, and supply governance packages, such as safe cities, that
shape how local officials understand, and in some cases repress, their people.
The technologies and training offered by PRC vendors bolster

(10:23):
the practice of authoritarianism elsewhere in the world.
Meanwhile, the secret and often politically inflected deals that
surround PRC tech exports provide an opportunity for the CCP
to extend a shadowy web of influence over foreign political and economic elites.

(10:44):
Fellow autocracies from Russia to Iran to Belarus are leveraging the power of
AI surveillance to crack down on dissent, while Venezuela's ruling party has turned
a PRC-sourced digital ID system into a powerful instrument of dictatorial power.
In the UN and global technical standard-setting bodies,

(11:06):
the PRC is showing its resolve to act as a norm shaper rather than a norm taker,
strategically occupying key leadership positions and advancing norms antithetical
to basic democratic values, from public participation to free expression.
Over the past 15 years, the consolidation of the PRC's techno-authoritarian model and

(11:28):
the avalanche of democratic backsliding around the globe have offered
a stark reminder to all of us that we shouldn't rest on easy assumptions about
where the world is headed, or unduly limit our vision of the possible risks.
Therefore, the report authored by Valentin Weber,
which Larry commended to everyone and that we released earlier this year,

(11:53):
offers a glimpse of just how pervasive the dragnet of surveillance and
manipulation threatens to become, in view of emerging technologies that will
enable authoritarian actors to break into not only currently encrypted data,
but the basic privacy rights of our thoughts.
For its part, the National Endowment for Democracy is proud to lend itself and

(12:15):
its support to the courageous people around the world who are working to
shed light on secretive cross-border deals,
outsmart increasingly sophisticated authoritarian censors, and put technology
to work in the service of principles like transparency and public engagement.
And as we move rapidly into a quickly evolving, heavily contested digital

(12:38):
future, the democratic community can and must put forward a vision,
an alternative to the authoritarian one that's on offer from the CCP.
So with that, let me introduce the panel very briefly.
I'm gonna start with Charles Mok.
He's a research scholar at the Global Digital Policy Incubator

(12:59):
of the Cyber Policy Center at Stanford University.
That's a mouthful.
And he's a member of the Board of Trustees of the Internet Society and
a board member of the International Center for Trade Transparency and Monitoring.
Charles served as an elected member of the Legislative Council
in the Hong Kong Special Administrative Region,

(13:20):
representing the Information Technology functional constituency for
two terms, from 2012 to 2020.
Valentin Weber, who's the author of the report that's the basis for
the discussion,
is a Senior Research Fellow with the German Council on Foreign Relations.
His research covers the intersection of cybersecurity, artificial intelligence,

(13:43):
quantum technologies, and technological spheres of influence.
Valentin Weber is also a China Foresight Associate at LSE IDEAS, the foreign
policy think tank of the London School of Economics and Political Science.
And he holds a PhD in Cybersecurity from the University of Oxford.
And finally, let me introduce my colleague Beth Kerley, who's a Senior Program

(14:06):
Officer with NED's International Forum for Democratic Studies.
She's the editor of and
a contributor to the Forum's series of publications on emerging tech and
democracy, including the report we're discussing, Data-Centric Authoritarianism.
So those are the three speakers on our panel, and welcome again to everyone.

>> Beth Kerley (14:27):
All right, so thanks Chris and Larry, and thanks to all of you for
being here.
We appreciate the Hoover Institution hosting this discussion of our very timely and
important report.
So for the moderated portion of this discussion,
there are kind of three broad things I want to cover.
The first is this concept of data-centric authoritarianism: what it is,

(14:48):
how it looks in China, and how it's spreading.
The second is how frontier technologies are potentially changing the game.
And the last is going to be what we can do about it.
So starting at the conceptual level,
the title of the paper is Data-Centric Authoritarianism.
What's that idea all about?
What features of the authoritarian system that we see in the PRC,

(15:10):
the laws, practices, and bureaucratic institutions, make data important, and
make it make sense to think of these very different technologies that are discussed
as kind of contributing to the same political project?

>> Valentin Weber (15:23):
Thank you so much, Beth.
Thank you to the National Endowment for Democracy for making this report possible,
and also to the Hoover Institution for organizing this event and
bringing us all together.
So, Data-Centric Authoritarianism: the report is really
about what the role of data is in China's surveillance state.

(15:44):
And for a long time, China has relied on information on people,
on informants, on people who are in the security
apparatus, to report on dissidents and so on.
But really, in the early 2000s, there was this drive for digitization, for
putting up CCTV cameras and so on.

(16:06):
And throughout the last two decades,
really data became central to the CCP's idea of how it wants to govern.
So, kind of a scientific idea of using data every day to predict
people's behavior, to see in real time where people are, to
decide whether the police should crack down on a protest, or

(16:29):
when not to, because the police can't be everywhere, right?
If it's just a small protest, you might decide not to.
And that's really what the data gives you,
that insight as to what to do. And what is really a change, I guess,

(16:50):
and what is the core concept of this paper, is that these frontier
emerging technologies are at this time profoundly
changing again what the Chinese surveillance state is.
And so I looked at four technologies.
One is quantum technologies, and especially quantum computing, which

(17:14):
is projected to break current encryption in the next five to ten years or so.
And here the idea is really that the CCP would get
access to data that is currently protected.
Let's say you use the Tor browser in China, which is difficult, but

(17:35):
if you use it, you're protected by encryption.
In the future, that wouldn't be possible if it hasn't been upgraded to
post-quantum cryptography.
In terms of AI, that's the second technology.
The first one is quantum technology and the second one is AI.
And here really the CCP again has used it to make sense of very,

(17:57):
very complex systems, of millions of people's behavior, and
to look at the patterns.
And I think one of the core things of the article again here is
that the masses of data that the CCP is processing are growing and
growing, and also that control is becoming more centralized.

(18:20):
So we have at the moment these kinds of command centers where police use data
of cities.
And there is already a thing where it started in cities and
now it's getting to the provincial level, where provinces can see
what's happening in the cities in their provinces.
And so the current estimate is that, on screens,

(18:44):
the Chinese security services can see what around 20% of
the population is doing, over 200 million people.
So that's how centralized control has already become.
And that's because of AI.
The third technology is really the metaverse and neurotechnologies.
The metaverse is mostly AR and VR, and neurotechnologies can be invasive or

(19:09):
non-invasive, looking at thoughts.
And that's a technology which gives the CCP access to new data.
In the 20th century, they weren't able to look at thoughts,
but now it's getting easier, by looking at things like

(19:30):
pupil dilation and things similar to it.
And lastly, the last frontier technology that the report
looks at is digital currency.
And here again it's especially looking at what China did
regarding its central bank, which instituted the e-yuan and

(19:51):
which can potentially centralize the control over financial data within China.
And the core also of the report is not just to look at these
technologies separately, but to look at what all these technologies
together could do in transforming the surveillance state and

(20:11):
shaping data in a way that was previously not possible:
making it more centralized within China, giving new access to new data, and
giving access to data that might have been previously protected
from the prying eyes of the surveillance state.

>> Beth Kerley (20:28):
Thanks, Valentin.
So running sort of through those comments: this idea of centralizing data, right,
creating an individual locus of control based in part on instruments that
are already there, particularly cameras that blanket the physical space in China.
There's also of course the question of digital surveillance,
which maybe Charles could say a bit more about, and then integrating that using AI.

(20:54):
So there's a foundation of already existing surveillance tech and
the potential for these new capacities to augment that.
But before we get too deep into the specific areas of tech development,
Charles, I want to turn to you, for you've done a lot of different work looking
at the model of digital authoritarianism that we see in the PRC and

(21:15):
particularly how it's projected outward.
So anything that you want to add on the goals of
this pervasive web of digital control, and how we see it being promoted globally, for
instance, in technical standard-setting bodies?

>> Charles Mok (21:28):
Yeah, thank you, Beth, and thanks, Valentin, for the report.
So to me, if you look at the regime in China and
how they look at various aspects of their society,
including technology, business, culture,
education, everything, it all has to serve the party.

(21:53):
The party is the only goal.
So to me, when I look at how they have viewed technology ever since the Internet was
introduced to China, probably around, just like for the rest of the world,
the commercialization of the Internet in the early 90s, I think very quickly
they figured out that they have to find ways to control it rather than just refuse

(22:14):
to let it come into China. Rather, they allow it to come into China, but they have
to make sure that they can control it, make sure that it will serve the party.
So that's when you see that they have all these concepts and
developments, from the Golden Shield Project and the Great Firewall,

(22:35):
which is more a passive way to censor the Internet coming in.
But then again, very quickly, I think before people talked about big data,
they figured out the importance of data.
So they started to collect all this data before they knew how to analyze it,
before they had the technology to be able to analyze it.

(22:58):
So that, I think, evolved into the mechanism and
the philosophy in China to adopt further control
through surveillance, and also to use these mechanisms and
the technology to make it more of a propaganda

(23:21):
tool that they can take advantage of.
So if we look at the ways that they are trying to control and
work in the technical standards community, I think this is very similar
to the philosophy that China has had in
terms of thinking about technology as a whole in the last several

(23:44):
decades: technologies coming into China, and they start to adopt them, and
they start to find ways to control them and control their development.
So I would say that, number one, they are very well planned, but
they don't necessarily look immediately for perfection.
They don't need to have the whole strategy mapped out, but

(24:05):
they can be very quick to adapt, because they also realize that they
are in a developing mode, especially in the beginning.
So they don't mind, as we say in Chinese,
touching the stones while they are crossing the river.
They can figure it out as they move along.
So when they deal with the technical standards community internationally,

(24:30):
there are two different types of these organizations.
You would have the more top-down, nationally controlled, intergovernmental
ones, such as the ITU, the International Telecommunication Union, and
the working groups under it.
And then you will also have another type of standards organization,

(24:51):
such as the Internet Engineering Task Force, IETF, and IEEE and so on.
These are more multi-stakeholder and
bottom-up organizations, technical societies.
For both, they are trying to exert more and
more control through participation.

(25:13):
Because particularly in the latter, the multi-stakeholder and
open technical-society organizations, they are free to participate.
So they would provide resources,
and they would participate at a high level and frequency.
And of course, sometimes, if we look particularly at the UN organizations,

(25:39):
they also use the same strategy and participate at a very high level.
So for the last 20 years, I would say, or more,
they have been increasing this level of participation.
And in the last several years, I think they are trying to change the mode
of operation of these organizations in a couple of ways.

(26:03):
First of all, they would create and try to propose new standards
that would fit their philosophy of future technologies, including, we would
say, adopting elements of surveillance into the technology.

(26:24):
more secure network environment, right?
Because obviously today we have so
many problems with crimes andscams and so on on the Internet.
And so they're trying to figureout ways to help law enforcement
tackle these issues.
So that's number one. And number two,

(26:46):
I think the way that they are trying to change the system,
particularly the standards organization system,
is that they want to change the governance aspect of it, which is that they
are strategically trying to switch some of these technology standards organizations.

(27:06):
And the work that they do, from the multi-stakeholder organizations
over to the intergovernmental United Nations organizations,
which they feel gives them a better chance of controlling the outcome
and influencing the outcome through governments,
and through other friendly governments, or the Belt and
Road countries and so on, through their Chinese influence.

(27:29):
So I think these are the two typical ways in which, in the last 10 or
more years, China has been gradually trying to increase
its level of influence in these standards bodies around the world.

>> Beth Kerley (27:46):
Thanks, Charles. And I know some of these points came out of
a conversation that we held last fall as well, with our colleagues at the Center for
International Media Assistance,
looking at really the challenge facing civil society advocates and
supporters of democracy more broadly in
attempting to engage on standards in this new environment.
It strikes me that in both of the storiesyou tell, there's kind of a progression

(28:08):
that we see, from a more reactive, defensive mode to a more assertive mode, right?
So from tech as, okay, this is dangerous, how do we stop the free flow of
information, to tech as, hey, this is a map of our population that we can leverage for
control, this is really cool.
From trying to shift proposals within standard-setting
bodies that are maybe set up in ways that don't work so well for the CCP,

(28:33):
to actually trying to change the way they're run, to shut out non-governmental
voices. And when it comes to the dissemination of these norms,
the standard-setting bodies are one vector for that, right?
You can have authoritarian ideas about surveillance kind of baked into
your standard for smart cities.

(28:55):
Or, the case everyone is familiar with is the so-called
New IP, which was proposed and rejected a few years ago, and which
would have increased centralized state control over the Internet.
But another piece of this question that's really important is what happens on
the ground.
Because in China, of course,
you have a very well established physicalhuman authoritarian infrastructure.

(29:17):
Lots of investments in internal security, this idea of grid management,
all of this being combined with particular technologies.
And now these technologies are being exported all over the world, to very
different settings.
And there's a pretty robust debate among scholars about the extent to which
you can see PRC-like systems take root elsewhere.
So Valentin, you engage with that a little bit in the paper,

(29:38):
tell us about that debate and where you come down.

>> Valentin Weber (29:40):
Sure, yeah. So
it's really, the debate is about whether the Chinese model, or
the Chinese approach to surveillance, can also be implemented elsewhere.
And the main argument against it is that you can export the surveillance gear,
you can export CCTV cameras, but the police on the ground in countries,

(30:06):
in different countries, won't be able to use it.
Because what China has is a very sophisticated security
infrastructure, with organizations that have, you know,
large funding and are very, very, very sophisticated.
And so on that premise, you could maybe export it to Iran,

(30:28):
Russia, and so on, because those are countries
which also have very sophisticated security organizations.
But most other countries, developing countries,
wouldn't be able to take China's approach, because you would export the technology
and they just wouldn't be able to use it.
But I think on the ground, there's a very different reality.

(30:51):
We know that, and I would make a bet now and
say that Chinese surveillance tech can be found in every country across the world.
Whether it's being used by the public infrastructure or
by the state is a different question, but it's found everywhere.
And even the poorest countries, such as developing countries or

(31:14):
Venezuela, they do import the technology.
And even if they can't afford everything,
they have a priority, and that's regime security.
And because of regime security,

(31:34):
they will spend on those smart cities and so on.
And even if they don't have the money, China will find a way
of giving them the surveillance gear in exchange for, let's say, oil.
There was a deal between Ecuador and the PRC where Ecuador
gave the PRC oil, and it got surveillance gear in exchange.

(31:57):
So really, everyone, let's say, can afford it.
But then again, there is the question about whether they can
implement it if they don't have sophisticated police.
And there is really where the private companies, the Chinese tech giants, come into play.
We saw in Uganda that Huawei was training
local police officers to use their gear and

(32:21):
to get access to dissidents' phones.
And so we can really see that even though they might not be
immediately capable of doing it, they'll get the support
from Chinese companies to get access to those phones.
And so what's the lesson really here? It is

(32:44):
that they won't be completely able to copy China.
What China did is really remarkable.
It has its own tech giants, such as Tencent, Baidu, and so on.
So a country, let's say Uganda or another country, won't be able to do that, but
they will be able to buy the gear and
they will get help from China to implement it.

(33:05):
And China will do everything that it can to support that.
So I think really the reality on the ground shows already that
the surveillance gear is diffusing, and also that, you know,
the model is being exported and imitated abroad, and quite successfully.

>> Beth Kerley (33:23):
And I think that also brings us to a really important point
that it's not just the technologies that are going abroad, right?
It's often advisors, trainings, and so forth that could also export
certain ideas about how it ought to be used along with it.
You mentioned the role of companies and
I want to turn to Charles with onefinal question on this segment.
So any responses to that?

(33:45):
And also, just given the ups and
downs that we've seen in the CCP's relationship to some of the tech giants
over recent years, with a raft of regulatory and creative crackdowns,
I would say, to a seeming warm-up in the past couple of months or so, given, okay,
private companies are going to help us with the AI race, so maybe we need them.

(34:08):
What does that mean for the role of private companies from the PRC in this
export of digital authoritarianism?

>> Charles Mok (34:15):
Well, before that, actually, I wanna respond
a little bit to what Valentin has been saying about the way that China
exports the model and so on.
I recall that two years ago, I wrote a paper on the Great Firewall's

(34:36):
development over the last 20 years in China.
And one of the things that was actually in my report
was that China's Great Firewall model is very difficult to export.
I said exactly the same thing: that it takes huge resources,
a huge amount of human resources as well, in order to make it work.

(35:01):
So at the time,
we were referencing an example in Cambodia, because that country also
wanted to use Chinese technology to implement their own great firewall.
And it hasn't been successful, because of the lack of the same kind of
infrastructure, including controlling the telecom companies,
which they don't in Cambodia

(35:22):
as they do in China, with their state-owned company structure and so on.
But having said that, I think the reality right now is
that China doesn't have to export the whole thing.
They figured out that if these countries are not
as capable of doing it as China, they could still use, as you mentioned,

(35:47):
companies such as Huawei and others to fill in the gaps.
So these companies would be able to, and
this partly also answers part of your question as well,
these companies would play a role in the scheme of things for
China around the world in terms of providing training to these countries.

(36:10):
And they call them cybersecurity trainings, they call them anti-cybercrime trainings, which is what everyone would need, [LAUGH] right? And if the country doesn't have those resources and capabilities, they might as well even outsource it.
And also, an important part of it is that, if we believe that controlling this

(36:33):
infrastructure in these other countries would benefit China, in the sense that, if we worry about back doors and so on, if they really exist, then China would get access to the data, whether through the back door or the front door, whatever; they get access to the data too. So it just works to China's advantage no matter what.

(36:54):
But back to the question about the role of the companies. You know, with the relationship warming up since the crackdown on the commercial tech sector in China, starting about two, three years ago, and gradually warming up right now, I actually don't think that the situation has changed that much in terms of the relationship

(37:18):
between China's tech companies and the government and the party. Because to me, it's all in the family. It is like the children are not very obedient and they forget the values of the party, so I have to spank them a little bit and they cry.

(37:43):
And then they come back and become like the good kids again, and they, well, [LAUGH] they are obedient again. They obey. And also, not to mention that the global situation has changed as well. You know, remember that when they were starting, in 2020,

(38:07):
when they were cracking down on companies like Didi and Alibaba and so on, these companies were perceived to be getting out of the control of the Party, and the stuff that we have been talking about for the last two years, sanctions and so on, hadn't really started yet.
So the global relationship between the tech geopolitics at that time and

(38:31):
today, or the last two years, has changed a little bit too.
So I think, out of practicality and out of the fact that these companies are coming back to the fold of the party, this is the time that they, the Chinese government, actually need these companies. You know, they shape them as well.

(38:52):
You know, we've been talking about moving them from soft tech to hard tech. So in many ways, these companies that were focusing a lot on making money, quick money, on creating games and so on, are now making chips and AI and other hard tech that the government believes is more important for the country's future, or the party's future.

(39:15):
So I think right now, and even before, the government makes sure that they exert the right amount of control so that the companies serve the party's needs. And I think that really hasn't changed, but right now it is warming up a little bit, because compared to when

(39:40):
these companies were first cracked down on three years ago, the global situation has also changed. So the party right now does need these companies, both in the sense of real technology development and advancement, as well as in terms of propaganda purposes.

(40:02):
I mean, think of the ways that they have been using the success of DeepSeek domestically.

>> Beth Kerley (40:08):
Thanks. So that's a good explanation of why it makes sense to think of both the public and private layers of the PRC tech apparatus as pursuing common goals, particularly around this idea of security, defined as regime security, right? And another point that came out in there that I think is really important to keep

(40:31):
in mind is that when we're looking at the proliferation of PRC tech from a democracy point of view, there are at least two different risk angles. One, which we've been focusing on, is about potentially reinforcing authoritarian practices in the importing states: local law enforcement learns how to use facial recognition to identify protesters, and so on.
But the other angle is that a lot of the data may be going back to the PRC and

(40:55):
being used to train AI, or for other, more problematic purposes. Valentin, any comments on the private companies angle before we move on to the four technologies?

>> Valentin Weber (41:04):
Yeah, just a quick one. I think there was a lot of public debate, as you said, about companies misbehaving and so on. But privately, everyone knew who was in charge, because there are laws in the PRC, like the 2017 National Intelligence Law, which requires companies to share all data if requested, or

(41:25):
even proactively sharedata with the government.
And there are so many levers of power that the government has in order to make life difficult for companies. It was always clear that they have to stay in line with the party, whether at home or abroad as well. So I think that's more of a public display of, you know, we're privately owned,

(41:49):
we're independent, and so on. But really, if you look at the power relationship between the party and the companies, it's very clear, and it's always been the same: the party is in charge, and if it needs to, it will impose its power and its will on the private companies.

>> Beth Kerley (42:10):
Thanks. So we've got a picture at this point of a pretty pervasive web of authoritarian institutions spanning the public and private sectors, spanning online and offline, and this is already in operation and already being exported.
So to what extent is the development of the frontier technologies that are studied

(42:31):
in this report going to make a difference, right?
For instance, if we can assume that the CCP already probably has a pretty high level of access to people's information from Alipay and WeChat, how much of a difference would the transition to the use of the e-CNY, China's digital currency, make? And in general,

(42:51):
how big do you see the impact from the changing tech landscape itself being?

>> Valentin Weber (42:56):
Yeah, no, it's a very, very good question. And the question is really, why does China do what it does? Why does it want a central bank digital currency? What's the advantage, if it has access to data anyway? So let's say you're the state, and what you currently do: you have banks, you have credit card companies, you have digital payment apps such as Alipay or

(43:21):
WeChat Pay, and you get your information from them; you inquire from them. And so in that case, you really have a bit of friction there. You need to get the data, you need to inquire, you might not be sure if they give you everything, and so on. But by instituting a central bank digital currency, the e-CNY, you do away with that layer.

(43:45):
So you'd have direct access to the financial data of citizens across the country. But with the e-CNY, it's interesting, because it hasn't really gotten that much adoption. People already have the digital payment apps, they're accustomed to them, and even though there were incentives by the party

(44:09):
to get people onto that digital currency, it's been very, very slow, because people just don't see the point of adopting it.
And here it's actually an interesting case where there would be a wish by the state to move people in that direction. But because it's so convenient at the moment,

(44:31):
with the digital payment apps, people are still sticking with them.
But just to say very shortly, it would really do away with a lot of friction. It would give direct access, and more centralized control as well, to financial data, which is really crucial because it tells so much about people: where they spend money, where they are at a certain moment.

(44:56):
And so it's a really, really powerful way of understanding citizens better and controlling them better.

>> Beth Kerley (45:04):
So that's one example where authoritarian practices are already quite deeply entrenched, but the system of control could become even faster and more pervasive if you take away the things that get in the way. Charles, any thoughts on this question of how much of a difference frontier technologies make, and perhaps in particular the recent advances we've been seeing in AI from China?

>> Charles Mok (45:27):
Well, actually, about the central bank digital currency example: in addition to the aspect of usage that Valentin has been talking about, more on the personal level, the individual level, of having a digital wallet and the renminbi and so on,

(45:50):
There's also the aspect of the cross-country transfer of digital currency, which would enable the CCP and China to bypass the global banking system, which is something that they have always been trying to do.
(46:11):
I just remember, let me see. Yeah, I forget which country, but I was just reading news that they had just signed an agreement with yet another country, I forget whether it was in Asia, to do this kind of bank transfer or

(46:31):
digital currency transfer just very recently.
So anyway, this is actually not just working at the personal or local level. Not to mention the fact that they can use this to effect immediate change or immediate control on people's finances,

(46:53):
in addition to holding on to their data through getting the data from Alipay or WeChat Pay and so on.
But I always think that at some point, if the government really believes that it needs to enhance and increase control to that particular level, it could work with these companies and

(47:18):
just convert Alipay into a nationally controlled system if it chooses to.
But they probably think that this isn't the time to do it yet. They don't need to. But eventually, they might really want to push it. What I mean is, Alipay isn't an obstacle. In the end, they control it.

(47:40):
They can even nationalize the whole thing. So I think China is playing a long game in this regard, to try to carry out experiments with using CBDC to achieve its financial aims.

(48:03):
And actually, I mean, let's not forget that, I think, when China first got the idea to implement the e-renminbi, the central bank digital currency, in China, it was actually inspired by Facebook, because Facebook was actually trying to create its digital currency at the time on its platform, and

(48:28):
it was not successful because of opposition from the US Congress and so on. And then China picked up the idea and got the system running in two years. So they can learn and adapt. That's what I was trying to say.

>> Beth Kerley (48:46):
And a question to both of you, sort of building on that. Also, Larry at the outset mentioned we've seen a shift in the situation over the past year, in part due to apparent PRC leadership in some of these areas of tech development. So one of the things that's been on a lot of people's minds in the tech community lately is, of course, DeepSeek.

(49:06):
And I think that shifted the narrative a bit on advanced AI, and generative AI in particular. Whereas in the immediate post-ChatGPT fervor you could see a lot of commentary that this particular type of AI maybe is not well suited to authoritarian systems, because censorship limits the availability of training data, or because they'll be scared of it because it's unpredictable.

(49:30):
And so whereas the PRC is very good at biometric surveillance AI, for instance, the free world has an advantage in generative AI. But DeepSeek has clearly called that narrative into question. So how are we thinking about the implications of DeepSeek for

(49:50):
the surveillance state, and the PRC authoritarian system in general? How important is that going to be? Are they gonna succeed in threading the needle of employing gen AI to the max while maintaining a high level of censorship? Starting with Valentin.

>> Valentin Weber (50:07):
Yeah, DeepSeek was really, again, a game changer, and we saw it very interestingly. The CCP didn't see it, or at least doesn't in public see it, as a threat. We can see now thousands of police officers being trained in how to use DeepSeek; already, a few weeks afterwards, there were training sessions on

(50:28):
how to use it for writing reports, police reports, and how to use it to query large video data footage very effectively. So they're really, really embracing it. And especially poorer provinces are now also deploying it, because it's open source; they can use it, it's cheaper.

(50:52):
People, or provinces, who weren't able to roll it out previously can do it now. So it's really making AI even more widespread than it used to be. In terms particularly of censorship, could it be a danger when censorship isn't working,

(51:12):
when DeepSeek, the model, isn't censoring the content it should be?
So I think there it could really go two ways. Recently we did see that people can get DeepSeek to say things that it shouldn't, and you can get around it.

(51:33):
But I think there are already ways being developed to put that in check. And it's interesting that even here in the US, I think it's Anthropic, which is working on AI that is able to check another AI, right, to check that the AI is behaving.

(51:54):
So I think there are already solutions being put in place to make AI do what it should be doing. And I think China will of course also do that, if it's in its interest, to uphold regime security. So I think technologically it's possible that, you know, censorship will be upheld, but at the same time it could also go very,

(52:18):
very wrong, because AI is becoming increasingly autonomous. And we saw that AI can go a little bit rogue. There are cases where AI has disabled the oversight mechanisms that it had, or was diffusing to places that it wasn't supposed to go.

(52:42):
And I think there's a particular danger there, especially as China is already experimenting very strongly with agentic AI, which is able to execute decisions on behalf of the CCP. Right? It's not just giving analysis; it's able to act already.

(53:04):
And that's gonna be a real danger, I guess, if those AI agents are not understood properly, and if they're perhaps acting a little bit in a direction which might not be in the interest of the CCP. So I think at the moment it's still very much out there; it could be,

(53:28):
you know, a bit of an unpredictable gamble that the CCP is taking there. But they're really embracing DeepSeek as much as they can for the public security sector at the moment, and at the end of the day, I really see a danger there.

>> Beth Kerley (53:44):
And just quickly, before we move on, because you recently had a piece in the Journal of Democracy on this, and I think it's a bit different from how most people think about DeepSeek and generative AI: what do AI advances in the security context specifically mean, and in particular the use of agentic systems? What could that look like on the ground?

>> Valentin Weber (54:04):
Sure. I'll give you one example. One case is where you could query hundreds or, let's say, thousands of hours of video footage to ask: how did this car behave in the last 24 hours? Just that, a query and an answer.
But in terms of agentic AI, I would say the most advanced I've

(54:27):
seen currently is really that in cities, AI agents are already working together, coordinating with each other to implement orders.
Let's say you would have an agent in the commercial field, and you would have an agent in the transport field. The transport one would be,

(54:48):
you know, monitoring or executing traffic cameras and traffic lights and so on, whereas the commercial one would have a lot of data on the commercial field.
And the idea really, as it's conceptualized in China at the moment, is that these AI agents would be working together, and

(55:12):
that there would be a super AI agent, that's what they call it, coordinating these AI agents to get to a certain goal.
So your goal would be, let's say there was a protest, and you would tell the super AI agent: let's prevent that in the future.

(55:32):
And so the super AI agent would coordinate the field AI agents to prevent it in the future. It would tell one AI agent, identify for me all the people who were in that protest, and another AI agent, tell me where those people were in the last 24 hours. And together they would execute decisions. Let's say an AI agent could reach out to

(55:57):
all the people who were in that protest, and to all their contacts, to tell them that they shouldn't be talking to those people.
So it's really automating everything, basically automating control, in a way that was until recently unimaginable.

>> Beth Kerley (56:15):
Thanks for that picture. So there's a lot there to be concerned about, both if it does what the CCP expects and potentially also if it doesn't. Charles, thoughts on AI and DeepSeek?

>> Charles Mok (56:24):
So, well, I guess when we think about AI, I mean DeepSeek, from our perspective, we tend to think about all the potential issues and problems: privacy, security, data leakage, or disinformation. Or even from China's perspective, like Valentin, you mentioned that they want to censor, but

(56:47):
they are imperfect andthey make mistakes and so on.
But I think from China's perspective,they've been there before,
the whole Internet was like that, and theyfigure out a way to make it work for them.
So these are minor issues to Chinathat they have to fix [LAUGH] in order
that they get complete control.

(57:08):
And I think they still have the confidence that they are going to be able to be successful from their point of view. And also, to them, the most important thing at the moment is the race to adoption. So just like you also mentioned, they're adopting it in big ways in all kinds of aspects of society, business, the economy,

(57:31):
manufacturing, education, the court system, the military, and so on.
But I think the danger here is that they really haven't been focused at all on the safety aspect and the security aspect. And that might come back to bite them in the end, but we're not sure yet.

(57:51):
Of course, at the moment I think they're more keen on making sure that they win in this race to adopt, that they can have a leg up against their competitors, the Western countries and so on. So this is what they're focusing on at the moment. But I don't know how it will turn out in the end, whether or

(58:14):
not some of these issues will come back to haunt them.

>> Beth Kerley (58:19):
All right, thanks for that. So perhaps turning to something that will play into the answer to that question: democratic responses. There are a couple of different angles to this. One, what we can do to improve our efforts specifically to counter the proliferation of authoritarian technologies and practices, including from the PRC, and

(58:41):
then also the possibility of offering something different.
So first I want to give you both a chance to offer a few suggestions as to what we might do on the first front. What are one or two strategies that you think would be particularly effective in countering the proliferation of digital practices

(59:02):
and digital norms that stem from authoritarian systems? Whether among researchers, in standard-setting bodies, on the ground, or elsewhere, how can we raise our game in that area? Starting with Valentin.

>> Valentin Weber (59:18):
Yeah, so what can we do? I guess reining in technology diffusion will be very, very difficult. We have seen it: there was a coordinated effort to do that, and we still end up with Chinese surveillance tech everywhere.
But I think where I saw positive things is, you've got to look at where. In very

(59:40):
authoritarian regimes, it's difficult to prevent the misuse. But I guess in hybrid regimes it's a bit easier, or also in, you know, more swing states or a bit more democratic-leaning countries.
One such case was, I think, in the Philippines, where a smart city

(01:00:01):
was prevented because of an opposition senator who raised it in parliament. It was on national security grounds, saying, okay, this wouldn't work because of national security. In the Philippines, the military even prevented another smart city, because it was in a strategic location in the north of the Philippines.

(01:00:23):
So we can see already that that can be a factor. In Mauritius also, I think there was a very interesting debate in parliament, again about Chinese smart cities, and that was very, very good, again because there was a public debate in parliament and so on. So I think opposition politicians are really very, very important, and

(01:00:46):
especially regional coalitions of opposition politicians which face a similar challenge with Chinese tech. Bringing those people together can be very, very good, so that they exchange their practices and knowledge.
So I've been part of one such meeting amongst

(01:01:08):
opposition politicians in ASEAN countries, and they exchanged views on how to face Chinese surveillance tech. And I think that was really, really good.
I think another one is highlighting Chinese surveillance tech and bringing it out of the shadows.

(01:01:30):
Often people just don't know that surveillance tech is present. And so I think in Belgrade there's a very good project highlighting where CCTV cameras are, taking them out of the shadows. They created a map of all the Chinese CCTV cameras deployed in the city, and I think that can again bring civil society together.

(01:01:55):
You saw it recently also in protests against the government. It's really bringing people together and also focusing their attention on those issues.

>> Beth Kerley (01:02:05):
So, coordinating at the political level and making the public more aware. Charles, countering CCP digital authoritarianism?

>> Charles Mok (01:02:13):
First, about the standard-setting bodies: I think democracies have to reiterate the importance of supporting multi-stakeholderism, the bottom-up standard-setting process, really putting the resources into the effort and preventing the attempt by

(01:02:36):
some countries to move the standard-setting mechanism from the multi-stakeholder, bottom-up organizations, the professional bodies such as the IETF and IEEE and so on, over to the ones that are controlled by governments, such as United Nations and ITU working groups and so on.

(01:03:00):
So that is the first one. The other thing that I'm very worried about is that democracies are increasingly losing the moral high ground when we talk about these issues of digital authoritarianism. Part of it is because of cybercrime, which, ironically, is in very big part created by countries such as China as well.

(01:03:22):
But right now every country is scared about the impact of cybercrime and scams and so on on their citizens and all that. So you do see that the reaction, the knee-jerk reaction, from many governments is to say that we need to give law enforcement more power, to get backdoors to encryption and

(01:03:44):
to get the legal power to get data more easily from the platforms and Internet companies and telcos and so on.
That is, to me, a very dangerous trend. But we do see that, fortunately, there are still some countries that are standing on firmer

(01:04:04):
ground in terms of saying thatencryption is important for
the protection of everyone's privacy.
And we should not take a short sightedapproach to thinking that you give
the law enforcement the secret key andthen soon enough that secret
key must be accessible to the criminalsat the same time as well.

(01:04:26):
So there are some countries, or groups of countries, such as the EU or France and so on, that apparently are still sticking to that principle a bit better. The US only woke up to that because of Salt Typhoon, [LAUGH] the hacking by the Chinese,

(01:04:50):
from China, of the telcos in the US; and then they said that encryption is important. But they didn't say it because they really believe that privacy is important, so the US position, to me, is still a little bit uncertain.

(01:05:14):
So, my worry is that democracies are increasingly losing the high ground in this particular debate about security versus personal rights and privacy. And that's the second worry that I have. So finally, I think I'm echoing what Valentin is saying as well.

(01:05:35):
Educating people about the importance of their own privacy and the risk of using these Chinese technologies and apps. People were so happy that they could download the DeepSeek app and play around with it without considering or

(01:05:57):
looking at the terms of use, and so on. Obviously, nobody would. But maybe they should really get worried just because they know that this is a Chinese app. But nobody did. So DeepSeek is one example.
And the other worry that I have is, even in this country, for example,

(01:06:18):
when there was news about TikTok being shut down imminently, you then saw a lot of young people, or users, flocking over to Xiaohongshu, the Little Red Book or RedNote app.
And that to me is deeply worrying as well,

(01:06:39):
because it almost seems like these users were jumping from one Chinese app to another Chinese app simply as a protest action against a potential US ban. And I ask why; they should realize that

(01:07:03):
this is not a safe thing for them to do. But this obviously never crossed their minds.
So I think a lot more education about the danger and the risk of using these Chinese technologies needs to be done.

(01:07:23):
A lot more education and awareness.

>> Beth Kerley (01:07:26):
And one piece that I want to throw in there: you spoke a considerable deal about cybercrime and the risks from government approaches. And one landmark on a lot of people's minds is the recent adoption of a UN convention against cybercrime, which raised a lot of concern for human rights defenders because of some vague provisions that could potentially be weaponized by

(01:07:46):
authoritarian governments to basically define cybercrime as saying things online that the government is not too fond of.
Valentin, I know you also had some thoughts on the encryption piece.

>> Valentin Weber (01:07:57):
Yeah, absolutely.
So about the cybercrime convention, just very shortly: it's maybe more about preventing even worse things from happening, because if the West and democratic countries weren't engaged in it, there would be a cybercrime convention without them, and it would probably have been even worse. And it's not a good outcome at the moment, not an ideal outcome, that's true.

(01:08:23):
But yeah, so we're there in the international space, and it's very difficult, I guess. Regarding encryption, we're at a very crucial time at the moment, because people are still arguing whether there should be end-to-end encryption or not.

(01:08:44):
And we saw recently, also in the UK, end-to-end encryption being basically banned. And that's a very bad thing. It goes back to what you said: we've got to lead by example. And it's not just for our own national security, because it prevents China

(01:09:06):
from hacking us or getting very easy access to data. It's really about not just national security but also privacy; here they're on the same argumentative line. It benefits both. And again, as we come back also to the report, we're transitioning into an era where there might be quantum computing

(01:09:30):
being able to break encryption in the next five to ten years or so. And there's gonna be again a very crucial moment, a time of transition, which is also very dangerous, because we will very soon have widely implemented post-quantum cryptography.
And there will again be the temptation to insert backdoors into

(01:09:52):
those post-quantum cryptography schemes to give law enforcement access. But again, there we need to resist inserting any backdoor government access.
So I think here we've got to be careful now, and also plead for encryption that is not subverted by the government, because China is gonna

(01:10:15):
create post-quantum cryptography which will have backdoors in it. So we've got to have an alternative to that and really argue for not subverting it.

>> Beth Kerley (01:10:26):
And a last question from me before we open this up. So also please be thinking of your questions for our distinguished speakers. We were speaking just now about the negative side of the response, how to oppose the diffusion of digital authoritarianism from the PRC. What are one or two recommendations that you would make for

(01:10:47):
affirmative responses from the democratic community, to offer a more rights-respecting vision of tech development?
And I'd be particularly curious about your thoughts on the potential role of privacy-enhancing technologies like federated machine learning, and whether that could help to reduce the authoritarian affordances in frontier tech,

(01:11:08):
starting with Valentin. And if we could be quick, so we can leave some time for our audience.

>> Valentin Weber (01:11:13):
Yeah, sure. So one thing that I would say, really, is that we've got to keep innovating, because if we do have the first-mover advantage, we can really shape the standards that are out there, not only in international institutions but also on the ground. Right? If you export technology, you're also setting how things

(01:11:35):
look on the ground. You can set the ethical boundaries there, and be the first to set them.
Also, lead by example, as said on encryption. And really, at the moment, if we want to create a democratic digital ecosystem that's the opposite of the authoritarian

(01:12:00):
ecosystem, I think we have very, very isolated, you know, let's say democratic technologies. We do have Signal, which is end-to-end encrypted, very much going in the Internet freedom direction, being good for it.
(01:12:21):
We might have other technologies, but there is not yet an overarching kind of ecosystem. Right? It's small islands of democratic tech out there.
And I think the US government has done great things in proposing and giving first financing to a lot of these technologies; the Tor Project,

(01:12:43):
Signal, and all these technologies came about also with US government financing. And then they developed from there.
But I think definitely that should be continued, to create a broader digital ecosystem, and perhaps a more strategic vision as to how to create a more holistic system rather than small products, let's say,

(01:13:06):
which fulfill certain purposes such as messaging.

>> Beth Kerley (01:13:11):
That's an interesting thought: moving from DemTech as almost an exception to DemTech as normalized. Charles, the democratic digital ecosystem in two minutes or less?

>> Charles Mok (01:13:21):
Yeah, I'll follow on Valentin's point about keeping on innovating. But of course, if you want to innovate, you have to support research and support smaller companies and so on. So that is what's needed in this country, as well as maybe in Europe and other democratic nations.

(01:13:43):
And it's also very important for the government to take the lead in adopting these technologies as well. So if there are privacy-enhancing technologies, then is the government adopting them? Or is the government just following, or just focusing on consolidating the influence of the big technology firms and so on?

(01:14:08):
That is also needed. So they need to have policies that will support not just innovation, but also innovation by smaller players and so on. And of course, just like what you said, we have technologies, or applications, that are good and privacy-enhancing, such as Signal and so on.

(01:14:30):
But then again, it just takes a couple of not very smart users to use them, and then people think that these are not good technologies, which is a shame.

>> Beth Kerley (01:14:40):
All right, thanks.
And so from both of you, this idea that innovation can be something that actually helps get us to a better place in terms of pro-democratic technologies, and that a pro-democratic digital ecosystem doesn't have to be in opposition to it.
Questions from the room or from our audience online? Please raise your hand.
We've got some microphones in the back.

(01:15:02):
I'll try to take a couple at once and then give our panelists a chance to respond. Over there, please.

>> Speaker 6 (01:15:09):
First, thank you very much for this enlightening discussion.
I had a question for Valentin. In your paper you outlined seven critical steps for democracies or near-democracies to take.
And I think you guys touched on some of those during the panel discussion.
But I was curious about something in the paper.

(01:15:29):
Are you basically dooming the autocratic world to following in the footsteps of China and its technology and its know-how?
And it seemed like the steps that you were outlining were kind of defensive, for the democracies to take.
But I didn't see anything there that was kind of offensive, for us to prevent or

(01:15:55):
even roll back what would happen in the authoritarian regimes.
So I was curious if that's something that you guys could touch on.

>> Beth Kerley (01:16:06):
Thanks. That is an intriguing question.
Anything else from the room?
I just want to collect a couple to make sure that we get you in under the wire.
Yes, please, at the back table.

>> Speaker 7 (01:16:21):
Thank you for the great panel.
My question also follows the last comment that Charles mentioned, about the case of TikTok, for instance.
I just think that this would be a very pragmatic example to realistically discuss the idea of digital democracy.
And in a way when it comes to sort of U.S.

(01:16:47):
policies, how should we rethink democracy?
Does democracy itself really mean what it meant before, in, let's say, the 1940s and 1950s?
Right, the democratic order that the US kind of created through this sort of notion of consumerism and

(01:17:11):
notions of a sort of historical environment, through media, the creation of media and democratic media.
But today, with the situation that we basically have at hand, does that still work? Or should we even rethink and redefine those democratic environments, really?

>> Beth Kerley (01:17:35):
Thank you very much.
And do we have one more? Yes, the gentleman in the middle of the room.

>> Speaker 8 (01:17:42):
Thank you for the talk.
I'm curious, how good are the open source tools that are available?
How do they compare with the state-of-the-art surveillance that China has to offer?
My background is on Iran and, you know, in Iran there are ride-sharing apps, there is a lot of digital,

(01:18:03):
there's, you know, a thriving digital economy, and the government does have access to a lot of those data.
So I'm just curious, you know, in a country like that, in a context like that, do you really need access to, like, state-of-the-art technology, or can you just, you know, impose that kind of authoritarian regime with these open source tools?

>> Beth Kerley (01:18:26):
All right, thanks very much.
We've got three great questions.
So are there any strategies for actually going on the offensive in authoritarian settings?
What do we think about when we're thinkingabout the intersection of tech and
democracy?
How are we thinking about democracy?
And here I'd be particularly curiousif you have any thoughts on the use of
digital deliberation platforms, which is something we've talked about in previous

(01:18:51):
forums at NED. And then the comparison between PRC tech and open source tools.
So please answer the questions that feel most relevant to your experience, starting with Valentin.

>> Valentin Weber (01:19:03):
Sure, going on the offensive, yeah, it's a good point, because otherwise we just have a whack-a-mole thing and we're always one step behind.
So I would say really the most, let's say the most sustainable one, is just getting market share,

(01:19:23):
doing more to promote democratic alternatives in those countries.
But I think especially in authoritarian settings there are things that are being done at the moment also, I think, which are to give people the tools to reach information, right?
Anti-censorship tools, proper VPNs,

(01:19:45):
not Chinese VPNs, which are government-surveilled.
So there are things in giving people access to information.
So I think there are these things which are being done, which are good, which should be done.
Further and beyond that, I wouldn't know; maybe you have some suggestions, which would be interesting, for going and giving more access there.

(01:20:12):
But it's really about, you know, giving people, I guess, access to information, and that's what's being currently done, which is quite interesting.
Regarding open source, I would say even DeepSeek is open source.
But there are other open source technologies which are out there, and,

(01:20:35):
you know, authoritarian governments will use whatever they have at hand.
If it's cheap, if it's not as good, if it gets to the purpose that they want to achieve, they're going to use it, and they're not going to be, you know, too picky as to what they want.

(01:20:58):
But I think on the democratic side, open source tools are very much encouraged to proliferate and get out there in a very cost-efficient way, to support that.
Yeah, just to make the point that on both sides, the democratic and the authoritarian side,

(01:21:18):
there are these open source elements sometimes.
And it really depends on how it was conceived in the beginning by the producers and developers.

>> Beth Kerley (01:21:27):
Yeah, that DeepSeek is open source, and that it's part of the PRC digital ecosystem, is, I think, something that's a bit of a jolt for those of us who have been used to thinking of open source almost as an inherent part of the pro-democratic digital movement, right?
Which it still is in many cases, but people can design open source tools with different intentions in mind.
Charles, any responses on the questions?

>> Charles Mok (01:21:49):
Well, going on the offensive is an intriguing thought, but I can only try to answer the three questions by linking them all together, saying that I think the best thing that we can do is still hold on to the values behind the ways that

(01:22:10):
we believe these systems should be designed: that they are rights-enhancing, that they protect people's privacy, and that they are designed in such a way that people can distinguish as much as possible, or designed to protect people from disinformation and

(01:22:33):
so on, you know, all those important values.
My biggest worry is that right now democracies are acting more and more like authoritarians.
So how do you go on the offensive like that?
Right, because you're becoming like them.

(01:22:54):
Right, we are becoming like them.
So I think, going back to the basics, it might be that defense is the best offense, if we really hold on to the values, if we can change the current tide.
And I'm sure, thinking back to maybe the 90s or the early 2000s, why were there so

(01:23:18):
many people at that time in China trying to circumvent the firewall to reach information outside, and today they don't anymore?
So I think maybe I'm too idealistic.
I still think that maybe holding onto our values and defense, and

(01:23:39):
then defense will turn out to be the best offense for us.
And also maybe there are a few things that we can really do in concrete ways, such as supporting the development of these anti-censorship technologies or circumvention tools and so on.
But unfortunately, some of the funding that we, this country, provide

(01:24:01):
to the global community to develop these technologies is no longer available.
And the other thing about open source is that I think China does realize the issues, or the concerns, that they should have about open source technologies.

(01:24:21):
But at this point, I think they are less worried about it backfiring on themselves than about disrupting the US model of AI.
So they chose to do this.
So that's why I said China is not like a one-sided thing.

(01:24:45):
So they really can try to adopt, adapt to different circumstances and different purposes and requirements at the time, to decide what to do.

>> Beth Kerley (01:24:57):
Thanks very much.
And yeah, I do think there is something to that. Now it's sort of considered passé, right, to recapture that techno-optimism of the 90s, when people thought of technology as inherently liberating and so forth, and that original vision of the Internet.
But I do think there's something deeper there behind that impulse, that speaks to something much more enduring that we see in so

(01:25:19):
many settings around the world: human instincts for freedom.
And so it's worth thinking, and there's a lot of thinking to do, about the different strategies we can deploy to get back to that vision, where tech is not a lever for centralized control by those who know best, let alone something that itself knows better than us.
It's a system that gives everybody the right to speak up and

(01:25:42):
have a free voice in their own destiny and in the destiny of their country.
Do we have any additional questions? We have one online.

>> Speaker 9 (01:25:54):
Question here from the online audience.
And while I know it's early days still, I would be curious to hear from the panel if they can speak at all to what the policies of the current US administration might be, as far as we've seen, on the issue set that we've been discussing tonight.

>> Beth Kerley (01:26:12):
All right, thanks.
Any more? Last call for questions: speak now or catch us in the hallway afterwards.
All right, so since we just have a few minutes left, I'm going to ask, and perhaps I'll broaden that a little: any thoughts on what we think the prospects are in the coming years for

(01:26:37):
both US and perhaps global responses to the digital authoritarian challenge, and any closing words that you'd like to leave our guests here at Stanford with, starting with Valentin?

>> Valentin Weber (01:26:52):
Yeah, I'll start.
Okay, two thoughts, also on the US administration's policy.
I think one interesting example is encryption, and there we saw that the current US government is really supporting end-to-end encryption, which is interesting, I think.
And there could be a broader coalition also with lots of European countries

(01:27:16):
on that front.
And so I think that's one of the issuesthat could be brought forward.
And regarding, again, our vision of what we should do: I think often we have our own conception of what people want across the world, with our conception of our freedom, of what we want, of how we conceive things,

(01:27:40):
but I think we often have to think more also as to what they want.
Going back to the question on going offensive: let's say an offensive tactic would be providing information to people under authoritarian governments and bringing that information there.
But again, there we have to think like them and say, okay, what information are they even interested in,

(01:28:03):
what's the demand side?
And that goes back also to what Charles said.
People often are not interested in what's happening on the West Coast, in West Coast politics, or in Germany or somewhere else.
They're really interested in what's happening on the ground in their locality.
If there was an environmental disaster, they want to know what's happening there.

(01:28:24):
And so I think if we think about, you know, sharing that information, then we need to get information on these kinds of events to them.
So there will also be a demand side to the supply that we can potentially give to them with anti-censorship tools and things like that.
So I think that's the broader view.
We really need to see things as they are, as they see it, and not pursue our

(01:28:49):
own conceptions of what, you know, freedom might mean on the other side of the world.

>> Beth Kerley (01:28:55):
And I'll just foot-stomp that emphasis on anti-censorship technologies as something where there are really technologically fascinating things going on right now, with the use of satellite tech for anti-censorship, the use of AI for anti-censorship.
So it's a potential opportunity to innovate for democracy.
Charles, last words?

>> Charles Mok (01:29:13):
Okay, I hope that democracies can develop a more coherent strategy and positions around our values.
And right now my worry is also that the policies are too much concerned with fighting for technology or

(01:29:35):
AI leadership and commercial incentives and so on.
So we still have to remember that technologies can be neutral, and the values are often the most important thing, which shapes our policies.
So I wish maybe we could learn a few things from the PRC, in the sense that we

(01:29:58):
should have persistence in our policies according to our values, which is what they do, but we don't do as well as they do.
Different values, of course.

>> Beth Kerley (01:30:11):
All right, thank you, Charles, and so thanks to both of you for this rich and wide-ranging conversation.
And thanks again to Hoover for hosting us today.
Valentin Weber's report, Data-Centric Authoritarianism, can be found on our website at NED.
Thank you all for joining us, and have a fantastic rest of your day.

(01:30:32):
>> [APPLAUSE]
[MUSIC]