Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The pressure that foreign governments can exert on a global
company is really problematic, and that can lead to the
suppression of speech. Here in the United States, where are
we moving on net? I think folks having greater
access to knowledge and ability to share knowledge and speak
freely on net is going to drive society towards a
better future, or a more open future, than if we were
(00:22):
to try and restrict those things or hoard them ourselves. Right.
The problem is that regardless of whether you're printing pamphlets
or you're posting on social media, governments trying to control
speech is a problem all the time.
Speaker 2 (00:41):
Well, if you wouldn't mind, just to start off, kind
of tell me what you've got going on. I know
you're part of NetChoice, I know you're into technology
and that kind of stuff. But would you mind just
kind of giving our listeners a little bit of background
about who you are, what you have going on, and
then we'll kind of spur on our conversation.
Speaker 1 (00:55):
Yeah, absolutely. So my name is Patrick Hedger. I'm Director
of Policy at NetChoice, and I've spent almost
my entire career in free market and civil liberties advocacy,
and I kept finding myself on the same side as
NetChoice on several different issues, and last month, excuse me,
last fall, I decided to join the organization because I
think they're really on the frontier and at the front
(01:17):
line where the important battles are happening in terms of liberty, innovation,
the things that are going to be really important in
the twenty-first century, and protecting free speech but also
protecting free enterprise online. And also, technology policy is really
a great place to be, because in so many other
policy areas, whether we're talking energy policy, labor policy, things
(01:38):
like that, healthcare, a lot of the fight is trying
to get bad regulations off the books, and that's extremely
hard to do. We're in a nice position in
technology policy where we can prevent bad things from ever happening,
and that's a great place to be.
Speaker 2 (01:53):
So give me more background. When you say technology policy,
are you talking federal, state? Are you talking about,
like, making rules on how we regulate, you
know, AI? Or are you talking about Congress? Give
me a little bit more about that space that you
work in.
Speaker 1 (02:10):
Yeah, it's all of the above, because right now, the technology
companies and tech policy broadly, AI especially, have kind
of created this, I almost call it, first
day of prison effect. Right? They always say when
you get to prison, you've got to
beat up the biggest guy on the yard to assert dominance.
And that's what politicians are doing right now. The
(02:31):
biggest news stories are all about AI. The biggest companies
are all in technology, and so you see this urge
to kind of try and get their hands around that.
And you see that at the federal level, you
see it at the state level, you'll see it at
the international level. One of the biggest things that we
deal with is European Union regulations and the EU bleeding
into the rest of the global Internet. I mean, if
(02:52):
you've ever gotten online and seen a cookie pop-up,
that's because of the European Union. So international regulations can
affect you here domestically as well. And what we're really
trying to do is make sure at both the
state and federal level that we don't make some of
the same mistakes that we've seen globally, mistakes that have harmed
innovation in places like the European Union. I think we're
extremely fortunate to have Silicon Valley here in the United States,
(03:15):
but there are policy decisions that could make that not the case.
Speaker 2 (03:17):
So such as, like, what? Are there things going on
right now that make you say Silicon Valley might not
stay in the US? Is that kind of what
you were just hinting at?
Speaker 1 (03:28):
Well, it's essentially a fire that you could put
out, right? You could smother innovation here in the
US with the wrong type of regulatory regime. I think
what we saw from JD Vance when he went
over to the EU, went to the AI
summit there, and explained, hey, look, we're not going to
go down this path, because the European Union is very
(03:48):
proud that they have regulated technologies, but they are not
regulating their own companies, essentially. Because if you think about
the largest technology companies, the most influential technology companies, they
are either based here in the United States or they
have a counterpart in China. But despite the fact that
the European Union has double the population of the United
States and at the turn of the century was roughly
(04:09):
the same size economically, can you name a prominent
European technology firm? It's kind of hard to do. Who's
their Microsoft? Who's their Apple? There's no real equivalent, and
that's because of policy decisions. They rushed to get an
AI Act out the door. You've got the Digital Markets Act,
the Digital Services Act, the General Data Protection Regulation.
Speaker 2 (04:30):
And all these controls, is it control related? Like,
why regulate? Why do we need so much regulation in
the technology space? Do they argue it's to protect citizens?
Like, what's the reason for all this quick action? You
know, like you just talked about the EU, they're
really quick to regulate these things. Why do we need
so much regulation on technology?
Speaker 1 (04:51):
Yeah. I mean, to borrow a term from the great
Thomas Sowell, there's essentially a conflict of visions. In
the United States we have largely had what is known
as permissionless innovation. You can launch Uber, or you can
launch a company like Airbnb, and you can disrupt an industry,
and you don't generally have to go get government permission
to do that. Versus in the European Union, they're very risk averse,
(05:12):
and so you see this rush to try and regulate
every aspect of a technology before it even gets out
of the cradle. I mean, imagine trying to put in
place everything we now know we need to regulate
on an automobile right after the first Model T rolls off
the assembly line. That doesn't make a whole lot of sense.
So those are the two different approaches
(05:33):
that we see, and we're seeing that happen, unfortunately, in
the United States. There's been a rush, at the state
level in particular, to try and pass various AI regulations
aimed at preventing what is known as algorithmic discrimination.
There are a lot of
bills popping up here and there that are
really concerning. Very well-intended bills, but the onerous reporting
(05:58):
requirements, trying to prevent problems that we don't even know
would potentially exist, are going to stop companies
from even trying to put resources towards advancing technologies in
that space.
Speaker 2 (06:10):
I get the fear behind AI, right? Like, there's
something we don't quite understand, and you watch
the movies about AI, you know, robots becoming sentient
and taking over the planet. So I know there's
a lot of concern about AI particularly. But when
we use the word technology, what else
are you talking about them trying to regulate? What other
(06:30):
spaces? Technology is a very broad word. So I
guess, are there other sectors of technology? AI is
probably the hot-button focus right now, but
what other things go into this conversation that the
average person would be able to kind of gather?
Speaker 1 (06:46):
Yeah, that's a great question. I think probably the biggest
space would be the online media space, whether we're talking
social media, podcasts, you name it. There's a different speech
culture in the European Union versus the
United States. And even, unfortunately, in the United
States, both sides want to control what is said online
for different reasons. Thankfully we have that conflict, because
(07:10):
it's prevented a lot of government censorship in the United
States. But in the European Union, in
the EU member states, there's a
lot of agreement about trying to prevent what they define
to be extreme speech or harmful speech, and you end
up suppressing a lot of otherwise benign and lawful speech
as well. So there's that aspect of it that is
(07:32):
really problematic. And as we've seen, these technology companies that
were born as social media companies are doing some of
the most important innovation we're seeing in the quantum computing space,
in the AI space. So it's really important, because you
never know whether a certain technology company that launches doing X,
(07:53):
Y, or Z may be doing A, B, or C
that we could never predict down the road. So
overly regulating a company that's mainly engaged in speech, yeah,
it could harm AI innovation, space innovation, right? Who
would have thought a bookseller in the nineties would be launching
satellites to provide internet today? Right? That's Amazon.
Speaker 2 (08:17):
Side note, did you watch the Joe
Rogan conversation that he had with, oh,
I'm blanking on the Facebook guy. Mark Zuckerberg, yeah,
thank you. I was like, well, it's
not Elon, because that's Twitter. So with Zuckerberg, did
you watch that conversation at all?
Speaker 1 (08:39):
Yeah, I've seen a lot of it.
Speaker 2 (08:41):
Yeah, so I got through probably about three-fourths of it.
So this is my question when it comes to regulation,
and then we'll jump into other stuff. But this
is one thing I don't quite understand. So generally speaking,
for me, I think small government's better, especially federal; give
states rights. That's my political view. And generally, regulation is a bad thing,
(09:01):
generally speaking. Stop regulating things; let the free market take
care of it. With technology, it gets a little hairy,
because I can understand the dangers of, like, let's say
I put my personal information on this social media site.
What rules are there to protect my personal information? So that makes a
little more sense, that there's regulation there. Now, when Zuck was
talking to Rogan, because for a long time, people on
the conservative side, like, Facebook was the enemy for us.
They were the free speech deniers, and you would
get your account banned if you said anything remotely conservative,
(09:36):
for a long time. And then he goes up, and,
you know, whether he's doing PR control, whether he understands
the Trump regime is coming in, so he's trying to
save face, whatever that is, he was saying how it
was the American government and the EU, everyone, putting an insane
amount of pressure on them to keep certain misinformation,
(09:57):
what they would label misinformation, off, or hate speech off. I
guess, because you're in the space more, do you buy it?
Do you think that Zuckerberg was purely pressured, or do
you think that he's just walking it back right now
to try to save a little face? I don't know
if you're allowed to give your opinion on that.
Speaker 1 (10:15):
No, certainly. I mean, we have evidence, right? I
think the Twitter Files provided a great look at what
the government had been doing, at least at the US
federal level. But you think about the different types of
things that have been proposed over the years, whether it
is antitrust regulation or whether it is various regulations
to try and control the content moderation policies of these companies.
(10:37):
They're getting enormous pressure from all sides, whether it is
a European government or whether it is the US government
or state governments. And the really difficult thing in the
United States is that they're getting different pressures from state
to state, essentially. Right? You could have a Democratic trifecta
state that wants to see one thing, and a Republican
trifecta state that wants to see another. Online, there are
(10:57):
very subjective definitions of what is and what is not
harmful content. We have very limited categories of what is
explicitly illegal content. We know what those are; there are very
bright lines in US law on those things. But beyond that,
the definitions of what is and what
is not harmful content vary based on your political stance.
(11:19):
So that's a really difficult tightrope to walk. And the
problem with giving government so much power is they can
be upset with what you do over here on, excuse me,
the content side
of your business, and the way they get you, the
way they can pressure you to do something different there,
is by turning the screws over here on something else
that you're doing. And so that is why I think
(11:41):
it's really important to look at how the
government's able to exert pressure, and prevent that. Then
we can have conversations about what's going on at
companies once we're able to eliminate that side
of it. I think the Trump administration put
in place an executive order going after what we
(12:03):
call jawboning, right, pressuring companies to take down speech.
That should be codified, right? That's something that could change
as soon as the Trump administration is no longer in power.
There's a bill in the Senate that would do that,
so getting some sort of permanence there would be great,
and that way we can get more to the root
of the problem, if there persists to be one.
Speaker 2 (12:22):
Do you see that happening? Because, you know, regardless
of the side of the aisle that you're on, the government likes power,
generally speaking. Our governments, they want more, and they want more,
and they want more. Do you see the Trump
administration being able to pass a bill that is
more long term, that protects, you know, social media companies,
(12:44):
technology companies, from more government regulation? Do you see that?
Do you think that's an actual thing that might happen,
or will this just be every four years, you know,
we're just going to keep walking this tightrope back
and forth?
Speaker 1 (12:56):
Yeah, that's a great question. I would love to be
more optimistic. I hope that happens. I think that's
something that we should continue to fight
for and not be deterred from. But you're very right that
the government does want more power. It always does.
And I think it's really important for folks to understand
what the shoe looks like on the other foot.
I think the Trump administration's got a lot of experience
(13:18):
with that, right? You know, they have experience with
government being weaponized, and it's important not to set
precedents or load a bazooka that essentially will be handed
to your enemy in four to eight years. So I
think it's something we have to continue to push for.
But it's also why we need to be careful about
(13:38):
new frontier regulations on technologies like AI, right? Because you
may be regulating AI to prevent algorithmic discrimination, like I
talked about earlier, but that regulation can then be turned
around to exert pressure on companies to do things like
manipulating the content that AI produces to advance a
specific political agenda. So it's about reducing those pressure
(14:00):
points and those leverage points that politicians can abuse on
frontier technologies.
Speaker 2 (14:06):
Gotcha. I guess, do you see, so when I
look at the EU, for example, people are now getting
arrested for what they're posting online, and it's becoming like,
you know, you do hate speech and all
of a sudden you're in jail there.
What's the slippery slope that would get America to that point?
Like, what would it take? How close are we to
that right now? And if we don't act quickly, is
that something that Americans should be keeping their eyes on?
Speaker 1 (14:39):
Yeah, I don't think we're close to Americans being arrested
for that. I mean, thankfully, the courts in the United
States are still very defensive of
the First Amendment. That said, I think the pressure that
foreign governments can exert on a global company is really problematic,
and that can lead to the suppression of speech here
in the United States. You're not going to get
(15:00):
arrested for what you're saying, but the company may be
more inclined to not let you post it, or to take
it down after you've posted it, because of the
pressure they're facing from the European Union. I mean,
some of the fines that we've seen associated with EU
laws would amount to ten percent of global revenue for companies. Wow,
global revenue. That could bankrupt a company overnight
(15:21):
because you have violated a law in the EU, for something that,
whether it's competition policy or speech regulation, is
not even illegal here in the United States. And that's
why I think the Trump administration is very right to
go to the EU and say, you can't just be
taxing our companies like this; you can't be doing this,
because that is ultimately coming out of the pockets of
(15:41):
American citizens, and it's also harming the access to the
free and open Internet that American citizens have. Again, I
go back to: you're annoyed every time you go
to a different website and have to click through
all of those cookie pop-ups because of a foreign regulation.
So as for the idea that it could never happen here, yeah,
you're probably not going to go to jail for what you've said,
(16:02):
but you may have your access to valuable services reduced.
Speaker 2 (16:07):
What would it take for a company like Facebook
to just pull their services from countries that are,
you know, penalizing them so much with these fines?
Speaker 1 (16:18):
Yeah, that's a great question. I mean, what we've already
seen is that a lot of companies, particularly in the
AI space, have not launched their products in the European Union.
So we're not at the stage yet of completely pulling
out of those markets, but we are at the stage
where they're saying, look, this cool new thing that Americans
have access to, we're not sure it's in compliance with
(16:39):
this vague law that you passed; we're going to delay
the launch there, or you may never get it
at all.
Speaker 2 (16:45):
Yeah, because there are already countries, I mean, like, China,
correct me if I'm wrong, but China doesn't have Facebook,
right? Or is there access to Facebook in China? I'm
not sure on that.
Speaker 1 (16:52):
I don't know for certain, but most Western platforms are
banned in China.
Speaker 2 (16:57):
And it seems like more countries, like in
the EU, are going to that China model of
not letting their citizens have any type of free will.
So you'd think, naturally speaking, if Facebook does pull
things from the EU, that would force lawmakers to change.
I mean, they'd have to make a decision at that point.
Speaker 1 (17:15):
Right. Yeah. Unfortunately, it almost seems like that would be
their goal. The Europeans, you know, they are
doing whatever is possible to try and suppress this technology.
A lot of European Union officials seem perfectly happy
with the fact that folks would not have access to
free and open speech and resources that are controlled by Americans.
(17:36):
And that's really unfortunate. I mean, I hate to see
the Internet starting to get walled off. We expect
this kind of behavior from the Chinese, we expect it
from the Russians. We shouldn't be expecting it from Western Europe,
but that's precisely what we're seeing.
Speaker 2 (17:49):
That's a good point. I mean, would you say America is
like the last big world power that's really for
free speech? Which countries are left that, on a global scale,
want this? Because it seems like a lot of our
allies, who in the past you would have thought
aligned in ideology, just aren't anymore.
Speaker 1 (18:12):
Yeah, I think that's why the battle is so existential here.
I mean, I don't know of anywhere in the world
where free speech is protected to the extent that it
is in the United States, and a lot of folks
really take it for granted. But unfortunately, what we're also
seeing is that there are attempts to erode
those protections because you don't like what a big company
is doing, you don't like their content moderation policies. I
(18:34):
get it. Look, if that's being caused
by the government, we've got to address that. But if, simply
because a big company is owned by somebody that you
disagree with politically, you work to try and
erode their rights, you're eroding your own rights at the
same time. So we've got to be careful there. Because again, yeah,
I don't know where else in the world
you can have both free innovation and free speech online.
(18:57):
Who is really driving that? Again, if you look
at market cap and user bases, the top companies
are all either here in the United States or they
are Chinese-based firms. Sure, and, you know,
there's a very clear choice about what the future path is
for the Internet.
Speaker 2 (19:15):
And you just mentioned a really interesting point. I
think the Twitter Files
muddied the waters on what you're supposed to think. Because,
like, YouTube, for example, the reason why Rumble exists
is because YouTube demonetized people's videos and took them down.
So then you're like, well, YouTube has every right,
(19:36):
they're owned by Google, they're a company, they can make those rules.
But the question is, did they make those rules, or
did the government make them for them? And I think
that's the thing: when the Twitter Files came out, they proved
what a lot of people were thinking. You know, the
CIA and the FBI were literally forcing Twitter to
(19:56):
do different things, take down certain posts, boost certain
posts, whatever. And I think that's what makes
most people concerned with, like, YouTube, for example. Like, YouTube
is where I have my biggest platform, and I'm always
wary, depending on what I say. You know,
if we're going to talk about, you know,
the V word, right, the vaccine word, oh, all
(20:17):
of a sudden, this video is going to be throttled.
I already know it's going to be throttled. But is
that a YouTube thing, or is that a government thing?
In your opinion, at these big companies, how
much of their models is just what the
company believes is best? And how much do you
think is government pressure?
Speaker 1 (20:35):
I mean, it's very hard to say, and I think
it varies company to company, but there's no doubt
that there's been a tremendous amount of government pressure. I
think back to a bill that was introduced by
Senator Amy Klobuchar which would have basically conditioned, you know,
the really important statute for technology companies, which is Section
230. It would have conditioned Section 230 protections on
the companies having to listen to what the HHS Secretary
(20:57):
says in terms of what is and what is not
health misinformation. And so that is not just
behind-the-scenes pressure; that is overt pressure on content moderation
policies, like, we can put your business model at risk
unless you listen to what the Secretary of Health
and Human Services says. That is as close to a
First Amendment problem as you really get. And so I
(21:19):
think that has to be addressed, right? That is what
the companies are largely responding to. And I also look
at, when you're talking about trying to put in place
regulations, whether it is to get back at companies because
you don't like decisions that they've made, or because you're concerned
about some of the business models that they have, I
think about the YouTube-Rumble dynamic, right? YouTube, Google, has
(21:40):
a lot of the resources to be able to comply
with even the most onerous regulations you can think of.
They wouldn't love it, but they have those resources. A
Rumble doesn't. And so I'm always thinking about who is
that company that is still in a garage somewhere that
doesn't have the lawyers, doesn't have the accountants, and is
not going to be able to comply with the regulatory
moats that get put up. While trying
(22:01):
to control the major tech companies, you may actually insulate
them from any kind of competition.
Speaker 2 (22:08):
Do you think the reason why Rumble has largely been,
at least to my knowledge, untouched by the same push
as maybe Facebook, Twitter, or Google by the government,
is just a user base thing? Like, the government's just kind
of watching how many users are
using this, accessing this platform, and the moment it gets
to the point where they feel like too many Americans
(22:28):
are on this platform, they're going to come down hard
on it? Like, if Rumble grows too much, do
you think that's when the government will care and start
treating it like YouTube?
Speaker 1 (22:37):
Yeah, I mean, it's essentially a radar problem, right?
The larger you get, the bigger you show up on radar.
So there's certainly an element of that there.
But I also think we've been very fortunate that,
you know, we've been able to beat back a lot
of the worst potential regulatory proposals out there that,
while all of them are very well intended, would have
(22:59):
the unintended consequences, again, of stifling innovation and stifling competition.
I mean, the world without Section 230 is a
very difficult world for Rumble as much as it is
for YouTube.
Speaker 2 (23:11):
Are you familiar with the whole Parler situation that happened?
Do you remember the social media app called Parler? At
one point, I think in twenty eighteen
or twenty nineteen, it was the fourth-fastest-growing social media
app, and then it got blamed for J6,
and it was taken off of AWS and taken
(23:32):
off of everything. They're back now, I don't know
if you know this, they're back now, and I'm friends
with a couple of them, friends with the chairman
and some other people, and they're really great people. But
the thing that I'd love to get your
opinion on with that whole situation is, it was very
easy and quick for a social media platform that had
over twenty-three million users at one point, I mean
(23:55):
more than that, but I think there are still currently
twenty-three million people that have Parler downloaded on
their phone, like, actively, right? So it was
not a small social media platform. It was the
original free speech platform, you know, that is what they
branded themselves as. But it was kind of easy for
it to just disappear. It just seemed to, overnight,
(24:16):
the government said, nope, you shouldn't exist anymore. AWS said, okay,
we'll pull our support, and it was off the app store.
It's a little scary how quickly a social media app
that big can just go away. Should there be protections
for companies, so that if the government doesn't like something they did,
(24:36):
it can't just cease their existence with
the snap of its fingers?
Speaker 1 (24:40):
Yeah, that's a great question. It's a difficult one, right?
It's not one with an easy answer. There's no
silver bullet here within the context of American law. But
here are some things that I would suggest. First and foremost,
we need to make sure that we have a
competitive dynamic at all levels of the Internet stack, right,
whether that is at your ISP level, whether that is
(25:01):
at the website level, or at the data center level,
right, who is storing your data, who is powering
your website with the cloud services that protect websites and
allow them to function. And the way you best
ensure that is, again, by not creating a regulatory structure
that stifles the ability of companies to get off the
(25:22):
ground at all levels of the stack. And there's a
lot that we can still do, whether it's at the
ISP level, whether it's at the cloud computing level, or
whether it's at the edge level, to ensure that companies
are getting access to capital and getting access to
the services they need to stay online. I mean, a
great place I would point folks to: if you
look at initial public offerings, IPOs are way down in terms
(25:47):
of when companies are launching; they're launching later and at
larger sizes, which suggests it's becoming a lot harder to
get public investment because of SEC regulations. So there's
one little part of the solution, to allow companies
to gain the resources they need to continue to function.
And I worry that a lot of the solutions that
(26:07):
I've seen proposed in response to this situation would actually
make it easier in the long run for government to
exert pressure and take down companies or services that it
doesn't like in the future. Because, again, if you create
this power and you say you're using it for good,
you don't know who's going to be using it down
the road, and that is something I want us to
(26:29):
be really, really cautious about. So if you're upset about
what happened there, and you say, well, we're going to
use antitrust law to break up the companies
that did that, okay, well, that can be turned around
just as quickly to break up companies.
Speaker 2 (26:42):
That are good.
Speaker 1 (26:43):
Yeah.
Speaker 2 (26:44):
Yeah. Are you familiar with blockchain at all, the blockchain technology?
That's what Parler did; that was their solution. I'm
not sure if you know all this; my listeners might,
so I'll give you a little bit of what's going
on there. And this is a unique thing that I'd
love to get your thoughts on, because you're in this world.
So when they
got banned, you know, the new owners bought it back,
(27:05):
in September of last year, and then they built a
blockchain called Optio. I'm actually friends with Brian, who is literally
the guy that built Optio. So it's a decentralized platform,
like their own AWS, that they built. And the theory
is that now they are not on AWS anymore;
they have their own blockchain that powers their app,
(27:26):
which is now decentralized and untouchable by government. A, your
thoughts on that? B, do you see more
and more companies going down this path of using
cryptocurrency technology to try to protect themselves from government regulation?
Speaker 1 (27:44):
I would say that I increasingly see companies wanting to
get a lot of these decisions out of their hands
because they're sick of like the pressure that they're getting
from either side, and that often kind of conflicts with
each other. And so that's what you've seen with the
approach that Meta is taking now where they're doing the
community notes approach to content moderation, I think you see
a lot of companies, again, yeah, wanting to empower users
(28:07):
because again, a lot of the things that these companies
are dealing with in the content space are fundamentally subjective. Right,
something that is abhorrent and violent to one person may
not be to another, right sure, or something that's distasteful,
you name it, right, these we're talking about fundamental debates
that are very again based in subjective values, and that's
(28:28):
a really difficult thing for companies to navigate. So to
the extent that they can decentralize systems and empower users, yeah,
even the largest companies are kind of going down that route.
And we've seen a new push by several companies in
the online safety space as well to re-empower users, right,
you determine what the filters are, right, and here's where
(28:49):
our tools are. Right. Instead of us turning them on,
you turn them on, and here's how you do it. So, Yeah,
whether it's through blockchain technology or whether it's through things
like community notes, the ability to empower users and kind
of get that decision making out of the individual company's hands,
I think is something that's going to be really important
going forward.
Speaker 2 (29:09):
Do you think it's just as simple as just a
permission slip when you log into an app, basically saying
like this app's kind of like the wild Wild West,
you know, like a like a you know when you
go to a trampoline park and you have to sign
a waiver before you jump on the trampoline, saying like,
if you get hurt, it was your choice and you
chose to be here. We are not liable for anything
(29:31):
that you did that hurt yourself, right. Do you think
that that's the legal play for social media where they
can say, hey, look, it is on the user what
they see or don't see, and what they allow what
they don't allow. It is not our fault if they
see something that they think is hate speech, or they
see something that they think is misinformation, like that's on them.
Is that kind of the play moving forward to try
(29:52):
to remove liability, or will the government just not let
that slide because they'll still be blaming the platform for
housing those things.
Speaker 1 (29:59):
Yeah, that's a great question. I don't think that pressure
is ever going to stop from government, and we still
see it today, Right. There are still government officials that
get upset about what you see on cable, right. And
I think it's important that we recognize that we
are just dealing with a new type of technology
and a new moral panic that has existed with every
(30:19):
type of new media that has ever been launched. Right.
We had this debate in the nineties about video games,
We've had it about explicit music and movies. Right. And
we've increasingly seen companies come together and produce voluntary rating systems,
which I think is really important. Right. We want
this to be happening outside of government. Right. We
want families, We want users and companies to be able
(30:41):
to work together to kind of empower each other to
create a safer space. Right. We saw that with the
movie rating system. Right, when you go to a theater
and you can't get into the R-rated movie because you're
under seventeen. That's a voluntary private system that is doing that.
Or the ESRB system with video games, that is a
private and privately managed rating system. Right, the government is
not saying that that game is mature. It's the video
(31:02):
game companies and trade association that says, hey, we're going
to manage this rating system. So I think empowering companies
to kind of be able to come together and work
on those things would be great, but a lot of
them are afraid to do it right now because you
could be seen as colluding under antitrust law. So I
think sometimes in certain cases the government pressure is harmful
(31:24):
towards these ends.
Speaker 2 (31:26):
I mean, I'd argue that government pressure in anything is
generally not a good thing, because, like you said, even
if let's just say we had someone that we like
in power, so to speak, and we like the pressure
they're doing, and they allow that to happen, the moment
that power flips and they have that power, now, it's
just going to be a ping pong match moving forward
(31:47):
of just four years of this, four years of that,
four years of this, Right, that's what we want to avoid.
Speaker 1 (31:51):
I mean, we saw this. I think if you look
at regulation at the internet service provider level, there was
a ping pong back and forth about the issue of
net neutrality, and say whatever you want
about that issue, whatever side of that issue you're on.
That ping-ponged back and forth a lot, and that
caused companies to say, well, we're unsure about if we're
(32:12):
going to invest or not, because we're not really
sure what kind of regulatory structure we're going to operate under.
And so you don't want to see that spread to
other industries as well, where you're not getting new product
launches because you don't have an understanding of what am
I going to be dealing with four years from now? Right,
Companies don't look on four year horizons. They're looking on ten,
twenty, thirty-year horizons.
Speaker 2 (32:31):
Sure, which is very interesting. This is
just a side thing. The show is very Joe Rogan-y,
where I'm not a great interviewer, we just kind of have
like cool conversations. Speaking of having long term outlooks, I
watched this thirty minute documentary last night on Saudi Arabia.
Have you heard about the line that they're building, like
it's called the Line, and it's like, this
(32:53):
is just gonna be this massive city right through
the desert. And then they're also apparently building like a
resort thing and they're building like a like something where
you can ski and like just doing all these things
right because Saudi Arabia sees the writing on the wall
of companies trying to get away from oil, and all
they have is oil, and so they're trying
(33:17):
to come up with something that will, you know, be
the next thing. This is their next cash cow,
and they're thinking tourism is going to be that or whatever.
I don't know, have you heard of that at all? Because
it's kind of wild.
Speaker 1 (33:28):
I have heard of the line. I'll credit them for
their ambition. We'll see, we'll see if it ever
comes to reality.
Speaker 2 (33:37):
It's just one of those things. And I if I
offend any of my Saudi listeners, if I have any
of those, I apologize. But generally speaking, Saudi Arabia is
not what you'd consider as a vacation area. Like they
have not been very kind historically to Westerners visiting, and
now they're trying to just do a one-eighty
(33:57):
where they're actually like, hey, you know,
we're cool, come on in, have fun.
Speaker 1 (34:03):
It'd be great.
Speaker 2 (34:04):
And I'm just like, if it works, cool. First of all,
if it works, you know, yeah, that would be sick.
I have so many questions on how you're going to
get power out there and through all that, forget that.
But it is a unique thing of basically, you have
a country that's like, our gold, so to speak, which
is oil, thirty, forty, fifty years from now might not be
(34:27):
as needed as it is now, and we have a
big problem because that's all we really have. And so
they think that tourism is their answer, which is
unique, I think. I think Dubai
is probably, they saw what happened in Dubai, and
they're like, we can do that here and make it better.
That's probably it, in my mind. I don't know if
we're just randomly talking now, but I don't know if
(34:48):
you have any thoughts on that. It's interesting, you know.
Speaker 1 (34:51):
I would say it's important for governments to understand
that markets are very fluid and that the future is
pretty hard to predict, and that you shouldn't rest on
your laurels, right. I think I worry about that in
our own government, where they assume like, well, we'll always
have these trillion dollar companies that are producing these incredible
(35:15):
services that are quantum computing and AI and things like that.
That's not necessarily the case. You never know what the
next disruptive technology will be. I mean, there could be
a kid in his basement or his parents' garage right
now that has already stumbled upon some sort of discovery
that's going to upend, you know, one of the larger
companies overnight. You just don't know. So I think it's
(35:36):
always good for governments to have sort of an open
ended vision of the future. I don't know about
sort of command and control and central planning of exactly
where people are going to live and in what shape
they're going to live, Like we're all going to live
in a big long line in the desert. Maybe that's
a little bit too much central planning.
Speaker 2 (35:55):
But literally central planning.
Speaker 1 (35:57):
Yeah, but being future-oriented and saying, hey, we want to
build cool stuff, all the power to you.
Speaker 2 (36:04):
Yeah. The one thing I will tell you this and
we'll go on to the next point. But props
to the Saudi government, because they actually paid the people
that they were removing from the desert. There were
like towns and stuff that they removed. So they actually paid
them to say, hey, we're going to help you start
a new life somewhere. We're gonna give you money, but
we're literally taking your house whatever. Right, so they at
least paid them, but then if they said no, then
(36:26):
they killed them. And so it was one of those
things where, like, there's a guy that
went to social media and he was just like, I'm
not moving, they're trying to take my home. I'm not
taking their bribes. And then he was dead like the
next day. And so it's just this unique because it
is still like you have a king of Saudi Arabia,
Like it's not a democracy like it's and that's the
(36:46):
thing that's to me. It's not even the technology. Can
they pull it off? That's not to me.
Speaker 1 (36:52):
The issue.
Speaker 2 (36:52):
The issue to me is, are Westerners ever going to
feel comfortable going to a place that's kind of
like communist China, like, that has the charade of beauty,
but behind the charade is just a dictator. And like
if Putin created the coolest thing ever and said Americans,
(37:14):
you're welcome, You're gonna be like you know what I
mean? Like that. I think that's gonna be
their biggest hurdle, is convincing Westerners that they would feel
welcome and safe there.
Speaker 1 (37:25):
Yeah, I mean, and this point kind of reminds me
of what I'm really concerned about if I look at
the global picture and where we are in terms of
what the future of technology and the internet looks like.
The next billion people that come online and are coming
online right now, Sub-Saharan Africa, Southeast Asia, they're
getting wealthier, and they're getting access to the internet. They're
(37:45):
getting smartphones. It's great, right, and these are things like
Starlink that are allowing them access to the global Internet,
and they're accessing services like microfinance to kind of lift
themselves up, which is fantastic. But the question is what
is the future? What are those billion people going to
be using? What services are they going to be using?
Are they going to be coming
online on an iPhone or on a Huawei phone, right?
(38:08):
And I think that's a really important question that we
ought to be asking before we start, you know, taking
for granted the companies and services that we have here. Yes,
you may have your disagreements with them, but I would
venture to guess you have a much bigger disagreement with Xi Jinping.
And so the question is how do we how do
we continue to promote American innovation and a global internet
(38:30):
built on the classically liberal values that undergird American society
versus ceding that ground to potentially nefarious actors.
Speaker 2 (38:38):
That's a really good thing to
debate and question and figure out. Because, like, we have
free speech laws here in America, and we have the
First Amendment to protect that, but someone in Saudi Arabia
or someone in another country doesn't. And if we
give these people, through Starlink, a voice for them to
criticize their leaders, they
(39:01):
don't necessarily themselves even know the repercussions of what they're
doing until it's too late. Like, that's an interesting
thing where if you give people the access, like this
example of the guy in Saudi Arabia that
literally made a video and
then had a shootout with the police. Like that's
something that if he didn't have a phone, maybe things
would be different. I don't know. But the
(39:22):
government saw his video, and they said, you know, take him out.
And I wonder if that would be more widespread.
Like, you look at some of the governments, like
China, that we know are
corrupt and oppressing the people. Okay, if you
gave everyone free access to the internet to say whatever
they wanted to say, how many of them would put
themselves into a box before they realized it was too late?
Speaker 1 (39:44):
Yeah, that's a difficult trade-off, right,
and I would not want to be on
the overly cautious side of that question, because if you
look throughout history, right, I wouldn't say the printing press
has led to more oppression. Printing was very important in
challenging you know, centralized institutions and abuses of power. And
(40:08):
every new type of speech technology certainly has drawbacks, right.
There are no solutions, there are only trade-offs. I
borrow a lot from Thomas Sowell, but.
Speaker 2 (40:17):
Yeah, he's a great, yeah, that's a good
person to borrow from.
Speaker 1 (40:19):
Yeah, so many things that apply to wherever you're
looking throughout the economy, or society. Right, there are no solutions,
There are only trade offs, right, and so where are
we moving on net? And I think folks having greater
access to knowledge and ability to share knowledge and speak
freely is, on net, going to drive society towards
(40:41):
a better future, a more open future, than if we
were to try and restrict those things or only
hoard them ourselves. Right, I think that's not
the preferred solution here. And ultimately the problem is that
regardless of whether you're printing pamphlets or you're on social media,
(41:01):
governments trying to control speech is a problem as old as time.
Speaker 2 (41:05):
And a lot of times it's just giving a
voice to the voiceless, at minimum, is what this could be.
Starlink's incredible. But that reminded me of the Stargate thing
that Trump has done, where they're putting like five hundred billion
or something like that into technology. Would you mind giving
us a taste of that program that Trump has going
(41:26):
right now?
Speaker 1 (41:27):
Yeah, what a really exciting announcement, Especially as a limited
government guy. I was excited to hear that it was
not my money, right, it's private capital. It's five hundred
billion in private capital that is going to be invested
in the United States to produce the energy and data
infrastructure necessary for an AI revolution, and that is really
(41:49):
really important. And the fact that the Trump
administration, President Trump himself, took time within, I
think, the first twenty-four hours of taking office,
not exactly like a free and open schedule, for him
to be a part of that announcement is a great
signal to the tech industry that hey, we're open for
business again, because what we saw under the Biden administration
(42:10):
was a willingness to cooperate with the European Union. I
hate to keep going back to them, but they're the
best example of the most aggressive regulators in the world.
The Biden administration was cooperating with them in
terms of trying to institute regulations on frontier technologies like AI,
and that keeps investment on the sideline, because the
(42:32):
antidote to investment, if you will, is uncertainty. And so
the regulatory uncertainty that was created by the Biden administration.
We're going to regulate you here, we're going to break
you up there, We're going to control what you can
and cannot have on your website. Here. I mean, I
think back to what Marc Andreessen said on a recent podcast,
famous investor in Silicon Valley, Internet pioneer, that the Biden
(42:53):
administration effectively told them, look, we're only going to have
two or three AI companies and we're going to be
controlling them. We're going to regulate them. And that's it
not going to importantly right, that's not going to get
five hundred billion dollars of private investment off the sidelines
to build infrastructure because people are going to be too
busy worried about am I going to get regulated? I
need to hire lawyers, I need to hire lobbyists. So
really really encouraging sign that the Trump administration is, at
(43:18):
least in certain areas, going to say we're going to
let you invest and create and then we can address
regulatory issues as needed instead of trying to predict the
future and regulate the entire space in one fell swoop, right.
Speaker 2 (43:34):
And I think in theory the free market sounds amazing,
But I still think a lot of people are freaked
the crap out by AI, like it just terrifies them.
And I think you could have potentially the most like
conservative small government person in the world that then when
you talk about AI, they're like, we need to have,
you know, that's the one thing where they're actually maybe
(43:55):
like, we need more regulation. What's
your gut feeling on this whole AI thing? Are you
concerned about the potential because even Elon at the AI
summit was like, this thing has the potential to destroy civilizations,
like this is not a joking matter, like, and you
hear that, and you're like, oh well, and then you
hear that he's, like, making Grok, and he's also
(44:18):
doing things with OpenAI, and so like, what's
your gut feeling do you feel that this is something
that could turn to be a big problem for humanity
or is this just another tool that is just going
to be fine.
Speaker 1 (44:32):
Yeah, So I would encourage folks to start using it
because that is the way that you're going to
understand what exactly it is. And I've begun using it
even in my own job. It's incredible how much
more productive it's making me. It's not replacing me. It's
enhancing what I can do in twenty-four or
eight hours.
Speaker 2 (44:51):
And now, that's the argument right now, but
the concern is that ten years down the line it's
going to get so good that you're not needed anymore.
Speaker 1 (45:01):
Yeah, that's again. I don't want to get into the
business of predicting the future, because I've been telling politicians
that you should be out of the business of predicting
the future.
Speaker 2 (45:09):
Right, right. You're in the safe space here,
this is a safe space.
Speaker 1 (45:14):
So instead of trying to predict the future, let's look
to the past a little bit. There's a great website
I would encourage everybody to go to. It's called the
Pessimist Archive. Right, google it, the Pessimist Archive. It's fantastic
because there are examples as old as time that they
have compiled of people being afraid of, say, people reading
too many novels, it's just going to destroy their brains
(45:35):
and make them lazy. And then there's cartoons about
people tangled up in electrical lines. We got to stop
the spread of electricity. It's going to destroy our society.
And so these fears have always existed with frontier technologies,
and I think that every technology is going to have
a drawback. But if we look at another really important
(45:59):
and powerful technology. I'm not one
that believes that AI is going to
turn into Skynet. But look at a technology that did
very much so have the potential to destroy civilization:
nuclear energy and nuclear weapons. We regulated that so much
out of fear that we're not getting the benefit from
(46:20):
it either, and that's a huge, huge issue. So I
think that the more realistic chance of hurting ourselves is
by smothering a technology before really understanding what we can
benefit from that technology. And right now we see societies
all around the world that are regulating themselves, including the
United States, into energy scarcity, and I would encourage folks
(46:44):
to understand that there are no there are no energy
poor rich nations, right. You have to have an abundant
energy supply, and we've really harmed our ability since the
middle of the last century to have that
technology and the best benefits of it. I think it's
important to look and not say there's never any regulation
(47:06):
that is necessary, but making sure that regulation comes with
certainty and is addressed in an incremental way and narrowly
targets specific outcomes instead of trying to regulate the technology
as a whole. Right, because any technology can be misused, right, sure,
I can buy a hammer from Home Depot and go smack
someone in the head with it. That doesn't mean you
ban the hammer, right, that's already a crime.
Speaker 2 (47:28):
And that's an interesting thing that you just said, even
about like the nuclear like the fear of climate change
and the fear of, because, listen, I
am one that leans to, I think the world is
only going to be deteriorating until God comes back. Okay,
so that's my biblical worldview. So that's what I think.
So sure, is the world maybe changing for the negative?
(47:50):
Probably, over time. Sure, we're gonna run out. Do
I think that's the biggest crisis, to where we shouldn't
drill for oil here in the States? No, that's stupid.
But right, like, for fear of climate change, we stopped
certain research when it comes to oil, and
then for fear of nuclear, you know, for fear of
turning America into Chernobyl, for example, we don't even really,
(48:12):
do we even dabble in nuclear anymore? Is that even
a thing that they're researching as an option for us?
Because I know for a while that
was viewed as the future, like having a small nuclear
plant that could power, you know, the
whole country, Like, is that even something they're talking about anymore,
because it seems like everything now is electric.
Speaker 1 (48:30):
Yeah, what we saw happen with nuclear is what I've
seen happening with AI, or
the tech sector broadly. You have folks
like Elizabeth Warren and even Senator Lindsey Graham who have come
out and said, well, we're going to have a specific
new tech and AI regulatory body, right, like a
CFPB or whatever for just technology. Well, okay, we
(48:54):
did that with nuclear power, right, we put in place
the Nuclear Regulatory Commission. And when you put that
in place, they approved very few, if any new nuclear projects.
I mean, that was a place where investment and proposals
to build new nuclear capacity largely went to die. And
that's what I don't want to see with any sort
(49:15):
of new frontier technology. And so again I would actually
encourage folks to look, on AI, to late
last Congress. There was a bipartisan House of Representatives Task
Force report on AI, a couple hundred pages long actually,
but it's incredible about how there was a bipartisan consensus
(49:36):
built that says, look, we're going to regulate AI in
a sectoral and incremental way instead of trying to create
this brand new regulatory agency that just focuses on AI.
It's a lot easier to teach a health regulator about
AI than it is to teach a new agency about
both AI and health. Addressing again, specific incremental harms is
(49:59):
really important. Right. There are certain things that we know
we can do. We can very
easily say, for example, creating a deepfake image of
somebody in a non-consensual situation, that's a very narrow
thing that we can say you're not going to do.
That doesn't smother the entire technology. Instead of trying
to say we're going to rewire the technology through regulation
(50:21):
that it's never going to be able to do that thing,
you go after the specific bad actors and bad uses
of that technology. And that's the approach that was adopted
in that report, and I think that is going to
solve a lot of the problems people are focused on.
And in terms of the other big AI issue which
I think you were getting at, is displacement of folks.
(50:41):
I think there's plenty that we can be doing
at the K-12 level to make sure that people
are better suited for the economy that exists today and
the economy that will exist in the future. I mean,
even today, we have schools that are producing kids to
go and just rack up a bunch of debt and
get a liberal arts degree and then not actually find a fulfilling career.
Speaker 2 (51:03):
And then go work at Starbucks afterwards.
Speaker 1 (51:05):
Right, right. So there is a lot there, and that's
an area where the government has a lot, almost all,
of the power, and so that's an area where
we could be doing a lot better job of teaching resiliency,
teaching diverse skill sets, and teaching folks to
kind of understand market signals about where their highest and
best use is. And then there's certainly room for
(51:30):
the ability to make sure that folks that do get
displaced in narrow industries can retrain and
advance themselves as technology changes. I don't think
we do a very good job of that. And those
are things that are certainly government costs and
will require some regulation. But the trade off there is
that you do have greater innovation and dynamism, which we need.
Speaker 2 (51:54):
Yeah. And colleges suck, dude. Like,
we're getting to the point where I'm
not convinced I'm going to tell my kid to go
to college. Like, we're getting close, because I'm just like,
I don't know. Outside of the social skills that you can
gain and learning to live with roommates and learning to
like those you know, intangible skills, those are great, but
(52:16):
those could be learned in other ways too that I'm
just kind of like, I don't know. There needs
to be a massive, massive
change to college for me to be a supporter of it,
because it's just insanely expensive. Anyone
with a finance degree would be
(52:36):
like, the cost-benefit is not there. It just doesn't
exist unless it's like the certain you know, like if
you're going to be a doctor, you don't really have
a choice, Like that's just that's the route you have
to go, you know. But for a lot of professions,
like I'm in sales for my real job, right, you know,
and I own a media company. I didn't need college
for that. I just needed a mentor, you know.
(52:57):
So I don't know, you got me going. Are
you in or out on
the college thing?
Speaker 1 (53:03):
You know, I don't want to tell people how to
live their lives. I think that's it.
Speaker 2 (53:07):
You can.
Speaker 1 (53:09):
At the same time, I worry that there are things
that we're doing as a society that are not allowing
folks to see that there are a lot of different
potential pathways to economic prosperity and success. You have government
institutions and programs that are essentially incentivizing people to go
down a couple sets of narrow pathways that all involve, Hey,
you've got to get a four year degree in X,
(53:31):
Y and Z. Why aren't people ready to
be members of society at the end of K twelve?
Are you an adult at eighteen or are you not? That?
I think is something. Again, there's no silver bullet here.
I'm not saying that all college education is
a waste of time at all. I mean, it's very
degree-specific. But this idea that,
(53:53):
you know, a four year degree is now a path
to prosperity, I think that's really being challenged, and I'm
worried that we have programs in place that aren't allowing
the market signals to reach folks so they understand that.
Speaker 2 (54:05):
I think that goes all the way down to the
K through twelve level, though. I mean, that's the whole point of getting rid
of the Department of Education. That's the whole thought,
like they're already, you know, training the kids through
K-12 to only have certain types of skill sets,
only certain types of abilities, so that
(54:25):
they have to go to college to sharpen those.
Whereas, like, my dad was
a Christian school principal. I was blessed to go to
a Christian school that the academics were top tier, like
very very very tough to where when I went to college,
I was very prepared. In fact, my
first two years, I was like, you know, I got
a very good high school education. But I had a
(54:45):
bunch of friends that went to public school, and
their tenth, eleventh, and twelfth grade years, it was just
a joke, like they didn't even have to do anything
pretty much. They didn't even take attendance in
one of the public schools close by. Like it's just
like, whatever, you know. And I'm like, if the tenth, eleventh,
and twelfth grade years were treated like almost a college,
(55:06):
how much farther ahead would we be as a society.
And then also teaching them way more practical skills like
budgeting and finance and loans and debt and credit cards
and those things where kids don't know their parents are
in credit card debt. They just think that's the way
it's done. So then they go get their credit card
and they go throw a bunch of clothes on it,
(55:28):
and next thing you know, they have a thirty percent
APR and they're screwed. That's how so
many people's lives start here in America. It starts with debt.
And then if it's not credit card debt, it's student loans.
And so to me, it just stems all the way
down to the K through twelve, Like it's a problem
not just in colleges, it's just the whole education system.
Speaker 1 (55:46):
Yeah. And that actually kind of gets at something that
we're working on, which is, there is all of this
work out there at the political and policy level
to try and regulate technologies to prevent harms, but there's
been very little done to try and teach people how
to not get themselves in those bad situations to begin with. Right,
we've understood that we're not going to stop people from driving,
(56:09):
So what do you see in most schools? Driver's
ed. And what we need to increasingly see is, how
do you live in a world where the Internet is
in everything? How do you keep yourself safe online? What's really
encouraging is that we've seen a couple of states realize
this, and that this is completely effective and
(56:30):
doesn't have any constitutional problems associated with it: teaching Internet
and online safety curriculums in schools. Florida and Virginia are
starting to do this, and I think that's a really
great thing to do that nobody's arguing with. Nobody is
suing to stop that. Yes, we should be teaching people
how to spot a scam artist online right, how to
identify what's probably misinformation or purposeful disinformation. Those are very
(56:55):
narrow things that we can do that would really pay
a lot of dividends and also get people kind of
used to using technology in their everyday life, instead
of having to go back and say, we're going to
regulate this down to a level where we reduce all
of the benefits we get from it as well.
Speaker 2 (57:11):
I mean even just child safety when it comes to
posting videos and pictures. I mean the amount of parents
that just let their kids have a phone and just
let them exist with their phone, and they could be
in chat rooms, and their parents have
no earthly clue. And the amount of time that the
average individual is going to spend on technology is so
much more than they're going to spend on whatever
(57:31):
job they get one day. And so you'd think that
training on how to be safe with technology would be a priority, because it's
going to be an integral part of their life. It
blows my mind, man, how many kids have phones. Like
It's insane to me. I'm like, I saw like a
four year old walking down with an
iPhone the other day. I'm like, that's insane.
And who knows what trouble that could get them into
(57:54):
unless they're trained. So that's a really interesting point. Instead
of just being, like, reactive, being proactive and teaching
kids how to interact with technology. That's a very good point.
Speaker 1 (58:03):
Yeah, well, speaking of technology, I apologize if you heard that.
My smart home smoke detectors just went off. I think
they were doing their monthly test. But you see, all
technology is good, but it has its trade-offs as well.
At least we know that if you're going to have a fire,
(58:23):
you're going to be alerted.
Speaker 2 (58:24):
Well, at least we know they're working. Good job.
Speaker 1 (58:26):
But you know, to your point, I think you're exactly
right that you know, when you log off for the
day from work, you're generally logging on to see what
your friends are doing, and you know, texting people and saying, hey,
where are you at? And so yeah, technology is going
to be part of our lives. The Internet is going
to be part of every part of your life, right,
I mean, most people do their banking online now, right?
Cash is going down.
Speaker 2 (58:46):
Man, I can't remember the last time I saw a
dollar bill. Now you just pay with your phone.
You have Apple Wallet or Google Wallet or whatever.
Speaker 1 (58:56):
Yeah, it's crazy and I worry that there has been
a big push out there to try and say we're
going to make phones, tablets, the Internet itself perfectly safe.
Through policy, you can help, but it's never going to
be one hundred percent. And I worry that without the
component of education, whether that is for the students themselves or
(59:18):
for the parents. Right, you can do plenty
of public service education at the adult level as well.
Nothing is going to be more effective
at protecting a kid online than a parent controlling access to
the device.
Speaker 2 (59:31):
Yeah. Well, policy doesn't fix user error one hundred percent.
Speaker 1 (59:35):
Yeah, couldn't agree more. So it's preventing that
user error, and also not giving the user a false
sense of security by saying, hey, we passed the Internet
Is Safe for Everybody Act, so now you're safe. That's
never going to happen. So it's about empowering users, and
you can do that without again creating a lot of
the constitutional issues, the censorship issues that we talked about earlier.
Speaker 2 (59:58):
Sure. Well, let's do this, because I love this kind
of conversation. This is fun, and it's enjoyable to learn
more, because I don't know much about the policy space
of tech, and so it's been really intriguing to me. Would
you mind telling our listeners about net Choice? What is
net Choice? I know you're the director of policy there. And is
(01:00:19):
there anywhere that people can support you guys, follow you?
If people want to hear more and they're interested, where
can they go?
Speaker 1 (01:00:26):
Yeah, absolutely so net Choice. We are a trade association,
so we do represent several major and mid and small
technology companies. But that said, we are a principles based
trade association. So what drives the policy work that we
do is that we believe in an Internet built for
free expression and free enterprise. Sometimes that puts us at
(01:00:46):
odds with some of the specific things that some of
our members would like to see, but we are, again,
focused on our principles, and ultimately we believe the greatest
benefit for the country and for our members going forward
is, again, that Internet built on free enterprise and free
expression. So you can see what we're
working on at netchoice dot org. That is our website.
(01:01:08):
We're working on lots of different ways to, again, try and
protect free speech online and also prevent regulations that would
stifle innovation and competition going forward.
Speaker 2 (01:01:17):
That's awesome. Well, man, I appreciate it. This was really fun.
I appreciate you coming on and chatting. And I know
I'm sometimes all over the place, and what we talked
about was probably more random than your usual conversations.
Speaker 1 (01:01:30):
But I appreciated the opportunity to talk about some other
things than what I have to talk about every single day.
So I really appreciate it.
Speaker 2 (01:01:36):
Hopefully I didn't get too many Saudi Arabian princes mad
at you today. Hopefully that's not going to happen
from our Saudi conversation.
Speaker 1 (01:01:44):
I wasn't planning on visiting The Line anytime soon. And yeah,
from the pictures I've seen, it's
very much just a ditch in the desert.
Speaker 2 (01:01:53):
Yeah, yeah. So I'll text you after; I'll
send you the YouTube video. This guy went there
and literally just stayed in the desert with the
local Bedouins. And the amount of trucks, they
are working twenty-four seven, no break. They're
making mountains, because of the amount of sand that they're taking
(01:02:15):
away from this little strip. They're building mountains with
the amount of sand they're moving. So they're not
joking about this, and that's what's crazy,
because of what they're putting in. I think the
original quote was like five hundred billion, and now it's up to like
two trillion or something like that. And it's like, well,
(01:02:38):
first of all, that's f-you money. That's what they have.
That's just another level of wealth, where they're
saying, it's fine, go ahead. But yeah, anyway, I'll
send you the documentary. It was insane, though. It
was very well done.
Speaker 1 (01:02:52):
So yeah, I think I've seen a
snippet of that video. It is crazy looking. Again, I'm
excited when people want to build big, cool things, but
I worry when government's at the center of it.
Speaker 2 (01:03:03):
Yeah, yeah. When it's a dictatorship at
the center of that, it makes me a little concerned.
Speaker 1 (01:03:10):
You know.
Speaker 2 (01:03:10):
Well, here, we could just keep
trailing off into that conversation. But I appreciate you coming
on the show. Guys, please support Patrick, support net Choice.
They're doing a lot of good work. If you have
any other questions, I can always have Patrick back on
the show, and I can ask your questions to him
as well. Well, thank you brother. We'll keep talking about
this because I'll tell you one more thing off camera
in a second. But thank you brother for coming on.
(01:03:33):
This is really really fun and I appreciate a lot.
Speaker 1 (01:03:36):
No, this is great. I really appreciate it.
Speaker 2 (01:03:38):
Perfect, man. All right, well, see you. Thanks, everyone. Please
subscribe to the channel, follow, and see you guys on
the next episode.