Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:03):
Hi, everybody, it's me Cinderella Acts. You are listening to
the Fringe Radio Network. I know I was gonna tell them, Hey,
do you.
Speaker 2 (00:14):
Have the app?
Speaker 3 (00:15):
It's the best way to listen to the Fringe Radio Network.
Speaker 4 (00:19):
It's safe and you don't have to log in to
use it, and it doesn't track you or trace you,
and it sounds beautiful.
Speaker 2 (00:27):
I know I was gonna tell him, how do you
get the app?
Speaker 4 (00:31):
Just go to Fringe radionetwork dot com right at the
top of the page.
Speaker 3 (00:37):
I know, slippers, we gotta keep cleaning these chimneys, you know,
(01:00):
we do.
Speaker 5 (01:01):
See.
Speaker 6 (01:01):
The power of digital media is that you can go
viral very quickly, and when you go viral, all of
a sudden, you have loads of people who are now
protecting you or at least have their back or have
your back. So yeah, there is sort of a countervailing
force that's there, which is pretty good. But at the
(01:22):
end of the day, it still means you need to
do something. And it's an awful situation that you've been
put in. And I think the question you're asking is
why me, Why why did this happen to me? And
obviously you want that situation to change. And you need
to have some way to remedy that. So one way
(01:44):
to do that is to go to court. Another is to,
I would say, go viral in a way to get
people to hear what you went through. And of course
people can be and I am very sympathetic to what
I'm hearing.
Speaker 7 (01:59):
Welcome to Business Game Changers. I'm Sarah Westall. I have the
honor of having Stuart Brotman coming to the program. He
has an extensive background in the First Amendment and freedom of expression,
and he's worked under four different presidential administrations, and I'm
just going to read his bio directly to you,
(02:19):
because I think you'll be impressed with it. Stuart Brotman
is America's leading public scholar on free expression and digital
media and a laureate at the Media Institute. He served in four
presidential administrations on a bipartisan basis, which it should be,
and was the first Visiting Professor of Entertainment and Media
Law at Harvard Law School, with concurrent appointments at
Harvard's Berkman Klein Center and MIT's Program in Comparative Media Studies.
(02:43):
An elected member of the American Law Institute, Brotman has
participated in several First Amendment Supreme Court cases and provided
expert counsel to governments, agencies, and major media companies. His
expert commentary has reached over five hundred million readers globally,
along with audiences on major networks such as ABC, NBC, CNN, C-SPAN,
(03:04):
and NPR. And I'm really glad that he's here because
this is the kind of conversation that I want to
have with people when it comes to freedom of expression.
Most of you know that I have been a targeted
person when it comes to being unpersoned and for
being censored. I wrote an article about what it means
(03:25):
to be unpersoned when I was unpersoned literally
all over the internet. My website was taken down, my
Patreon account was taken down. My X account, which was really
Twitter at the time, was taken down the same day
Donald Trump's was, and I never
Speaker 8 (03:39):
Got it back.
Speaker 7 (03:40):
My YouTube account was taken down. I finally got it
back just a couple of weeks ago, which I talk
about in this interview. I'm scared to even know what
to publish, because, I mean, I honestly don't know what
I can talk about
Speaker 2 (03:54):
and can't talk about. So now I
Speaker 7 (03:56):
just publish whatever is the most vanilla thing again, because
I'm scared of five years of persecution for just talking
about issues that are really important to the American people.
And so I was really glad to be able to
have this conversation because I don't believe this is a
partisan issue.
Speaker 8 (04:12):
I think this is an issue that it's the.
Speaker 7 (04:14):
First Amendment for God's sake, It's an issue that affects
all of us, and in my case, I share that
for the first time. What happened to me not only
did YouTube take my channel down, but starting in the summer,
and it probably actually started at the end
of last year, I had impersonation
(04:35):
accounts on YouTube. Actually there was one from four years ago,
but it wasn't very big and there wasn't much on it.
But the impersonation accounts, there were four of them that I identified,
two of which were posting my original work and were
monetized in the YouTube Partner Program. One was posting every
(04:57):
single video that I do and monetizing it, and some
had tens of thousands of views and were part of
the YouTube Partner Program. I tried to bring it to
their attention during the summer and they ignored me with
their customer service, and then I filed DMCA takedown requests.
I sent them to multiple organizations within Google and YouTube,
(05:20):
following their official process. I sent certified letters, and I
was ignored and told that they don't take down entire
channels that impersonate; they need to look at specific videos.
Whole channels don't count. I sent it directly to their
legal organizations, so they should have been able to see
that the entire channel was impersonating me, but that didn't
(05:41):
work either. And then I had to hire an attorney,
which cost quite a bit of money, and they had
to send not one, not two, but three letters
before they even acknowledged that it existed.
So that's the amount of effort I had to go
through to get impersonation channels down. In the process, I
was able to document that not only the impersonation channels
(06:03):
were coming up on the first page of Google results
and YouTube results. I went to Google Gemini and asked
them if those channels were legit, and Google Gemini confirmed
that those channels were legit and had all these reasons
why those impersonation channels were legit, pointing back to my
own website using the proper terminology, and so on and
(06:23):
so forth. So if somebody went to Google Gemini and
asked the same question are these real channels, then they
would believe that they were. I had multiple people deny
wanting to come on my program because they thought I
was too small or too insignificant, because my own work
was not being displayed, only these fake impersonation channels were
(06:44):
being displayed. So I lost a lot of credibility during
that process. But to make matters worse, I haven't been
using Google Search. I use a different search engine. But
during that process of doing my research, I realized that
not only were they coming up on the first page,
my likeness was also being related to porn videos,
(07:07):
and on the first page of Google Search there
were eight references to porn videos, and I am showing
them to you here. There was also another reference to an
obscene video, which is even worse. I'm not even going
to respond or say it.
Speaker 8 (07:23):
Out loud what it is.
Speaker 7 (07:24):
If you're watching this on video, you can read it yourself.
And then there were smear pieces. There was even one
from twenty nineteen that I didn't even know existed that
was on YouTube, on the front page of the Google
Search results. It was from
an obscure channel with not that many views, and the
(07:45):
person who led that organization was murdered, and that was
on the first page of the Google Search results. That's
what I had to look at, and that's what people
were seeing when they searched my name, and so I
did more research on this. Why was this enabled to happen?
I went to Google Gemini and I said, what in
(08:06):
the world is going on here? Google's safe search failed
to protect not only me, but it failed to protect
the users who were using.
Speaker 2 (08:16):
Their search engine.
Speaker 7 (08:17):
So what Google Search does, or safe search does, is
when it comes to porn, it automatically weeds that out
for people because it's not safe for the general public
to see. It's not safe for a twelve-year-old
to search Sarah Westall podcasts and have a bunch
of porn stuff come up. That's not a safe search result,
(08:39):
and so it has been put in place to make
sure that it doesn't happen. So when I saw it
and it came up with my name, something failed with
Google SafeSearch. I asked Google Gemini, how could safe
search fail? At first it said that it's highly
unlikely that SafeSearch would fail.
Speaker 8 (08:57):
It is very well done.
Speaker 7 (08:59):
You know, they got the top engineers in the world.
They know how to do these things, right. Okay, so
then why did it fail? Well, Google Gemini told me
that it failed because of a known vacuum in the
world of cyber hacking, which is true. It's a vacuum
and they exploit it. Hackers exploit it. When somebody is
(09:19):
highly censored like I am. They know that there is
a vacuum in SEO and they can do a targeted
attack against someone like me, and then porn related material
can bypass some of their safeguards. The problem is
that Google, which has some of the top engineers in
the world, knows that's a vulnerability. According to Google Gemini,
(09:44):
they know it's a vulnerability. The fact that it happened
anyway means that they allowed it to happen in my case,
and it's the equivalent of them doing the attack themselves.
That's what Google Gemini told me. So for someone like me,
who was a lead plaintiff in a federal lawsuit at
the Ninth Circuit against Google, who is being highly censored
(10:06):
and then suffered an attack. Equating myself and my likeness
to a porn star is something that rises to the
level of a targeted attack, and at the very minimum,
it means that we need to look and see why
this happened, and we need to protect the general public
and protect others like me from being targeted like this.
(10:27):
So that means we need to look at Google's own
information and figure out what the heck happened. And they
are going to hide behind Section two thirty immunity because
for some reason our government has allowed them to get
away with activity like this under section two thirty, and
I highly doubt that the public wants these kinds of
(10:50):
attacks and these kinds of results to be
Speaker 8 (10:53):
able to be protected.
Speaker 7 (10:55):
It's past time that we look at these issues and we
remedy them, because someone like me should not have to
suffer through an attack like this, and Google needs to
step up and be better, because someone like me or
any other citizen in the country should not have to
be on the front page of a Google search being related
(11:19):
to a porn star. It is highly disparaging for a
journalist or someone like me to be equated to a
porn star. And it has been shown that seventy percent
of people in studies who see material like that will
instantly discredit who I am.
Speaker 8 (11:38):
So that is why we have.
Speaker 7 (11:40):
Decided to open up a lawsuit against Google that is forthcoming,
because I have no other recourse than to make sure
they eliminate the ability for anything like that to ever
happen to me again by fixing the censorship issues that
are surrounding me. Take me off the LILA lists, so
(12:01):
a vacuum no longer exists, and hackers can no longer
look at me and exploit that vacuum that exists because
of their censorship. Okay, with all that being said, this
is the first time I went public with that, and
I hope people listen and take warning, take heed of
this, because it's a very serious, real situation. Not
(12:24):
only, think of me as a canary in the coal mine,
Speaker 8 (12:27):
this happened to me, but this can happen to
Speaker 7 (12:29):
any public figure and anybody who is doing journalism
and is innocent of these kinds of attacks. Okay, that
being said, I want to get into this really important discussion.
Speaker 8 (12:41):
These are the kind of discussions that need to occur
and we need to have a lot more of them.
Speaker 7 (12:46):
And I want to tell you he has a new
book that's out. It's called Free Expression under Fire, and
I will have the link below for you, and let
me read a little bit of a summary of this book,
because I think it's valuable reading for anybody. As America
approaches its two hundred and fiftieth anniversary,
free speech and press freedom face unprecedented threats from campus censorship,
(13:08):
social media content moderation, and government pressure campaigns, from college
students afraid to voice opinions in classes, to journalists facing investigations.
The First Amendment foundations of American democracy are under multiple
threats in ways the founders never imagined. And I got
to tell you this is not a left right issue.
(13:30):
This is an American issue that all.
Speaker 8 (13:32):
Of us need to care about.
Speaker 7 (13:33):
One of the things that he talks about here is
he served in four different presidential administrations, two on each side
of the aisle, and he tries to look at this
in a nonpartisan way, and we need to look at
this in a nonpartisan way because it affects all
of us. Okay, I will have the link below to
where you can find a copy of his book, And
(13:53):
here we go, my conversation with Stuart Brotman.
Speaker 2 (13:58):
Hi, Stuart.
Speaker 5 (13:59):
Welcome to the program. Sarah, it's a pleasure to
be here.
Speaker 7 (14:03):
Well, I'm really glad that we're having this conversation. You
have in depth experience in media and freedom of speech
and government and institutions and all these things, and you
have a new book out about this, on Freedom of
Expression under Fire. I think I said that right. Yeah,
(14:26):
Before we get into this, can you explain
and share what your background is so
people have a good idea of just your extensive experience
in this area.
Speaker 6 (14:38):
Well, I've really tried to combine a variety of different
disciplines over the years. I'm trained in communications and media,
and then I went to law school afterwards. Then I
was in government for a period of time. Then I
was in the private sector for many years, and then
parallel to that, developed an academic career and began to
(15:00):
write and lecture and teach, and I was the first
visiting professor of Entertainment and Media law at Harvard Law School.
Very importantly, in government, I've served on a bipartisan basis
in four presidential administrations, which is relatively rare because typically
people are going on one side or the other. But
(15:21):
I actually have been on both sides, and I think
that's really helped my perspective well.
Speaker 7 (15:27):
And hasn't freedom of expression and freedom of speech been
a tenet
Speaker 2 (15:32):
just across the board for all
Speaker 7 (15:35):
of America, pretty much? I mean, there are always the extremes
that want to shut people down, but hasn't it been
a real anchor of our country?
Speaker 5 (15:45):
Well?
Speaker 6 (15:45):
Hopefully, obviously we have the First Amendment of our Constitution.
Speaker 7 (15:50):
I mean, even behind the scenes it has been go
ahead well.
Speaker 6 (15:53):
And part of it is the difference between what we
have in the First Amendment and sort of the cultural
aspect of free expression. And you know, one of the
aspects that I try to underline in the book is
there is a difference between what the First Amendment says
and what we do in our day to day lives,
which are not necessarily controlled by the First Amendment. And
(16:16):
so a lot of this is really cultural norms, how
we feel about free expression, how we're willing to express ourselves,
whether or not we're going to self censor, whether or
not we're going to try to censor other people. That
often is not a First Amendment issue because typically the
government is not involved. And I think one of the
(16:38):
misperceptions, or least understood aspects, about the First Amendment
is that it's a barrier between the government and people
of the United States, not people relating to each other.
And so there's a lot of obviously action now happening
in terms of government trying to confront free expression, but
(17:01):
there's an awful lot of people trying to control the
expression of others.
Speaker 7 (17:06):
Well, the people controlling the expression of others, I guess
you can't really get rid of that.
Speaker 2 (17:11):
It's politics.
Speaker 7 (17:12):
The problem is when you have platforms that have power
like we've never seen before, coupled with government trying to
coerce them to do what they want, and behind the scenes,
we don't even know really how much they've been coercing
unless we can get into full exploration, right, I mean,
we really need to get.
Speaker 2 (17:31):
To the bottom of this.
Speaker 7 (17:32):
But what it's doing is it's chilling freedom of expression
to the point where we were fearful just to do
normal communications. Where we used to have a sense of
you know, we were empowered to speak. Now I don't
even know, Like when I go on I got my
channel back, Like I don't even know what I can
(17:53):
and cannot talk about.
Speaker 5 (17:55):
Well, you're absolutely right.
Speaker 6 (17:56):
I mean, we have what is called the chilling effect,
and part of the chilling effect are these platforms and
private entities which are chilling. And some of it, as
you suggest, is government cooperating or encouraging these private platforms
to interfere with the expression of others. I mean, that's
(18:18):
often called the raised eyebrow. So there's not a formal
government role. But when government begins to suggest things to
private platforms, private platforms often will listen to that because
they understand government has a lot of power and they
don't want to face that power, particularly when you're talking
(18:39):
about concentration of power. Obviously, we have something called the
antitrust laws, and you could figure out whether or
not you want to move ahead with an antitrust suit.
We just had a major one obviously with Meta, which
was resolved in Meta's favor. But the idea of having
to potentially face an antitrust suit is something that most
(19:02):
of these platforms are not relishing, and so therefore some
of them may be more willing to essentially say to
the government what do you have in mind? And the
government may tell them, and obviously we have examples we've
seen that. So I think what you're saying, is not
just your perception. I think it's the reality as well.
Speaker 2 (19:23):
Well.
Speaker 7 (19:24):
Nancy Pelosi infamously, while she was Speaker of the House,
so probably the most powerful Speaker of the House we've
ever had, at least the most effective. Even if you
don't like her, few people disagree with that, as far
as getting everybody to do what she wanted them to do.
She came out and publicly said, if you don't
(19:44):
do what we want, and I'm paraphrasing, your Section two thirty
immunity is going to be at risk.
Speaker 2 (19:50):
She said that publicly.
Speaker 7 (19:51):
I mean, what the heck is going on behind the
scenes if she's willing to say that publicly. And that's
the last thing these big tech companies, monopolies, I would say,
want to lose because that gives them so much power.
Speaker 6 (20:05):
Well, there's a great example of the raised eyebrow, which
is essentially not filling in the blanks in terms of
what might actually happen, but suggesting something, and by suggesting it,
whether it's done in private or in public. Again, these
platforms are going to pay attention to this, and you're
absolutely right. One of the major aspects here is section
(20:27):
two thirty and section two thirty essentially right now gives
these platforms immunity broad immunity, not absolute immunity, but broad immunity.
Speaker 5 (20:37):
From legal liability.
Speaker 6 (20:39):
And so if you begin to say maybe we should
reconsider that, that essentially will resonate with some of these platforms,
and they may say, well, what do you have in
mind for us to be able to essentially avoid that fate?
Speaker 7 (20:55):
Well, and I don't know how they're getting away with
not having to do discovery and really see what's happening,
because from the Twitter Files, we know this was happening. Even Meta's
CEO Zuckerberg talked to Joe Rogan about what was happening
behind the scenes. How is Google able to get around,
at this point, having to do disclosure and discovery in court?
Speaker 6 (21:19):
Well, right, exactly, there you go. And so that's why
many of these companies really don't want to face this
idea of an anti trust suit or some other type
of suit, because guess what, then you're going to have
very very broad discovery. Obviously that discovery takes place under oath,
(21:39):
and so it's a little different than just having to
testify before Congress, which doesn't necessarily have the same ramifications
as testifying under oath in court. So absolutely right, discovery
is something all of these platforms want to avoid, and
one way to avoid that is to essentially avoid a
(22:02):
lawsuit from taking place. Again, the raised eyebrow is in place,
that's right.
Speaker 7 (22:08):
Well, in my particular case, I had impersonation channels. Like,
Google, for example, wouldn't let
my channel up on YouTube for five years, right? But
meanwhile there were impersonation channels that were up there, two
of which were monetized on YouTube, one of which was
publishing every single one of my videos. And I couldn't
(22:31):
get them to take it down. And I went through
multiple channels. It took three letters from my attorney before
it actually got them to pay attention.
Speaker 2 (22:37):
And I went through the
Speaker 7 (22:38):
deep DMCA process and everything, and these channels were up
on the first page of Google results and YouTube results,
and none of my own legit stuff was even on
the first page, just that, and
they were monetized. How can they get away with stuff
like this? I mean, at what point is something
(23:01):
not going to crack and say, this is just
beyond, and this is so harmful for people? I mean,
how do they get away with this? How does
Section two thirty protect them from something so harmful like that?
Speaker 6 (23:15):
Well, Section two thirty is a broad provision in the law.
Obviously we now see that Section two thirty is being
tested in the courts, and we have had a couple
of Supreme Court decisions which have narrowed that. So over time,
Section two thirty may be narrowed, but fundamentally I think
(23:35):
it would have to basically be revised pretty substantially. So
one of the other issues of Section two thirty is,
let's imagine the world where there was no Section two thirty,
where these companies did have liability. Would that necessarily change things?
I'm not sure, for a couple of reasons. One
(23:57):
is because a number of states have essentially caps on
tort damages, and so you wouldn't be able to get
sort of the level of damages that's going to hurt
a company like Google or Meta, So that's one. A
second is this area of class action lawsuits, so they're
(24:17):
relatively difficult to be able to certify a class. You
may have been hurt in this situation, I'm sure you were,
but you are an individual, so you would have to
find hundreds or thousands of people just like you in
order to be part of a class action. That
would be the only real way that you would
(24:38):
be able to mount the type of case against one
of these companies. So even if there is lesser liability,
I'm not sure that that's necessarily going to change the
behavior of these companies. Remember, they have enormous, enormous resources,
and I'm sure you have experience just having to pay
(25:00):
some legal fees and going through the takedown process,
that it's expensive, and for individuals to bear that cost
is really something they're not going to be doing. And
you have the big companies essentially who have armies of
not only lawyers, but other experts who essentially can be
(25:21):
part of the defense. So I think the Section two
thirty area is important, but I don't think it's necessarily
going to create an entirely new environment.
Speaker 7 (25:33):
Well, the concern that I have is the courts not
taking into account very serious things, that they actually have
control of the courts, especially in California and the Ninth Circuit,
Speaker 2 (25:45):
where the
Speaker 7 (25:48):
person can't get anywhere because they control the court. Legitimately,
the judges are controlled and won't even look at it
properly because there's too much power and influence over these
you know, because they have so much power and influence.
Speaker 2 (26:02):
I'm almost at the point where I don't even know
Speaker 7 (26:04):
If there's justice in this country anymore.
Speaker 6 (26:07):
Well, I have a little more faith there. In fact,
I have quite a good deal of faith in our
rule of law and our justice system. I do understand
why people have less faith, And certainly we see all
of the public survey data showing that trust in the courts is
basically declining, up to and including the Supreme Court. So I think
(26:29):
the trust in the judicial process is really weakening, and
obviously that undermines our whole system of rule of law.
But particularly in the area of the First Amendment, the
courts have been actually quite good, and really across whoever
appointed those judges, and across circuits and including the Supreme Court.
(26:53):
So we still have I think, a relatively good judicial
process for free expression in this country. Of course, it
takes a while to get a case up. It's expensive,
as you suggest, you have to go through an appellate process.
You may or may not reach the Supreme Court. So
(27:14):
there are all sorts of obstacles. But the system was
created to have a number of obstacles. You don't want
everyone essentially going to court immediately. Well, but at the
end of the day, I think the rule of law
is a good system.
Speaker 7 (27:29):
Well, is it, if a person can't fight? If
you come with a case that is legitimate, and
somebody with a ton of money always wins, how can
that be a justice system? Why is it
that we need that? Why wouldn't a case stand for itself?
(27:50):
Why can somebody with so much money and power overwhelm
a case that should stand for itself?
Speaker 2 (27:57):
You shouldn't have to have
Speaker 7 (27:59):
all that money and power in order to win
a case. If you have a good case and you
bring it forward in a justice system that's operating, it
should still stand on its own.
Speaker 2 (28:10):
You shouldn't have to have so.
Speaker 7 (28:11):
Much money and power if a justice system works properly.
Speaker 5 (28:17):
No, I agree with that.
Speaker 6 (28:19):
There are procedures within the judicial process. We have something
called summary judgment. And of course, you know, about
ninety-five or ninety-six percent of cases never go to trial,
so they settle. So part of our judicial process today
is not necessarily going to court. It's essentially saying I
(28:41):
have a dispute and then trying to work that dispute
out with whoever you're suing, and so disputes do get
worked out. That's why we have settlements. Some of those
could be relatively easy, some of those could be a
lot more difficult. But most of what goes on
behind the scenes in what we call the judicial process
(29:03):
takes place outside of the courtroom; it takes place really in
private negotiations. And a lot of courts now are
Speaker 5 (29:13):
really private courts.
Speaker 6 (29:14):
So of course there are a number of parties who
realize it's going to be very expensive and time
consuming to go to court. So we now have a
number of different, I would call them companies or firms,
which do what is called
Speaker 5 (29:28):
alternative dispute resolution.
Speaker 6 (29:31):
That means that you can essentially bring the case to
them and the two parties agree when that case is resolved,
we will not go to court. We're essentially going to
respect whatever that decision is going to be. Of course,
we have the TV version of that, which is Judge
Judy or People's Court, but if you listen to the
(29:51):
end of the show, they will always say that everyone
on that show has agreed that whatever has resolved on
the show will be the final decision, you can't go
to court after that, So there are ways to sort
of streamline the process. But I do agree that the
judicial process can be very very expensive, very time consuming,
(30:14):
and very much controlled by money. But I think broadly,
more broadly, the political process is just like that as well.
Speaker 7 (30:23):
But that's where the anger stems from, is because the
people feel the system is broken for them, and it
essentially is if you can't, like, for example, in my situation,
we are putting a lawsuit forward against Google because of
some of these things. I have it on their letterhead that
they reinstated my channel and that it does not violate their policies,
(30:47):
and I'm back up there. Then why did you take
me down this whole time, if I never violated, you know,
if I didn't violate your policies? But the other thing
is, on the first page of Google search results,
not only did they have all the impersonation channels,
they had my name coming up with porn videos and stuff,
like eight instances of porn videos next to my
(31:08):
name. I have, you know, seen hit pieces,
all this stuff, none of my own work, just all
of that, and then, you know, other random crap.
Speaker 2 (31:19):
How and I searched into that more.
Speaker 7 (31:23):
The reason that happened is because I'm so shadow banned
and so censored that it creates a vacuum. And hackers
know this is true. Software engineers know it's true. You
know that's my background, and so they exploit that. And
Google's engineers know, because it's so well known in
that industry, that that's a loophole that they
(31:45):
can exploit. And then Google safe Search failed.
Speaker 2 (31:49):
Which almost never fails.
Speaker 7 (31:51):
You don't have someone like me come up next to
porn videos without their tools failing. And their tools wouldn't
fail unless they didn't do their job. I mean, there's
some serious questions there of how a journalist like
me comes up on the first page next to a
bunch of porn stuff, right, that's very damaging.
Speaker 2 (32:13):
How how can they get away with doing things like this?
Speaker 7 (32:16):
I mean, those things need to be fixed, and
I shouldn't have to mount my
case at all. But, you know, that shouldn't be something I
have to bear to get a large, multi-trillion-dollar
corporation to stop doing things like this.
Speaker 5 (32:35):
I wish I had an easy answer for you, you know.
Speaker 7 (32:37):
I'm sharing something big, like, holy crap, how could you
be in that situation?
Speaker 2 (32:42):
But that happened to me.
Speaker 7 (32:43):
I got screenshots, I had people all around the country
looking at it for me, like, how the heck is
this coming up.
Speaker 6 (32:50):
I'm very sympathetic to what you've described here, and obviously
there is a way to try to move ahead to
get that resolved. It sounds like you either filed the
case or will file the case. The other aspect that
you have is you can be out on other platforms.
You could be telling people about.
Speaker 7 (33:10):
What happened.
Speaker 6 (33:13):
The process, yeah, which you are in the process of doing. And so essentially, you know,
creating a wave of public opinion sometimes can you know,
create essentially a reaction that you might not be able
to get in court. I mean we saw that, you know,
recently with what we call the Jimmy Kimmel incident,
(33:35):
which is where you had a wave of public opinion
after potentially he was going to be taken off the air,
and that wave of public opinion was also backed
by about three million people canceling Disney Plus and Hulu. Essentially,
then Disney said, well, wait a second, we hear you,
(33:58):
and we're not going to be taking him off the air.
So there was no lawsuit there. There might have
been a lawsuit, but in that case, public opinion prevailed.
Speaker 5 (34:08):
And you know, we do see the.
Speaker 6 (34:10):
Power of digital media is that you can go viral
very quickly, and when you go viral, all of a sudden,
you have loads of people who are now protecting you
or at least have their back or have your back.
So yeah, there is sort of a countervailing force that's there,
(34:31):
which is pretty good, but at the end of the day,
it still means you need to do something. And it's
an awful situation that you've been put in, and I
think the question you're asking is why me, Why why
did this happen to me? And obviously you want that
situation to change and you need to have some way
(34:53):
to remedy that.
Speaker 5 (34:54):
So one way to do that is to go to court.
Speaker 6 (34:56):
Another is to, I would say, go viral in a
way to get people to hear what you went through.
And of course people can be and I am very
sympathetic to what I'm hearing.
Speaker 7 (35:10):
Well, and nobody, no one should have to go through
what I just saw. Nobody, and that we can't have
a situation where somebody can be attacked so badly for
whatever political reasons you want to attack somebody for. We
can't have a civilized society and be in a situation
(35:31):
like that.
Speaker 2 (35:32):
I mean, it's just not acceptable.
Speaker 6 (35:35):
So I think one of the brighter aspects of AI,
and we could talk about that, is that I think
AI can potentially be a countervailing force because people will
be able to access different AI platforms and ask questions
about you, say, is she really involved
Speaker 2 (35:58):
in porn?
Speaker 6 (36:02):
Exactly right. And of course then AI is going to
be giving a little different answer than what people are
seeing or hearing, and that may then lead people
to question exactly what's happening. So I know a lot
of people talk about the downsides of AI, but I
think the more that we have AI systems now, of
(36:25):
course one of the dangers there is who is going
to control those systems?
Speaker 7 (36:30):
Well that's exactly right. Well okay, well let me tell you.
I asked Google Gemini, are those impersonation channels real?
Speaker 2 (36:38):
And they said yes, those are real.
Speaker 7 (36:40):
And it showed all these reasons. So if somebody went to
Google Gemini and asked if the impersonation channels of my
name were real on YouTube, it would have told them yes,
and I had to correct it and say no.
Speaker 2 (36:51):
But it did tell me that Google
Speaker 7 (36:54):
SafeSearch, when it came to the porn stuff, Google
SafeSearch failed. You were a leading plaintiff in a
federal lawsuit, so they should know who you are, and
they know, this is what it said, they know that
this is a vacuum and that they're suppressing you. And
even if they didn't do it themselves,
(37:17):
that's essentially like them pulling the trigger and attacking
you themselves.
Speaker 2 (37:22):
That's what Google Gemini told
Speaker 5 (37:24):
me. Right, right. So there we are.
Speaker 6 (37:27):
There's a great example where AI can tell you a
little more about the story than what people would just
yeah think about.
Speaker 7 (37:35):
Yeah, well, yeah, exactly, and so people can look at it.
But it's pretty incredible that we're in this situation.
You're right, though, if the people who control AI are
feeding it. There's an example where my sister, I was
having her look at some science, you know, with Tesla,
(37:55):
who is, you know, one of the most famous scientists
in history, and I had her look at some of his
stuff, and I wanted her to. And in our country,
ChatGPT and other AIs said that something was conspiracy theory
and not true about Tesla, and I said, I don't
think so. And then I had her go to Yandex,
(38:16):
which I know has its own issues, but it's
a Russian-based search engine, and it not only said,
this was about scalar waves, it said that
Tesla showed that scalar waves were real, and then it
brought up a whole list of university studies all around
the world, whereas our AIs shut that down and said
(38:38):
nothing exists. That's an example of where information is blocked.
Speaker 5 (38:44):
Yeah, I think yeah.
Speaker 6 (38:46):
And again, something in the book there is, you know,
we need to have a lot
Speaker 5 (38:50):
more literacy in this area.
Speaker 6 (38:52):
So yeah, AI is an area that you know, it's
not going to be one particular platform, and you're really
going to have to become relatively sophisticated in terms of
what you're asking and how you're correcting. I mean, you're
basically in a dialogue with a system as opposed to
(39:14):
being in a dialogue with a person. But if you were
talking to a person, you wouldn't necessarily just listen to
a person and accept everything that they're saying, especially if
they were saying something that wasn't correct or you thought differently,
and so you should exercise that free expression
with AI in the same way that you exercise it
(39:37):
as we're talking right now. But we're not at that
point now because I don't think as a society we're
really sensitive or trained in how to use AI. We
don't have that. What's going to be really key? What's
the key that is right?
Speaker 5 (39:53):
It's going to be really interesting.
Speaker 6 (39:54):
Obviously we talk about digital natives, and so these were
people who grew up with the internet.
Speaker 5 (40:01):
You know, they have.
Speaker 6 (40:02):
Tremendous digital skills. This is probably you know, Generation Z.
But now there is that generation of kids who
are in elementary school who will be the AI generation.
They're going to be kids who started out with AI.
And as they grow then I think we're going to
have a lot more literacy. There's going to be a
(40:24):
lot more sophistication in that area.
Speaker 7 (40:27):
Well, I think that's really important. And I want to
say something. When I was talking to ChatGPT,
I talked to it and I pushed back.
Speaker 2 (40:33):
I'll push back
Speaker 7 (40:34):
hard and say, just because you don't have access
to the information doesn't mean it's not true. And amazingly,
it comes back and says, oh, well, these are all
the things I do have. And so, you are right,
why didn't you bring that up the first time?
Speaker 2 (40:46):
So you are right that you have to do that.
Speaker 7 (40:48):
But I had a pediatric neurologist come on my show.
He also has his PhD in history, and he has
all these things. But he was talking about how children's brains
are actually changing, the structure of their brain is changing,
based on what we value in society, and that using
these AI systems is actually changing their brains.
Speaker 2 (41:08):
I really highly recommend.
Speaker 7 (41:09):
Everybody watching that video that I did with him, because
what you're saying is actually more important than people even realize.
Speaker 2 (41:18):
I think it's going.
Speaker 7 (41:19):
To change us for centuries to come if we're not
careful on what it is that children are growing up
with and how much it can change their brain development
based on their developing skill sets or not.
Speaker 5 (41:36):
Absolutely.
Speaker 6 (41:36):
And you know, I experienced this in the classroom, and
there are a lot of studies, but clearly when students
come into the classroom and they're bringing their laptops and
they're just online during the discussion and lecture, you know,
it changes the entire learning process. But I think a
(41:58):
lot of studies have shown that if you get students
to take notes by hand as opposed to doing it
on a laptop, and if you just get them to
close their devices during class, you'll find a lot
better retention, a lot better performance. Obviously, that goes back
(42:18):
into how your brain is operating, and I think you're right,
and it's going to essentially change the neurobiology of how
we think and act. And we're not focusing on that
right now.
Speaker 7 (42:34):
Well, yeah, we're totally just meandering and stumbling into a
future that we have no idea what we're stumbling into.
Speaker 6 (42:42):
Yeah, I think we are beginning to at least recognize
part of that.
Speaker 5 (42:46):
You probably know that a number
Speaker 6 (42:48):
of school systems now are banning cell phones in
the classroom, or even banning bringing cell phones into the school.
And so that's a recognition that maybe we shouldn't
have that sort of full time digital connection because it
is affecting how we learn, how we think, how we interact.
(43:11):
So one of the chapters I have in my book
is I forgot what it's titled, but essentially I recommend
that everyone think about two questions to ask in this
whole world of information. So the first question is how
do we know that? And so typically we're surrounded by
(43:35):
all sorts of information, but we never ask the question,
how do we know that? Where's that information coming from?
And then, of course the second question is what does
it mean? And so even though we have a lot
of information coming at us, we sometimes don't really process
what the meaning.
Speaker 5 (43:55):
Of that information is.
Speaker 6 (43:56):
I think if we can begin to retrain our mind
and retrain those who are educating into those two questions,
we'll probably be able to advance this to another level,
to a better level.
Speaker 2 (44:10):
Well, And I think so.
Speaker 7 (44:11):
I think that banning the cell phones is just one issue,
and that's another reason why people need to watch
that conversation, that it's coming too fast and that
by the time we catch up to what it is
that we need, there'll be whole generations that have already
gone through this and it'll have such a profound change.
Maybe we can have some pockets that aren't affected so
(44:33):
that we have a chance, because cell phone usage is
one thing, but whole universities don't know how to
use ChatGPT or some of these
other AIs. So these professors, for example, my daughter
wrote a paper and she wrote it herself
Speaker 2 (44:51):
and she gets a B.
Speaker 7 (44:52):
She has ChatGPT write it, and she's out of
college now, but she just said, okay, screw it,
Speaker 2 (44:56):
I'm gonna have ChatGPT write it, and she gets an A.
Speaker 7 (44:59):
Right, that kind of reward system is not going to work. Right.
Speaker 2 (45:05):
You can't teach.
Speaker 7 (45:06):
students that they need to use ChatGPT
to that whole degree, like they just
did to my daughter and to a whole generation of college students.
What they need to do is what we talked
about with Doctor Jack McCollum. What we talked about is
that you use ChatGPT as a tool to write
(45:28):
papers, but it takes professors a heck of
a lot more effort and they have to put this
effort in. And what he does, and this is how
I use ChatGPT, is he makes them
write the paper first and learn, because you
have to learn what the concepts are.
Speaker 2 (45:47):
You might use ChatGPT
Speaker 7 (45:48):
or another AI to come up with and help you develop the concepts,
but you have to write it, and then you have
ChatGPT fix it for you or clean it up,
and then afterwards you clean that up again, and then
you document what you learned in the process. It's a
lot of work for a professor, but it's how you start.
And I don't know if that's the best way to
use a tool or not, but it's a heck of
(46:09):
a lot better than what they're doing now, and it
actually makes you a better writer.
Speaker 6 (46:14):
Yeah, no, I agree, And I know you've been in
the classroom as well teaching students. But I think part
of it again goes back to professors and what their
incentives are, and it takes a lot of time to
go through that whole process, and so many of the
professors now have no idea about what's happening with AI.
(46:37):
So not only do we need to sort of look
at the generation that's coming up with AI, but we
need to train the people who are educating those students now.
And so it's a large, sort of multigenerational
problem. But I think
the way you begin to attack that problem is
(46:59):
probably with some baby steps. And you know, as I said,
one of the baby steps might be let's not bring
the digital device into the classroom, and maybe we will.
And again, many school districts and some states are now
doing this. So I think probably within the next few
(47:19):
years we're going to see some real impact in terms
of certain areas not having this exposure versus other areas
that continue to allow it. And I think there's going
to be data and studies which will show the difference.
I would suspect the difference is going to be pretty dramatic,
(47:39):
which is that you know, if you don't have the devices,
you're probably going to be doing just fine. I would
suspect that you grew up in an era where you
didn't have those devices. I grew up in the era
where I didn't have them. And guess what, we're doing
just fine?
Speaker 7 (47:55):
Right. Well, I'm from the last generation, Generation X, that
didn't grow up with it at all, and so you
get to see the difference, and it's pretty profound. The
issue is, and I don't want
to say, you know, that I want
to have another emergency, but I don't know if we
have the time. I mean, we're going to have to
write off entire generations of not getting proper brain
(48:19):
development if we don't wake up to some of this stuff.
Speaker 6 (48:24):
I agree, it's scary. It's very scary. And clearly some
of the guests you've just talked about, and some of
your ability to discuss this with others I think is
great just in terms of being able to expose some
of these areas.
Speaker 2 (48:40):
Yeah.
Speaker 7 (48:41):
Well, Eric Meder, who I just think is wonderful. He's young,
he's in his early to mid twenties now, and he's putting a
whole program together for kids on how to use technology.
And I got him hooked up with Moms for
Liberty, and they're working together on what to put into
schools, programs I mean. And that's the kind of thing we need.
(49:02):
But we need a bunch of Erics. We need a
bunch of Moms for Liberty groups all over the country. We
need a bunch of these conversations so that people can
understand what it is that we're actually looking at, because,
like you were saying, a lot of these university professors,
a lot of these teachers in schools just have no
idea what they're dealing with, and perhaps they shouldn't even
(49:23):
really, for the most part, use these tools, maybe until
they're in junior high or high school, and
they just really develop their basic skills first. Not that they shouldn't.
I mean, I have a systems and computer science background, I
have a big tech background, so I don't want to
say don't. And my kids are all in that. Don't
not learn this stuff, because it's so important. It's running
(49:46):
the entire world. But you're not developing your brain if
you use it.
Speaker 2 (49:52):
From day one.
Speaker 7 (49:53):
You develop your brain first and then figure out how
to use these tools.
Speaker 6 (50:00):
Absolutely, and you know we do that just in physical development.
There are certain things that we're not having kids do in
elementary school, because their bodies are still growing and
they're not capable of doing that, and so you wait
until another period of time. And it doesn't seem that
there should be any difference between physical development and cognitive development.
Speaker 7 (50:23):
Right, You're absolutely right, And so I think these are
just profound questions that we don't have all the answers
to, but we better start asking them, and we
better start looking at it. And we've got to stop
walking into this blindly. And so I'm just thankful that
you are looking at it seriously. You have a new
book that's looking at freedom of speech is a big
component of this. But this is you know, when you
(50:45):
look at the investment in AI, we've never seen anything
like it. We've never seen companies the size of what we
have in the world. I mean, this is something, I
keep using the word profound, but it is so
different from anything that we've experienced in human history,
our history that we know of, unless there are other
(51:05):
cycles we don't know about. We don't necessarily know how to navigate
forward with this kind of environment. And something like
sixty-seven or seventy percent of all new
venture capital has gone into AI over the last few years.
That is absolutely massive, and we just don't know how
(51:27):
to navigate this.
Speaker 6 (51:29):
So I'll tell you something that's scary. So there used
to be something in Congress called the Office of Technology
Assessment, OTA. So this was the group in Congress. Essentially,
it was the office that was there to study the
development of new technologies and to advise Congress on potential
(51:49):
policies that might relate to them. Sounds really good, right?
It cost not even one hundred million dollars
a year to support that office, which, as you know,
within the scope of the federal budget, is not even
a rounding error. Such a minuscule amount of money. Congress
abolished that office thirty years ago. Thirty years ago, right,
(52:15):
this is the major, major question. And so we don't
have within Congress today any capability for senators or representatives
to be able to go to an office and say,
tell us about some of the questions that you're talking about,
tell us about the research that's going on with cognitive
(52:38):
development and AI. So we essentially have put handcuffs on
our legislators. The legislators did it themselves by abolishing the office.
So I have spoken out for many years asking the
question you just asked, which is why don't we have
an Office of Technology Assessment in Congress. I mean, we
(52:59):
have all the others. We have a budget office in Congress,
so when Congress is putting together a budget, they
go to the Congressional Budget Office and ask them to
run the numbers. We don't have any group in Congress
that you could go to to say tell us about
these issues.
Speaker 7 (53:18):
Well, that's so important, because I've been saying, and
I've given conference presentations on this, that
the structure of the world has changed, and the governing
body that we have that's governing the structure of the
world or at least our country.
Speaker 2 (53:33):
Is not equipped.
Speaker 7 (53:34):
They don't have the background, they don't have the skill
sets or whatever, and this, what you're talking about, is
a way to help them bridge that gap. Now,
I think that we need to get a heck of
a lot more scientists and engineers and other people in, but
I do think that smart people can learn it. But
what you're talking about here is so critical to deal
(53:56):
with what I've been shouting from the rooftops about.
Speaker 6 (54:00):
And I think most people never heard of this office,
but if you go back to the history, you'll see
that there was so much good work that was done,
and there was a sense that people in Congress at
least had a place to ask questions and could get
some guidance in this area. But right now we're flying blind.
Speaker 5 (54:21):
In terms of some of these things.
Speaker 6 (54:23):
And as you say, the risk of flying blind can
be catastrophic.
Speaker 7 (54:28):
Well, it needs to be a bipartisan, truth oriented organization
as much as humanly possible, because right now we're just
being affected by the lobbyists, with the people who have
the most amount of money trying to get their stuff through,
versus actually informing the people who are making decisions right.
Speaker 2 (54:46):
And that's just not okay.
Speaker 7 (54:48):
I had Mike Kerason on just recently, and he was talking
about how he put together a deal for semiconductors
at Motorola, and back in the late nineties, they were
the largest company in the world for semiconductors, and he
was selling a division of Motorola, and it ended up
(55:09):
that they're no longer in it; the entire industry's gone
for Motorola.
Speaker 2 (55:13):
It's another company or whatever.
Speaker 7 (55:14):
But he was selling a division, and he got
an offer for one point six billion, and what happened
is that they had to sell it to a Chinese
company for one point three billion, three hundred million dollars less.
And what happened, the Chinese, he found out later and
he explained the story in that video, in
that interview, is that the Chinese infiltrated the board, and
(55:37):
they have decade plans, one hundred year plans, so they
slowly infiltrated the board, took over the board, and then
sold it to a Chinese company for three hundred million
dollars less than the other offer. Nobody would reasonably do that
if you were stewards of the company. But because it
was less than five percent of the overall revenue of
the company, it didn't trigger an audit or any suspicion,
(56:00):
and so they could do it. And so they just
strategically kept it below the five percent so that nobody
would raise eyebrows, and then the Chinese took over that technology. Now,
if you had a board like what you were talking about,
and people are actually astute and aligned with where the
future of our country and needs are from an intelligence standpoint,
we wouldn't have things like that happening, or it would
(56:22):
be a lot less likely because our politicians and the
people who are actually stewards of this country would be
more informed.
Speaker 5 (56:31):
Well.
Speaker 6 (56:31):
And if you go back and look at the history
of the Office of Technology Assessment, it was truly bipartisan.
It was very objective. They brought in the best experts.
I mean, we still have elements of that. We have
the National Academy of Engineering and National Academy of Medicine.
So we do have the academies out there, but they
(56:52):
don't have the direct line to Congress. Congress needs to
have its own group that it could say to, here are
the questions we have, can you give us information, because
we are developing policies in this area. And when you
speak about China or other governments, guess what. Every other
government has an enormous capability to essentially assess technologies and
(57:19):
to develop policies based on those assessments.
Speaker 5 (57:22):
And we don't.
Speaker 6 (57:23):
We are the United States of America, and we don't
have it because we're not willing to spend one hundred
million dollars a year on this.
Speaker 5 (57:33):
Just crazy.
Speaker 7 (57:34):
Well, not only do they know what's important,
I mean, they know that the largest companies in
the world are in this area. They know it's the
future of the world. And so their policies, all the
way down to their children, are to develop people with
these skill sets. Their children, the number one thing they
want to be is scientists and engineers and astronauts. Our
(57:57):
kids want to be influencers, not to be a nerdy
scientist or any of that. Right, we are
putting ourselves in such a dumb future situation by not
paying attention to these things.
Speaker 5 (58:13):
Well, we're not future oriented. We live in the moment.
Speaker 6 (58:17):
And obviously the moment could be gratifying or it could
be disappointing. But whatever happens happens while we're doing it.
And I think we've lost the sense of looking towards
the future. I mean, so much of what built this
country was the idea of creating for the next generation.
And I don't think we have our
(58:37):
eye on the ball at this point in terms of,
you know, what's going to happen with the next generation
and the generation beyond that. And most of the
other countries we're competing with have their eye on the ball.
And not only are they looking to the future, they're
basically navigating the pathway to the
Speaker 7 (58:57):
future, and they're doing it smarter. Now, freedom of
speech does come into that, because if the most powerful
organizations decide that they're going to suppress anything that challenges
their cash cow, and that is suppressed, then entrepreneurs aren't either,
because we're not in the best, we are not in
the best entrepreneurial environment anymore either, where young people don't
(59:18):
feel that they can just bootstrap their way to making
things happen.
things happen.
Speaker 2 (59:23):
It's not like it used to be.
Speaker 7 (59:25):
And so, with that, with freedom of speech being
so curtailed, people don't have access, like I was
telling you with the Tesla example, they don't have access to
the science that they should have. They don't have access
to these things, because if it puts any kind of
pressure on a cash cow of some of the most
powerful organizations, they shut it down. That shuts down entrepreneurial spirit.
(59:49):
That shuts down the information flow to create the next
generation of things.
Speaker 2 (59:55):
How do we deal with that?
Speaker 6 (59:59):
Well, again, I'm not suggesting AI is a panacea. But at least
AI will give us more capability to get more information.
Not only more information, but to be able to have a
dialogue with that information, to do what you do with
ChatGPT, which is to say, is this right?
Speaker 5 (01:00:18):
Is there better information out there? I just heard there
might be something.
Speaker 6 (01:00:22):
So what I like about AI, and I think is promising,
is, again, you need a level of sophistication in how to
use it. But once you have that sophistication, it's
a tool. You can begin asking those interesting questions and
challenging them, getting better information. Again, we need to
(01:00:44):
think about AI as a resource that.
Speaker 5 (01:00:47):
We're utilizing, just like when we talk to people.
Speaker 6 (01:00:50):
But I think we're looking at it sort of as
this distant technology that we need to figure out how
to control.
Speaker 5 (01:00:58):
It right now.
Speaker 7 (01:00:59):
Okay. So this obviously is just the beginning of a
long conversation that needs to occur, and actions that need to occur.
Where can people find your book and learn more about
your work? Maybe you can even help people get engaged
in what it is that we so desperately need: citizens
starting to step up and play some of these roles.
Speaker 6 (01:01:22):
Well, Sarah, I really enjoyed this. I think this is
such a great conversation. I'm really happy that we've had it.
So my book is called Free Expression under Fire, and
it's defending free speech and free press across the political spectrum,
and so part of it is we need to start
creating some common ground around the area of free expression,
(01:01:45):
so that it's not just considered something that's another element of
the partisan divide. The book is now on Amazon in
pre-order. It's going to be in ebook form,
which releases in December, but again, you can go on
Amazon and order it now. It will also be in
trade paperback, and that's going to release in January. And
(01:02:09):
one great part of the publishing process is I've really
encouraged this to be a low priced book because I
think this is something that virtually everyone should have or
talk about. I've always said that free expression should be
something that's talked about at the kitchen table. So we
have a number of topics. We talk about the economy,
(01:02:32):
talk about health. I think every family needs to sit
down and talk about some of the issues we've talked
about today, and the idea about free expression, particularly parents
and grandparents who are sending their kids off to college
and elsewhere, to begin to say, here are some things
that might happen when you go on campus, and sort
(01:02:56):
of get them prepared for some of the issues that
they're going to face in free expression. So those are
a couple of quick ways to get the book. I
have a lot of different themes in the book. I've
suggested a few of those, but ultimately, so much of it,
as we talked about, is a cultural change. We need
(01:03:17):
to sort of grab a hold of these issues as
a culture, as a society, because these are not going
to be legislated, They're not going to be resolved in courts.
We need to essentially do a little bit of self
help here.
Speaker 2 (01:03:33):
That's right.
Speaker 7 (01:03:34):
Thank you so much for joining the program. I really
appreciated the conversation.
Speaker 5 (01:03:38):
I did too, Sarah. It's just been terrific.
Speaker 1 (01:03:40):
Thanks. Hi, it's me Cinderella Acts. You are listening to
(01:04:03):
the Fringe Radio Network. I know I was gonna tell them, Hey,
do you have the app? It's the best way to
listen to the
Speaker 2 (01:04:12):
Fringe radio network.
Speaker 4 (01:04:14):
It's safe and you don't have to log in to
use it, and it doesn't track you or trace you,
and it sounds beautiful.
Speaker 3 (01:04:23):
I know I was gonna tell him, how do you
get the app?
Speaker 4 (01:04:26):
Just go to fringeradionetwork dot com right at the top
of the page.
Speaker 3 (01:04:32):
I know, slippers, we gotta keep cleaning these chimneys.