Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:21):
It's a commodity being disguised. No, not disguised, it's a commodity being presented as a service. It serves us, the people using it for social media, but we also generate a byproduct of data, which they sell, and that data is a commodity to other companies in
(00:46):
the private sector. So yeah, that's an interesting way to look at it. I guess there's an example. Oh, I'm reading this book right now called Digital Minimalism. The author is, uh, what's his name, Cal Newport. It's okay, it's
(01:10):
a good book. He's not an avid social media user, but it's got some interesting insight. It does come from a place of a lot of privilege, I think, because a lot of the suggestions he makes, as far as what we should do to curb our social media addiction, are not really realistic,
(01:40):
and it takes into account very Western social structures. Anyway, he made the analogy between the shift in traditional news reporting, newspapers and print, and the shift toward tabloid sensationalism
(02:02):
that, like, the Hearst family pushed, or... I forget who it was, but I think it was the editor in chief of the Sun who basically said: we stopped looking at the readers as the customers, and we started looking at the
(02:25):
advertisers as our customers. And yeah, the same thing happened in news media. Social media is predicated on it. Like, I think that's what it was built on from day one. Yeah, it's an interesting topic, because I always use the analogy between, like, disruptive technologies and our
(02:51):
oversight, which tends to come, you know, twenty or thirty years down the line, like the advent of the car. Any time a new mass technology is introduced to the population, it takes a couple decades to realize, like, oh, we need seat belts, or oh, we need traffic signs, or oh, we need traffic laws and government oversight and protocol and procedure on how those things function, to
(03:15):
stop the amount of death that happens, right. And that's why they say laws are written in blood. You know, they're reactive by nature, unless you can do a super effective, you know, risk analysis of certain laws and have the amazing foresight to see how they look twenty years down the line.
(03:36):
Like, by nature, they are reactive, because we're like, oh, that was an unintended consequence, we need to fix that, and then boom, you have a new law. And whether or not, you know, folks know it at the time, morality is not retroactive and legality is not retroactive. You don't know the unintended consequences of technology or actions,
(04:00):
you know, twenty or thirty years down the line. But you know, it's interesting, because it kind of encompasses even this podcast. Like, we use digital mediums to distribute and talk about ideas, so it all exists inside of that realm. And I just think: as a society, will we ever figure out how to use it effectively while doing the least amount of harm? There's people
(04:27):
talking about it, but I don't know. I have friends in the IT industry who think social media is the absolute worst thing that has ever happened, and they don't think there will ever be a balance, and enough oversight from the government, to be able to stop the harm that it causes
(04:49):
to society. Well, there's a level of social responsibility that has to come with using services like this, the services I mentioned, that comes with using something like social media responsibly. I think it affects younger people the hardest. Like, there are laws in place, like you cannot
(05:10):
have a social media account until you're at least fourteen years old. But it's like, man, do we need to raise that age limit? And also, how effective are the policing tools for that age limit, for young people using Instagram or Facebook? You know, most of the harmful content you see on Facebook, or on social media sans Facebook, that you see on social
(05:32):
media, it's targeted at these young people. I mean, that's how you get, like, children with extremist views and these really negative, derogatory views. You know, young boys are being subjected to negative information and negative stereotypes and negative beliefs about women, and about men in general. Like, the whole alpha-beta male
(05:56):
bullshit is... I mean, how much of that is predicated on exposure to social media? The incel alpha-beta male pipeline. You're dealing with two different consequences: the psychological consequences upon a certain population, the demographics primarily being youth, and then you're also dealing with, like, the intentional manipulation
(06:21):
of disinformation. Because, yeah, there's actual, like, psychological ramifications, and there are studies that have been done showing that the increase in usage of social media among users between the ages of, like, ten and fifteen has increased anxiety and depression. I don't know if this is, like, noted, but I
(06:41):
would even say it could even increase teen suicide. And then, on the other end of that, you're dealing with actual concerted efforts from folks, organizations, agendas that are using social media as a form of disinformation. So it's twofold. Like, you have those unintended consequences of, like,
(07:06):
the psychological trauma that it's causing the youth, and then you also have, you know, the bad-actor organizations that are using it, like the Cambridge Analytica scandal in twenty sixteen. Yeah, the Russian disinformation campaigns. I mean, there's congressional hearings that have been going on for the last couple years,
(07:27):
primarily with, like, Zuckerberg. They did one with TikTok. I think we haven't figured it out because it's so new. Like, we didn't know that the information industry, the private information industry, was going to be a multi-billion dollar industry, and that's why there's so much power around the lobbying
(07:56):
against allowing any kind of government regulation to come into place. But, like you said, it always takes something to break the camel's back, or something drastic to happen, for folks to act. But I feel like we have the evidence; there's just the task of fighting a multi-billion dollar corporation to create regulation. Like, you were
(08:18):
mentioning the incel alpha-beta male pipeline, the guy... like, a lot of the mass shooters that have... That's exactly where I was going to go with that. Yeah, they released their manifestos on Twitter or did a live stream. Like, I think the one in Christchurch, New Zealand, was live streamed on Facebook.
(08:39):
Like, that's wild. I think the question shouldn't be, do we demonize them, or do we study what happened. You know, it's very easy to look at someone who commits such a crime and be like, they're a monster. But I think that's dismissive. I think that's really dismissive of our responsibility as humans,
(09:03):
right, to understand how things affect us psychologically. Like, oh, he was a monster, he did that. No, no, no: he was a human being who was far down a rabbit hole into believing these dangerous things, and did not have the psychological tools to regulate their emotions, or whatever. Like,
(09:24):
I guess what I'm trying to say is, it's so easy to look at these atrocious things that happen and go, oh, they're a monster, or, you know, that's a fundamentally evil person. But it's like, no, they weren't born that way. Something happened to them. And what we're discovering is, a lot of that negative... the negative views, their beliefs, their...
(09:46):
you know, their insecurities that get turned into actions, a lot of that can probably be rooted in social media and the things they're exposed to. When you have people who are online and are creating content and are saying these negative things about women, about young men, or whatever... I
(10:07):
mean, I don't want to name any of these people, because I don't want to expand a platform for them in any way. You know, they say bad press is good... all press is good press, right? It's not that word. Movers and shakers and the... Schman Drew Bait. There you go, Schman Drew Bait. Yeah, there you go. Or Fjordan Jeterson, you know, people like them. They impose these stereotypes,
(10:33):
these expectations, on what it is to, you know, be a man. And it's such a nuanced topic. You can't sum it up in some video. You can't define what it is to be a man based on how many women you're sleeping with. And the alpha-beta discussion... it just doesn't exist, really. You know, that's
(10:56):
not a thing in human biology. It's not even a thing, really, in wolf biology, which is where they like to go: the alpha male of the pack. That's not a thing. It's been debunked for years, for decades. That's not been a thing, but we hold on to it, because it's easy to compartmentalize our insecurities and our viewpoints of other people. For some reason, I guess it's just harder to communicate to kids the message that
(11:20):
people are different, and that we look different, and that we think differently, and we value things differently. No, we have to define ourselves, apparently, by how many people we sleep with, by what type of beer you drink, by what TV shows you watch, by who you vote for. And it's all in an attempt to make it easier. But then you have
(11:43):
those irresponsible people, the ones we named, who want to profit off of that. They monetize it. The biology that they should be looking at, which is glaringly there, is, like, the us-versus-them in-group. That's kind of what we're in. It's like, we like to scapegoat things, the unknown or
(12:03):
things that we don't understand, and the easy explanation is to say, you know, the root of all your problems comes from X, Y, Z, or comes from this people group. I mean, that's been used just because it's rooted in our biological hardware to be
(12:24):
dubious of, like, the outgroup, because we don't know if they're carrying disease, we don't know what their intentions are. So, using that... in the sense that we haven't really figured out, with the immediacy of social media and how quickly information is divulged, how to really have a vetting process for whether or not
(12:46):
something is factual. But they're playing on, like, our base biological instincts to have fear, because that's where it is, you know: have fear of whatever it may be. And there's a lot of money being made off of, like, just our monkey brains. No, yeah, I mean, dude, it is our monkey brains. It's our need to be entertained and our need to fit in.
(13:09):
And like, you know, you always say everything is content. You know, whatever you make, it's content; get that out there. And, like, with the podcast specifically, I do want to increase numbers. I want people to listen to the podcast. But it's more important to me that I'm making something than that I have a million people watching or listening or seeing what I'm doing. You know, I get more
(13:31):
enjoyment out of creating, having these conversations and chatting, and just making something and putting it out into the world, than I care about the returns on that investment, you know what I mean? That's a really important point, because we tend to always talk about the negatives of social media and the vapidness of social media, and you just mentioned something very important. When you utilize
(13:52):
a tool like social media, and utilize it in a way that it's jelling with your desires, or the outcomes that you want for your life, you can create things with those platforms, with those technologies, that are worthwhile and that are, like, important for you as a human being, and that other people
(14:16):
will get value out of. And it's like, we tend to always talk about the negative ills of social media, but there is this... for example, you know, being able to utilize those things in the context of just one song, in the context of being a singer-songwriter or producer, composer: you're utilizing these tools, connecting with folks. Like, you know,
(14:39):
I just met a band from Seattle called Shark Legs on TikTok. I just talked to a band from Texas within this last week. And you bounce these ideas around, and I think, within the medium, using the interface to increase the art, to push the art forward, that's the beautiful thing of it. But
(15:01):
with that comes the double-edged sword, and all the baggage that comes with the negativity of it. And then also keeping in mind, like, okay, have I been on Instagram and TikTok too long? It's very tough to gauge for me, because it's part of my job. I have to be on those things all day long. But yeah, governments see the need for it. It's an equity issue too, as far as
(15:22):
like... you know, digital literacy varies greatly, but for the most part, you know, folks have access to a cell phone, and sometimes it's the first time that some folks ever have an interaction, a positive interaction, with the big bad, quote unquote, government, or the police, or, you know, these
(15:46):
kind of abstract entities. Most folks that, you know, live in society are able to kind of have a forum with everything from, like, municipal government and state government to the federal level, like: you can tweet at the White House. Right? When in history has that ever been possible, that you can get a message in front of, you know, their press
(16:08):
secretary, or, you know, whoever's running their social media? As artists, creators, musicians, how do we leverage social media to create community and to push the art forward, without also bringing along all the negative baggage that comes with social media? And that's a fine line... that's a
(16:32):
very fine line to walk, for me. Perfect example: there's a difference between inspiration and jealousy. And I'm sure other people can relate to this. You see an artist, you see a guitarist, for example. I play guitar. I see a guitarist on TikTok or Instagram or Facebook, and I'm like, oh, that guy's badass, or that girl shreds, I want to be like that.
(16:52):
And there's that inspiration piece. But then there's also that fine line of, like... where you need to kind of keep your child self in check and be like, all right, you know, don't be jealous. If you're going to use that, use it to push your own personal craft forward. And that's a conversation about, like, you know, just the negative thinking processes that we have around social media. Well, I
(17:18):
think the important point on that is that intent matters, with what you're doing with your social media. Like, you do social media for a local government, right? But the intent of using social media for the government is to connect people with their community, to inform people of events occurring in their neighborhood, you know, so they can go out and be a part of this
(17:40):
community, and not sit around on your computer all day, right? Or be aware. It's to inform the public, basically, and get the public involved. It's kind of like, you know how people get annoyed when you see, like, advertisements for attorneys just all over the place? They're actually performing a civic duty by advertising their services, because, you know, a lot of people
(18:03):
may not know that, you know, you can retain an attorney for something. Or, you know, when you see, like, a class action lawsuit advertisement: some people don't know they've been wronged. It's to let them know, like, hey, if there's been something done wrong to you, there are services out there. There's people out there whose job it is to help you, you know what I mean? So it's the service, it's the intent. And I think the same thing can be said for, like, creating
(18:27):
something and putting something out there. And you'll start to recognize the patterns, too. Like, the more you scroll through social media, like when I scroll through reels or something, you can tell what's there just to get you to waste your time, versus someone who's sharing a craft, or someone who's there to legitimately entertain you rather than just piss you off. Like those videos where people make
(18:49):
those ridiculous plates of food, where it's just like, they're just wasting, like, hundreds of dollars' worth of food for a video. Like, why? That's negative social media to me. That's unnecessary. You're wasting resources, just... it's rage baiting. Isn't that the term? It's rage baiting: you're baiting people to be pissed off, and that's your content. So I think intent is seriously
(19:15):
what matters when it comes to this type of discussion, and the more wholesome the intent, the better the material. And you know, it's not all a waste of time, I don't think. Not all of it. A lot of it's informative. We like to be entertained. You mentioned the monkey brain. But I think, to me, that's where it matters.
(19:36):
So at least with the podcast, my intent is just to make stuff and share it, and whoever watches, watches. That's a great point. I also think about the reach. Like, your intent may be, you know, to provide resources, services, or just, you know, public knowledge, which is great. But these platforms, and this is back to, like, the negative side... the algorithms of
(20:00):
these platforms want you to rage bait. They want you to argue like the talking heads do on, you know, the nightly news, and that's the most engaged content. And that's on us. I even thought of going as far as, like, creating civics policy classes to inform the general public, because the average
(20:22):
user of social media doesn't know all the, like, litigation that's actually happening around social media, and, like, people talking in court, in the highest courts of the land, about the intent versus the impact of social media. So I honestly think that we need to get to a point where we're informing the
(20:45):
general public the same way that we do, you know, if someone wants... if you want to, uh, you know, go drive a car. There, you need to be informed about the laws and the regulations, and, like, what the good and the bad is of any kind of interface that you use. You know, like, if you're going to go swimming at the ocean and there's a riptide, they'll put a sign out that says,
(21:08):
hey, there have been reported riptides here. I think it's the duty, the civic duty, of us, not just as the government, but just as, like, a society. You know, we haven't figured out the proper usage of this stuff yet. Yeah, you make a good point. Like, what sort of disclaimer can we... should there be a disclaimer added to social
(21:30):
media, for the sake of the young people using it, you know, or the easily susceptible? Like, you know: warning, using this platform may construe ideas that may or may not be accurate. You know, that type of thing. Or... it's such a... where
(21:51):
is that ethical duty drawn by the communities? And how do you regulate that sort of thing? You know, where I work, we have things like... we have an obstacle course, and the obstacle course is to be used by, you know, members of the program who use it. But we can't just stop people from using it. We can't be policing that obstacle course all around,
(22:12):
so there's some social responsibility that exists. So what we do, the workaround, is you slap a sign on there, and that sign says: hey, obstacle course must be used with an instructor present. But no one's out there policing. What that does is, that, you know, saves us. That protects us from liability if someone gets hurt. Hey, we've informed
(22:36):
you: you can't use this without this; you need to know this is dangerous; if you're going to use it, you can't sue us if there's no one out here watching it, you know what I mean? So, in one sense, is there a way to motivate these companies to have these sorts of disclaimers, A, to protect them from liability in some sense, but also, B, to, like, help limit them, to help when they police
(23:00):
their platform? Or will it have the opposite effect? We put that liability up there: you chose to go on and use the service, guess what, we're not policing it, you're gonna see hate speech, you're going to see all sorts of negative stuff on there. And I think that's likely what would happen over time. You put that disclaimer up, people are going to
(23:21):
use it, and they're going to see some shit they don't like, and you know what, that can be some very harmful material. So maybe the disclaimer not being there is important, because it forces the social media platforms to police the comments, to look at things and do their best to keep it at bay. At the end of the day, they're also private industries, and the moderation lies within their own individual policies. So yeah, so, like, how
(23:47):
far should the laws go when it comes to regulating? How do you protect the younger people using it, or just the general population? There's a good example that I was thinking of when you were mentioning the usage of that obstacle course. We have liquor and alcohol laws nationally in the United States. Yeah, you have to be twenty-one years old to
(24:11):
purchase alcohol. The law, and the way that policy plays out, is: if I go into a 7-Eleven, I'm a thirty-six-year-old man, and I buy beer, they do their due diligence and they adhere to the law by checking my ID and making sure that I'm not visibly intoxicated. Right? The second I pay for that beer, it's mine. And if I leave
(24:33):
the 7-Eleven and I give that beer to a sixteen-year-old kid that asked me to buy beer, they are no longer liable for the outcome. I, the user, am. So, correct, that's almost kind of what we're at. We're reconciling that with January sixth. Again, perfect example: what is the liability of the platform itself? And the... like, I mean,
(25:00):
in the Facebook papers around January sixth, there's damning evidence of them pushing that narrative and allowing things to go on that created the situation on January sixth. But then there's also individual prosecution, from the FBI,
(25:22):
of the folks that perpetrated that thing. So it's, like, hand in hand. So are you gonna slap a fine on 7-Eleven because they sold to me, or are you gonna prosecute the person that actually broke the law, which is giving the beer to an underaged kid? Yeah, you know, you're right, though. It'd be one
(25:47):
thing if the sixteen-year-old kid used a fake ID and the cashier failed to spot it. Like, there's criminal implications for the kid with the fake ID, but even more so for the guy who sold it. You know, it's like, you didn't check it, you didn't catch it. And it's like, okay, but the fake ID was super convincing, you know, that type of stuff. You're right, like, it is...
(26:11):
is it the responsibility of the user? But at the same time, like, we refer to that... we use the term hate speech, but here's the kicker, Leo: hate speech is protected speech. Yeah, it's not illegal, exactly, and there's no definition for hate speech. Right, it's contextual, based on the person interpreting it. It's intentional too, because... first of all, how
(26:36):
do you define what hate speech is? You know what I mean? Exactly. You're right, like you said, it is subjective. But then who's going to be enforcing that, if you make that a law? There's no guarantee that the person who's now enforcing the hate speech laws is going to agree with your definitions, you know what I mean? Like, it has to be subjective. It cannot be an objective law, unless
(27:00):
it's something that causes physical harm, because physical harm is measurable. Right. Assault, we can talk about; harassment, you know, that's something we can measure. But, like, putting a swastika on the side of a building: that's graffiti. Can they prosecute for the hate speech? Maybe not. Maybe it
(27:22):
depends on who's interpreting it, right. It depends on the... sorry, I understand it. It depends on the level of outrage and harm, real harm, that it does. I don't mean to say that psychological distress is not real harm, but real is a measurable harm. Yeah, that's a funny thing to think about too, because as, like, technology
(27:44):
progresses, and there's abilities to actually measure the amount of, like, psychological distress on, like, populations, I wonder if that will ever make its way into court, to be like: you can measure the amount of harm that it caused this community, uh, in Flint, Michigan, or, you know... because we ran neuropathy
(28:07):
tests on the population, and they are all suffering from X, Y, Z. Yeah. And that's the thing: like, some dude standing in the middle of a crowd and shouting the N word, right, or shouting that at someone... like, you can prosecute that person for disturbing the peace, right, if they're out
(28:29):
shouting it. They don't necessarily have to be shouting the N word. You can be shouting, like, I don't know, roller blades. You can shout whatever you want. But if you're shouting loud enough, dude, you can be fined for disturbing the peace. It isn't what you're shouting, it's that you're shouting loudly. But if it's something like the N word and you're causing distress, however the judge, if charges are pressed against
(28:53):
you by whomever, however the judge chooses to interpret what you're saying... I mean, that can add weight to what you're doing, but the charge is still disturbing the peace, you know what I mean. There's a maximum sentence for different types of crimes. What you don't have on social media is the volume. You don't have someone out there shouting. And, you know,
(29:18):
shouting the N word in public is different than making a post, because you have a responsibility as a user: you have the ability to block that person. You can report that person, and, as a private institution, as a private company, they can block that person. They can put them in, you know, social media jail. That's something they can do within their rights,
(29:38):
because a private company can have a code of conduct. No shirt, no shoes, no service, you know what I mean. Like, if you're going to use disruptive speech, you don't have to use our platform. If you're not going to come here and play nice per our community standards, you can't be a part of our community. Yeah,
(29:59):
I think that example of, like, the usage of the N word, right... that's the low-hanging fruit. That's the illustrative example in most of the conversations that I've heard around social media policy. But it's also like, how
(30:19):
often do you see that? And I'm... as a moderator on social media, you don't. That's not the kind of hate speech... that's not the, uh, explicit kind of hate speech, that you've seen in the past, that has risen to actual... More so, a perfect example is
(30:41):
the manifesto of the San Antonio shooter, which listed, uh, several episodes, or listed specifically an episode, from Ben Shapiro talking about, like, the invasion of America by, like, you know, the Latinx or Hispanic population. That's the kind of, like, real-world implication of... but then again, like, that
(31:03):
draws out... there's actual harm. But yeah, the hate speech that we don't allow... hate speech is very subjective. You and I may not comprehend, you know, a hate speech dog whistle, because we're not part of the queer community, but someone that's part of the queer community can. Or, for example, like, I may find something offensive as a Latino,
(31:26):
but you, or someone else, wouldn't see that, because they're not a part of that community. So policing hate speech is not... from, like, a First Amendment perspective, there's not really any legal precedent that would hold that up. But from a private organization perspective, with policy, the
(31:49):
moderation is in the court of those companies. But, like, that's where I think the regulation of the law comes into place. Like, as a society, we can demand stricter or more stringent policy procedures from those private entities through
(32:10):
law. Yeah, we can maybe... finding a way to incentivize these companies to have a more wholesome approach. But I mean, then you've got to measure: how far is too far? How far are you going to go? But at the same time, we have to understand, or... we've
(32:30):
observed that, you know, like, Truth Social, it's not super popular, as far as I understand. Like, yeah, you have people using it, but it's never gonna be at the levels of Facebook or Instagram or even TikTok, you know what I mean. It's gonna stay very niche, and it's gonna be
(32:50):
policed in their own way. So then you have to measure, like, okay, what sort of, you know, public harm can something like Truth Social do? And then we got our answer with January sixth. Yeah, you know, January sixth, where it's like, oh, it was used to coordinate a coup, an attempted coup, in the United States. And it's like...
(33:14):
it's funny to me, because the libertarian argument is like, oh, that's government overreach, and they shouldn't have control, they're taking away your freedoms. Like, back to the whole... the reason we have regulation is because people's houses were falling on top of each other, people were getting electrocuted, cars were killing an exorbitant amount of people. Like, there was poison in
(33:37):
our food, there was lead in our paint. The reason these things came into being was not because the government wants to take away your freedoms. It's because there was, like, a societal outcry of people being like, do something, because people are dying. It's in the nature of protecting the public
(33:58):
good. I mean, that's why we have a government, and that's why the government's powers are broad but not specific at the federal level, you know. And it's like, but how far does that go? Who's going to regulate what? Who's going to regulate this? You know, when you make a post for your municipal government, are your posts adhering to the local laws of your government? Or, because you're
(34:21):
using a platform that exists across the country, are you using more broad... not that you're, you know... I don't know what specific laws exist in your community. How specific can we get? Yeah, I mean... yeah, traffic laws vary from state to state, even, but there's, like, general... Yeah, no, actually, traffic laws are a great example. Speed
(34:45):
limits, roundabouts, helmet versus no helmet on a motorcycle. You know, where's the standard? I think the real answer, though, Leo, is, like, it's where the companies are based. So, like, if the headquarters for Meta is in California, right, then Meta has to adhere to the state of California's laws, as well as the United States', right.
(35:07):
But what if Meta changes its headquarters to London? Then it's like, okay, do they still have to adhere to the laws of the United States? And I think the free market dictates: yeah, you want to do business over here, you've gotta follow our laws here too, you know what I mean. And that's why companies like, like, Disney or whatever, that's why they'll,
(35:30):
censor certain parts of their movies before they release them in
a Middle Eastern country or in China, because they're like, well, we
want to adhere to those laws so that we can get that money. So
it's a sort of legal pandering, I guess, is how you can look
at it. And we also have public decency laws as well,
(35:51):
you know what I mean. That's why you can't show a topless woman on
Facebook, but a topless man is okay. You know, like, there are public
decency laws that exist, as many laws do
in this country. So yeah, there's a limit
to what they can adhere to, but then there are those gray areas like hate speech,
things that cause psychological distress, or really just things targeted at children. I think that's
(36:16):
the other thing too: how can you limit things that are targeted
at children? Yeah, YouTube got into a lot of trouble for that,
regulating those videos. The algorithm was sending them creepy, weird
stuff. That's a really fine line to walk too, because
free speech is the musician's, the artist's, best friend. Right? You have
(36:42):
the ability, the right, to say what you want to say and go wherever your inspiration
takes you. It kind of comes together with what we're talking
about; there's a limit too. But then, you know,
this is the whole problem with corporate entities being seen as individuals, because of
(37:07):
LLC law. Like, with corporations, nobody's liable. It's a
person, right? Like, they have the same rights at the table as you
or me, which is bananas, but it's not a person. It's
an entity, a singular entity, yeah, which gives them
(37:28):
or a party, uh, the same First Amendment rights as Joe Schmo who
wants to write a protest song. So that's a weird one, because as
much as we value the freedom of speech in order to create and
(37:49):
communicate ideas, do we allow that same volume,
those same rights, to a corporation? Yeah, there's definitely an ethical concern behind
giving them to corporations versus people, especially when it comes to regulating certain things. You
(38:12):
see that in the housing market with like Airbnb, where you can just buy a
bunch of houses in a community and then they have to limit it. But
then you have to limit it like, okay, a company,
an entity, a party cannot purchase more than two properties in a community.
You know, and if you want that standard, it has
to apply to people too. And in reality, that's really going to affect
(38:35):
like super wealthy people, for the most part. You know what
I mean? There are probably people out there whose whole job, whose whole life,
is being property managers. They purchase property, they rent it out, they
do provide housing. Say what you want about landlords; it is true, the
rates at which they do it and how they maintain it go into the quality
of the landlord. You know what I mean? I'm not coming out as
(38:58):
super pro-landlord. I'm just saying there are lines. But with music
too. Like, when you work for a record label, if you're going to
be a songwriter and you're writing music that's going to be owned by that label,
how much of your intellectual property rights are you giving
up? There are basic rights that you can retain, but you
(39:22):
can also sign those away in a contract in exchange for a paycheck. And
that's the decision musicians have to make too, if that's the route they want
to go. And damn, Leo, my lunch is like over, just
as we're getting into the meat of it. That's good. Yeah, this
is good, and I want to pick up there because I have so much
to say, and I'm sure some of this is things we've already said before. And
(39:44):
the topic I want to talk to you about next time, on a mini episode we
do, is the Andy Warhol Foundation
versus Goldsmith case. I want to talk about transformative use and copyright. I
think if we had another hour we eventually would have gotten there. But next
(40:05):
time, I want to talk about the expectations of being a musician in today's
day and age. Can you make a living? Can you realistically pay your
bills, raise a family, own a house by being a musician?
Spoiler: I would say no. I'm gonna have to agree with you.
(40:30):
And I don't think that's necessarily a bad thing.
I don't think it's necessarily a good thing. But I think there's a
certain level of reality that creatives have to be okay with when it comes to
music and that being a muse, you know what I mean. And
(40:51):
whether or not it's a full-time gig, yeah, and whether or not you continue
to participate in it, and like, that's a realization. I think
that's great. That's Part Two, baby, freedom of speech being Part One, and we're
yeah, that sounds good. No, I definitely think, like, re-evaluating
your relationship with music and its importance... like, you will continue to be
a creator and compose and write regardless of whether you're able to have
(41:15):
a sustainable life primarily just off of music. I don't, right? I don't
know anyone... And I know a lot of folks in the music industry. Someone
just doing music as their main gig? I don't know any...
actually, that's a lie. I know maybe two or three people, but they're
in the production music world, and they own their own companies, and
(41:38):
they do not write for fun, and you know, there's a certain level
of selling your soul a bit that comes with it, but it's doable,
just not in the way you think. Yeah, so yeah, let's talk
about this next time, right? Yeah, this is great. I appreciate talks all