Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
S1 (00:00):
The statements expressed in the following program are those of
the speaker. They do not necessarily represent the views and
opinions of the sponsor, the host, and/or Olas Media. Olas Media.
You're listening to the Lawyer in Blue Jeans podcast.
S2 (00:18):
Welcome, everyone, to the Lawyer in Blue Jeans Podcast. My name
is Justin Isaac, and this is our very first episode
of the podcast. Today I am joined by James Hess,
Christina Garcia, and Adam Schneider, and we're going to be
talking about the law, wacky cases, anything that's really relevant
to the law. We're going to be breaking down some
of the crazy things that we see in the law.
(00:39):
So we're here for your requests as well. If you
see something in the news, or if you want us
to talk about why your neighbor's tree is infringing on
your property and who's at fault, let us know.
Email me at Justin@lawyerinbluejeans.com and we'll
be happy to talk about it on our show.
So let's go around and have everyone kind of introduce
(01:00):
themselves and give a little bit of background. So, James,
let's start with you. Hi, everybody. I'm James Hess. I'm
an estate planning attorney with the Lawyer in Blue Jeans,
born and raised here in San Diego. Lived here my
whole life. Go Padres. I have a background in family law;
that's what I worked in before becoming an estate planning attorney.
(01:23):
And in law school, I focused on sports and entertainment law.
So have a little bit of a background in that.
All right.
S3 (01:30):
Christina? Yeah, Christina Garcia. I am a San Diego native
as well and a graduate of California Western School of Law.
I am a recent bar passer, but I've been doing
estate planning, trust administration, and probate for about seven years now.
S2 (01:48):
And you've been in the office for a couple of years now too,
right? Yeah, two or three years now. Three, yeah.
Even before you passed the bar. So
we were very happy when you did, because we all
know the struggle of passing the bar, what that entails,
studying for it, and all that stuff.
All right, Adam, what about you?
S4 (02:07):
Hello, everyone. My name is Adam Schneider. I unfortunately cannot
claim San Diego; I was born outside of Pittsburgh, Pennsylvania. Go, Pirates.
Totally just kidding about that. My previous background,
there's a lot. I was a criminal defense practitioner
and did a bunch of different areas of law. My
(02:28):
mentor's advice was to do everything and figure out all the
stuff you don't like, and then pick the one you
want to practice forever. Since then, I've been doing estate planning, wills,
trusts, and probate, and I absolutely love it. And that's
a little bit about me. Thank you.
S1 (02:47):
Stick with us. There's more to come after this quick break.
Welcome back to the Lawyer in Blue Jeans Podcast.
S2 (02:59):
So we're going to kick it off today with our
most recent legal news: Lady Gaga. I think a couple
of years ago, her dogs were kidnapped, dognapped, and
she didn't know what was going on. She wanted to
put out a reward. So she put out a statement
that said she would give $500,000, no questions asked, very
(03:20):
important, to anyone who returned the dogs, found the dogs,
or, I think, participated in getting the dogs back to her.
And a woman came forward with her dogs. And she
is now suing Lady Gaga, claiming that she
did not receive her $500,000 reward. And she's also claiming
(03:40):
some other damages, pain and suffering, and also that Lady
Gaga never had the intent to actually pay that reward.
So this is a very interesting lawsuit, because the woman
who came forward with the dogs actually participated in the dognapping,
it appears, and she was part of that case,
(04:01):
or I believe she got let off with probation. So
the question becomes: should someone who actually participates in a
crime, which caused Lady Gaga to put forward this $500,000 reward,
be able to profit from something like that?
What are your thoughts, James?
S5 (04:20):
So my initial thought is, no, if you are committing
a crime in order to complete a contract, you shouldn't
be able to take under that contract.
S2 (04:35):
But the dogs were returned. It's a unilateral contract: all
you have to do is perform in order to get
that money. So if you bring the dogs to Lady Gaga,
you get the money, the $500,000. It doesn't say anything
about not participating, or having clean hands, or anything like that.
All it says is, if you get my dogs back
(04:57):
to me, I will give you $500,000. That's what's called
a unilateral contract. So why doesn't that work here?
S5 (05:03):
You bring up a good point about the no
questions asked clause in Lady Gaga's initial offer, in
the unilateral contract. But there are, I think, several challenges
that can be made. One is the idea
(05:25):
of unclean hands, which essentially says that if you are committing
a crime, you can be prevented from recovering. I think
there are also issues with public policy, and what it
would do to cases like this in the future: you know, seeing a
rise in people stealing other people's pets in order to
(05:48):
try and get a reward out of it.
S2 (05:50):
So, Christina, what are your thoughts?
S3 (05:52):
I think it all depends on what their intent was
when they stole the dogs. Because the dogs were taken,
and then Lady Gaga posted this unilateral contract everywhere,
knowing that they might have been stolen, and all she
wanted to do was post the reward in order to
get her dogs back. So if you knowingly say, I'm going to give $500,000, no
(06:17):
questions asked, it's probably because you know that they were
stolen and you don't care; you want to
give the money to get your dogs back no matter what.
And I don't know, there might be
public policy issues where we shouldn't want that to happen.
But ultimately, that was what she contracted for.
S2 (06:34):
Yeah, I think that if you really think
about this, Lady Gaga didn't necessarily make this statement
with the idea that the person who would bring the
dogs forward would be the actual dognapper or a participant
in it. She just wanted her dogs. So when she
says no questions asked, $500,000, that makes sense, right? She
was saying, I just want my dogs back; I will give
(06:55):
$500,000, whatever happens or however that happens. And the
lady who came forward, I think she's
on probation for it, but she was definitely involved in it.
And they have video of her being involved in it, too,
waiting for the dogs to be dropped off to her.
So it doesn't mean that she's wrong when she
(07:15):
claims that she was supposed to get those funds.
But public policy might say, hey, maybe we shouldn't
allow this to happen because that's not fair, right?
We have a Slayer statute, which is kind of similar
to this, right? You want to tell us all about that?
S4 (07:34):
Yeah. You know, think about life insurance policies,
where somebody commits an illegal act, such as killing someone,
in order to get money. Basically, you're
barred from collecting that money because you committed murder. Rewarding
people incentivizes them to commit bad acts. So that's
(07:58):
what we're dealing with in this situation: she's receiving money
although she engaged in illegal behavior. I think an important caveat, though,
is that we're talking a lot about this no questions asked thing,
and Lady Gaga did receive a benefit by putting that
in the contract. That was the return of the dogs.
(08:21):
So she received something, and she wasn't counting on the
reciprocal benefit necessarily going to the person that kidnapped her dogs,
but that is what happened. And her putting that in
there actually served a purpose and made sure that her
dogs were returned. So why wouldn't the person get the
(08:42):
reward for bringing back the dogs, too?
S2 (08:44):
Yeah, I see. She's basically saying there's no condition on
the return of my dogs, and there's no condition on
this reward. Now, in the rest of the complaint that the
lady filed, she claims emotional distress and some
other stuff, like that Lady Gaga never intended to
pay the reward, and I don't really know how we
can come to that conclusion. But I think it's interesting
(09:05):
where you differentiate between this case and something like a
Slayer statute. With a Slayer statute, if you kill
somebody to inherit some kind of life insurance or retirement
or something like that, it basically bars you, because
obviously we don't want to incentivize murder. But when you're
(09:25):
dealing with something like that, where you just get a
payout if someone dies, that's different from when someone
makes a statement. Lady Gaga made a statement that said,
I will give $500,000, no questions asked. And I think
that's such an unequivocal statement that, in my mind, it's
really hard to say that this lady
shouldn't be rewarded for the return of the
(09:47):
dogs, because she fulfilled her part of the unilateral
contract. All she had to do was perform. She performed.
Shouldn't she get the money? Yeah.
S3 (09:55):
And then on the other side of that, though, there
are people in our society who do this
for a living. They steal dogs and wait to
see if there are any reward flyers. And, you know,
who knows what happens to the dogs after that, whether
they just release them or what have you, if there's
no reward. But stealing the dogs of someone like Lady Gaga,
(10:16):
you might expect her to put out this reward. So
it might have been done for that purpose, for that illegality.
S2 (10:26):
But does that negate it, though?
S3 (10:28):
I feel like it should, because of public policy,
because you don't want a bunch of people continuing to
steal dogs in the hope of a reward,
whether it's $100, $500, or all the way up to $500,000.
S2 (10:39):
And I'm not a contract attorney; we do estate planning
in our firm, so this is obviously not what we
focus on. But we learn in contracts that there are
two types of contracts: ones where you have two people
who perform, and what's called a unilateral contract, where
all you have to do is perform in order to
(11:00):
satisfy the contract. And that's what we have in this situation.
But what we're getting into, if
we're talking about the individual, the lady who did this,
is her thought process before she performed.
So that brings a whole different layer to unilateral contracts.
And I think that it makes sense that it
(11:21):
could be negated as a result of public policy, because
we don't want to incentivize that. But technically, she did
perform the contract. So it's really, really tough.
S4 (11:32):
Well, my thoughts are that this is all the
court of public opinion, of course. But Lady Gaga has
the financial means to pay for this dog, presumably, as well.
Her mission was complete. Why does she get to skirt
the system and invalidate a contract that was arguably fulfilled?
She put it out there and she got her
(11:53):
dogs back. And, you know, that's a devil's advocate position.
But is she going to receive preferential
treatment because she's a celebrity?
S2 (12:01):
Well, that brings up another point, too: would
this be the case if it was someone who offered $1,000?
Or what about the fact that someone was shot in this whole scenario? Right,
the dog walker was shot. Maybe that's why this is
such a bigger issue.
S4 (12:15):
And we see this all the time. I know in
my neighborhood, on a certain app, I see
rewards for animals posted with that no questions asked language.
And you're right, Justin, it is for a lot
lower of an amount, so nobody ever really pokes at these things.
But here we have a much bigger case, with a
(12:36):
lot more money involved and the physical harm or robbery of someone.
S3 (12:41):
And because she had the financial means, though, that
could be why she was targeted.
S6 (12:45):
Yeah.
S5 (12:47):
My, I guess my whole issue with it is the
doctrine of a contract. It's a contract for goods, essentially. Right. Okay.
Goods that are stolen. And if somebody were to steal property
from somebody and sell that stolen property to somebody else,
(13:08):
that contract is voidable. Right. Okay. If the person from
whom the property or the assets or whatever were stolen
can prove that they have superior title over the person
that they were sold to. You know, it's the idea
of replevin, being able to essentially void that contract
(13:29):
for the sale of stolen goods. So you're...
S2 (13:32):
But that's more like, what you're talking about is like
a pawn shop or some other stolen goods situation.
S5 (13:39):
Pawn shop, or, you know, somebody robs a house and
turns around and sells the products they stole on
Craigslist or whatever, you know.
S2 (13:50):
But those situations, I think, are a lot harder,
because then you have a party who innocently purchased stolen goods.
And then how do they get their
money back? What if the thieves, you know, sold
something and then spent all the money? Then the
person who thought that they were buying a legitimate thing
is just kind of screwed.
S5 (14:12):
And it sucks for them. But at the end
of the day, the contract for stolen goods is voided.
S7 (14:21):
Yeah.
S3 (14:21):
I mean, yeah, bringing it back to the dogs: if
the dogs are stolen, then really, Lady Gaga
can take them back. And you don't have to
pay to get something back that's already yours
and was illegally stolen.
S2 (14:35):
Yeah, but she's not paying to get them back. She's not
buying these dogs on Craigslist, right? She's not buying them
on the open market. She made a plea to the
public in many different places, on social media, I believe,
and TMZ and whatnot. So she wasn't saying,
I will buy my dogs back. She was saying, if
(14:55):
anyone can return my dogs or lead to the return
of my dogs, I will give you $500,000 as a reward.
So while it does deal with stolen goods,
and I guess dogs are a good in one way or
another, because it's not a service, I still think
that it comes back to this question: did she
fulfill the contract? And is the criminal party
(15:19):
in this sense not entitled to that money
because she participated in or set up this whole situation
to begin with? So that's what it comes down to,
and I think the public policy argument against this
is probably your strongest suit. But ultimately,
it's a very interesting lawsuit.
S4 (15:41):
Overall, it would be a fun one to watch in
a jury trial.
S2 (15:46):
Yeah, I wish they would. You know, after
watching the Johnny Depp and Amber Heard case, I think
that was the best, I guess, live action drama
that we've had in a very long time. And because
those attorneys, some were great and some were horrific,
that's what made it so interesting, too,
because we got to sit back and talk about it
(16:07):
at the office and analyze it on the side, saying,
did you see this? Did you hear this? Because it brings up a window into
the court that we don't get to see that often,
and that the common public, who aren't attorneys,
definitely don't get to see very often.
S5 (16:23):
The number of people that became legal experts after that, yeah,
it was amazing.
S2 (16:29):
It's like whenever there's a big boxing match. I remember
with a big heavyweight fight, you always hear
all these people acting like
they're experts. I know boxing isn't as popular as it
used to be, but it's the same thing in situations
like this. You get a big case such as this,
and everyone thinks that they know. And even as attorneys,
(16:50):
we don't know everything, right?
S3 (16:51):
We're estate planning attorneys. We don't even know anything
about the Lady Gaga dog situation.
S8 (16:56):
We just think it's interesting.
S2 (16:57):
Yeah, it's a very interesting topic. Anything
law related, or crazy laws, kind of interests me.
So actually, let's go on to a different one. Let's
talk about Brett Favre. Okay. Brett Favre is suing Shannon
Sharpe and some other people, too, for defamation. James, you
(17:18):
want to kind of break that one down?
S5 (17:20):
Yeah. So if you don't know what's going on with
Brett Favre, and allegedly, I have to say allegedly here, there were
some funds from Mississippi that were misappropriated, and these funds
were part of the welfare program in Mississippi. And
(17:41):
somehow these funds ended up, again allegedly, going to, I
believe it was a school where his daughter was
playing volleyball, to build a new stadium or upgrade
the arena or something along those lines. Yeah. And obviously,
in the court of public opinion, lots of people have had several
(18:04):
different opinions on this, about Brett Favre and what's going on,
notably Shannon Sharpe, a former NFL player, and Pat McAfee, another
former NFL player, who have their own shows and podcasts
and whatnot. They have talked about this and, you know,
used language such as stealing money from people
(18:26):
that needed it, calling Brett Favre a thief, saying he's
stealing money from poor people in Mississippi. And now they
are being sued for defamation, along with a few other people,
by Brett Favre. And it really comes down to fact
versus opinion, in my opinion. You know, in order
to sue somebody for defamation, you have to present something
(18:47):
as fact, it has to be false, and it has to cause them damages. And,
you know, Sharpe and McAfee are saying they were just
expressing their opinion on Brett Favre; they were just
expressing their opinion on the situation. They weren't actually presenting
anything as fact.
S2 (19:04):
Yeah. Adam, do you want to chime in?
S4 (19:06):
Well, I don't know a whole lot about the case,
but it sounds like there's going to have to be
a lot of work done to prove, like
you said, what actually happened in the underlying case. So
I think we'll have a long delay here on
the defamation suit; oftentimes those cases are put on
hold while the underlying case is taken care of. Um,
(19:28):
but, well...
S2 (19:29):
At this time, he's not currently facing criminal charges. He
is being sued by the state, it looks like, to
recoup funds; I think I saw that somewhere. And he
has repaid over $1 million already. And I think that
this is just a scandal in general because of
his name and because the money was supposed to go
(19:51):
to welfare recipients. Using these funds for something like
a new volleyball arena at the University of Southern Mississippi
is just ridiculous when you know what they're supposed to
be intended for. But, you know, when they came
out and said that he, quote unquote, stole money from
people that really needed it, that sounds like a statement
(20:12):
to me. That doesn't sound like an opinion. If you
use qualifiers like allegedly or in my opinion
and then make that statement, that's a little bit different.
I mean, they might have said
allegedly at one point, but I don't think that they
said it enough, because that suit appears to be
(20:33):
going forward. So, what do you think?
S3 (20:37):
Yeah, I don't know too much about this matter, but
I also wonder about Brett Favre's status as a public figure
as far as the defamation goes.
Why is that important? Were the statements intended
to hurt his career and to purposefully harm him? Were
(20:59):
they made with malice? Yeah.
S9 (21:00):
Well well.
S2 (21:02):
I guess let's explain why that's important, because a lot
of people don't know that there's a different standard for
defamation cases involving public figures. Yeah.
S3 (21:10):
So in defamation, the standard is different when it's
a public figure, somebody in the limelight. There are
different elements that need to be met, and one of
them is whether or not the statement was true,
and also whether it was made
with malice, right? It's been a while since...
S2 (21:31):
Intended?
S5 (21:32):
Actual malice, I believe.
S2 (21:33):
Actual malice. Yeah.
S4 (21:34):
I think another important point about this is that, as lawyers,
we know how important it is to actually establish your damages. Arguably,
Brett Favre has done a considerable amount of damage to
his own career, specifically during and after the Vikings fiasco. But, um,
(21:55):
he's going to have to show that he additionally suffered losses
as a result of the statements that were made by
these people, which could be tough to prove.
S2 (22:05):
I think so too, because if it's already in the
news as a potential scandal, you know, what more harm
does someone making that statement do? What does that statement
actually add? So that's probably an argument
that they're going to have as a defense, or at
least as a mitigator of damages.
S5 (22:23):
That's one of the arguments that I've seen raised:
that his reputation has been trashed so much already
from everything that's going on that these statements can't really
hurt him any more than he's already been hurt. True.
You know. Yeah. Um, but Adam brought up a good
(22:46):
point about there still being a lot of information that has
to come out. And I think
one of the defenses that McAfee is going to
use is that, in the statements he was making
on his show, he did bring up the point that
there's still a lot of information that hasn't come out. Yeah,
you know, he made it clear that what he was
(23:08):
saying was based on what's known so far in the public,
and he made it clear that things could change
and opinions could change, because there's still a lot of information
that's going to come out. And so I think that's
something that will help his case, because, in my opinion,
he made it clear enough:
you know, what I'm saying is based on what's
(23:31):
known right now, and...
S2 (23:33):
That it might not be his personal...
S5 (23:35):
Knowledge, yeah, right. As Brett Favre makes his case and
more information comes out, you know, things could change.
S2 (23:41):
So, do you think that this is more of
a situation where he's just trying to bring a lawsuit
to save face because he doesn't like what people are saying?
Because that's entirely possible, right?
S5 (23:50):
Yeah, I think so. You know, I
think he probably realizes the damage that his reputation
has taken from this scandal, as you put it,
and he's trying to do something to save face. Yeah,
(24:10):
you know, makes sense.
S4 (24:11):
I think it's interesting if you look at the statements,
or specifically the payments. Um, somebody mentioned that Brett Favre
has made payments, which suggests some arguable culpability. It's
not an admission of guilt per se, but it doesn't
look good. It's significant because it shows that maybe
(24:33):
he does agree, in a form of settlement, that he
did something wrong for some reason. Yeah. So if I
was McAfee or Sharpe, or their lawyers, I would
be focusing in on that: why would you
make a payment if you didn't, right, arguably take advantage
of people?
S2 (24:51):
True. And the interesting thing is that this is all
happening in the middle of this, quote unquote, scandal unfolding.
So I think we kind of need the
scandal to fully unfold before we can know whether or
not this is defamatory at all. So, Adam,
let's talk about this case that you found recently. Yeah,
(25:11):
it's... I guess it's more of a theory, right?
S4 (25:13):
Oh, it's very, very much a theory. Um, ChatGPT,
of course, everybody's talking about it, and the algorithms, and, um,
robots basically drafting things for you or analyzing things, even in the
legal sense. There was big talk about
a program actually drafting a legal opinion and then potentially
(25:36):
arguing it in court. But we're not here to talk
about that. It's really the implications, outside of that,
of using these systems to hire or fire employees. It kind
of gets into a civil rights issue as well. But
there's an entire body of case law addressing whether certain
(25:57):
conduct by an employer, and we're talking about state and
federal, constitutes unlawful discrimination against potential employees based upon certain criteria.
So what we have are systems that are being used,
I think it's pretty common knowledge, to screen employees during
the hiring process. And you have AI technology that is
(26:21):
doing this with algorithms to determine whether or not some
people are good candidates. From a civil rights
or employment defense lawyer's position, there's concern that these systems
may be analyzing people in a way that could
apply false metrics, in the sense that the program
(26:45):
could be racist, or it could be biased to make
sure a certain class of people is not hired based
upon the computer's model. So it's very much breaking news; this is
an article that was just published today, exploring
the implications and the dangers of this. And
(27:06):
Justin was right: you know, nobody really knows. But
it's something that people are planning on having a field
day with in court. We were talking earlier that it might
actually be easy to show that a computer program targeted
a certain class of people.
S2 (27:23):
I think that it would be easy if you had
someone who's claiming some kind of discrimination. I think it's
a lot easier to show that a computer discriminated than
a human, because the computer, in theory, is going
to spit out the same or similar results every time
you give it the same or similar parameters. Whereas a human
(27:45):
can easily obfuscate their thoughts, beliefs, feelings, opinions.
But a computer can't, right? And even if it could,
I don't think we're at that level yet where they're
able to know that they did something wrong and
hide it. So the legal question is: is the employer
(28:07):
still liable even if they didn't know that the
AI equipment was going to commit this, I guess, crime?
You can't really arrest the computer, right? But you can
fine the employer. So are they
still liable for something like that?
S3 (28:25):
I would think so. I think that they need to
do their due diligence in picking the software that they
use and how they screen employees and candidates for hire.
And if they're not doing their due diligence, or, you know,
even if they are doing the due diligence,
we need to do
better at making sure that things like this don't happen. Yeah, right.
(28:48):
It's hard to go through all that stuff yourself.
S4 (28:52):
Well, there's a really quite historic saying
in criminal law: ignorance of the law is not
a valid defense in the criminal spectrum. But in this
sense, the article I have here puts it this way: ignorance
of the flaws in the system is not likely to
be a valid defense for the employer.
S2 (29:14):
So yeah, I mean, I don't know how to read code,
so how would we know if there is something completely
wrong there? Yeah. But if the AI system is
an agent of yours, acting with agency on
your behalf, you're still responsible. If someone who is working
under you commits a crime, to a certain extent, you
(29:35):
are too. So if it committed some kind of crime or
some kind of violation, what's the ultimate liability there?
S4 (29:45):
And it's still very much up in the air. This
is all opinion, of course. The other thing
I read in this article was about proving that the system
actually took actions, and whether you can actually show that there
was a disparate impact on a class of people, which
is kind of the legal precipice that it all rests on.
(30:05):
It's not likely to come up if one
or two people are harmed, but you can
bet that it's going to garner attention if hundreds of
people from the same demographic or class bring up this issue.
So right then and there, you are dealing with potentially
a big class action lawsuit that could roll out in
the coming years. True.
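[For context: a minimal sketch of how a disparate impact showing is often approximated in practice, using the EEOC's four-fifths guideline. This is Python, and the group names and counts are entirely hypothetical; real analyses are far more involved.]

def selection_rate(hired, applicants):
    # Fraction of applicants from a group who made it through the screen.
    return hired / applicants

# Hypothetical screening outcomes per demographic group.
outcomes = {
    "group_a": {"applicants": 400, "hired": 120},
    "group_b": {"applicants": 300, "hired": 54},
}

rates = {g: selection_rate(o["hired"], o["applicants"]) for g, o in outcomes.items()}
highest = max(rates.values())

for group, rate in rates.items():
    # The four-fifths guideline treats a selection rate below 80% of the
    # highest group's rate as evidence of adverse impact.
    ratio = rate / highest
    status = "potential adverse impact" if ratio < 0.8 else "within guideline"
    print(f"{group}: rate {rate:.0%}, ratio {ratio:.2f}, {status}")

[Here group_b's 18% selection rate is only 0.60 of group_a's 30%, so it would be flagged; the statistical showing itself is mechanical once the outcome data exists.]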
S3 (30:25):
Now, is this all hypothetical, or do we have some
sort of instances of the technology exhibiting these biases?
S4 (30:32):
It seems very hypothetical to me at this point.
S5 (30:35):
I think there's actually stuff going through the court system
right now regarding the Fair Housing Act and AI, people
using AI to screen applicants for housing. Essentially the same
idea: that these algorithms have implicit bias against certain
(30:57):
types of candidates. So it'll be interesting to see how...
S2 (31:02):
You get that out? I mean, we need a computer
scientist to really break that down. I know that
there was a Black couple recently awarded a
big settlement, I think yesterday or a couple of days
ago, for a couple of million dollars, because when
they listed their home and they were the sellers,
(31:22):
the values coming in were substantially less than when
they had their white friends act
like they were the sellers. I don't know who they sued;
I just saw the headline for that case. But
it might be something along those lines, where there's some
kind of implicit bias. And with AI, it's so
hard, because it is going to be something where we're
(31:44):
going to have some case law on this eventually. And
I think it's going to come down to, as a
business owner, you're going to need to really
vet that program. But then again, is it really cost
efficient to even have that program if you have to
vet it, if you have to find out whether or
not there are some issues with it? Then
it just kind of negates the need for it.
S4 (32:06):
Well, I've kind of played around with having
it pull up a contractual analysis of something. And what's
interesting to me is, if you have the computer program
screen people, I think there's always the argument that
you then meet with the people face to face, in person.
So whatever kind of bias that computer might have shown you,
(32:29):
you're still eliminating potential problems, and it's still going to
require some kind of human interaction as a quality control check.
Like Justin said, we're just not there yet to fully
rely on these systems. And the question is, will we
ever be, really? Yeah.
S3 (32:44):
And I think it depends on how much leeway you're
giving to the technology. If you're just, you know, having
the AI go through the resumes and make sure, okay,
we want a candidate who has this much experience, who's
got these degrees, these qualifications, and other than that we're not
going to look at it, or we might pull
(33:05):
a couple later, then after the AI narrows it
down to the actual qualifications you're looking for, you
bring people in, and hopefully it shouldn't have
that bias in just those parameters. But when you extend
those parameters to just, AI, figure it out, that's when
we run into: well, what are they figuring out?
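[For context: a toy illustration of the distinction Christina is drawing, a screen constrained to explicit, auditable qualifications rather than a model told to "figure it out." This is Python; the field names and thresholds are invented.]

# Candidates reduced to only the stated qualifications; nothing else
# about a person can influence the outcome, which keeps the rule auditable.
candidates = [
    {"name": "A", "years_experience": 8, "degrees": ["JD"]},
    {"name": "B", "years_experience": 2, "degrees": ["BA"]},
    {"name": "C", "years_experience": 5, "degrees": ["JD", "LLM"]},
]

def meets_requirements(c, min_years=5, required_degree="JD"):
    # A fixed, inspectable rule: minimum experience plus a required degree.
    return c["years_experience"] >= min_years and required_degree in c["degrees"]

shortlist = [c["name"] for c in candidates if meets_requirements(c)]
print(shortlist)  # prints ['A', 'C']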
S2 (33:26):
So maybe it's just more of a screening-type tool. I mean...
S5 (33:29):
AI theoretically could be used to eliminate bias, yeah,
if it's just, supposedly, looking at certain parameters.
S2 (33:37):
But it's still programmed by humans, right? Right. So
how do you fully eliminate that? Now
we're getting into a philosophical question. That's interesting. I
think that's something to watch out for, because over the
next few years, the way AI is
blowing up right now, we're going to see a lot
of stuff coming through: lawsuits coming through, cases coming through.
(34:00):
There's going to be a lot of crazy stuff. And
I guess let's stay on one other media related issue,
which is the Seattle school district suing social media companies. I know, James,
you brought this one up and you've already done some talking,
but why don't you kind of give a brief overview
of what this is?
S5 (34:21):
Yeah. So this is a case currently in the US
District Court for the Western District of Washington in Seattle,
and it's a lawsuit filed by one of the school
districts up there against Meta Platforms, also known as Facebook, as well as Google, YouTube, TikTok, and
(34:47):
several other social media platforms. And essentially the idea behind
the lawsuit is that these social media platforms are creating
a public nuisance and harming children. The complaint calls for
relief in the form of an order that the conduct constitutes a public nuisance,
(35:11):
that the defendants are liable, that the defendants are
enjoined from engaging in further such conduct, and equitable relief.
S2 (35:24):
So, what this comes down to, from what I read,
and I started reading the lawsuit (I don't know if
other people do this, but as soon as I find
the actual lawsuit, I kind of want to skim through it,
because some of these articles don't necessarily
give you the full details), essentially they're alleging that
the whole goal of Facebook or TikTok or whatever
(35:45):
it might be is to, quote unquote, exploit the psychology
and neurophysiology of their users into spending more and more
time on their platforms. And that obviously can be very detrimental.
There are a lot of studies coming out that
are saying how detrimental that could be. But really what
it comes down to is: are these companies liable for
(36:07):
putting out a legitimate product that isn't illegal? And would
this lawsuit even be able to stop kids from using
it? Does that even make sense?
S3 (36:17):
I think it depends on the algorithms, because they've already
been sued, and there have been studies that showed that
they were messing with us for a while there, where
on some people's feeds they were showing flowers and
happy media and seeing if those people's posts changed to
also be happy. And then on the opposite side of that,
(36:38):
they were just feeding people negative things and seeing if
they could start changing what those people were typing.
So we were being used as human guinea pigs at
that point. This was a couple of years ago. So
I guess you could say they're not messing with us
like that anymore, or I don't know if they are
or not, but the systems are designed to keep us
(37:02):
engaged, more like gambling. The newsfeed was designed by someone
who also designed some of the slot machines. Yeah, so
it's keeping us engaged and refreshing, just that constant
need to win. Even though you're not putting in money,
you could suffer a detriment.
S9 (37:20):
And yeah, but a...
S2 (37:22):
A lot of things could have a detriment. But really,
no one's being forced to use these; these
are all voluntary systems. I know that they're kids, so
it's a little bit different; kids can't make the
same decisions in most things nowadays. So I guess
my question would be: can we really come up with
a lawsuit that's going to prohibit these companies from targeting minors?
(37:48):
And is that even legal in the first place?
S3 (37:51):
I mean, honestly, I feel like we just need to
do a better job educating our children on what social
media actually is. When they're seeing their friends posting all
of these fun photos and they start to feel depressed
that they're not being included, that they're being left out:
you know, people are only posting part of their lives. It's
not black and white, you know? Yeah.
S9 (38:13):
But I guess the question is, these companies...
S2 (38:16):
They have a responsibility to a certain extent, right? That's
why most of these companies have an age requirement. You
need to be at least 12; you can't be a
six year old creating a TikTok or Facebook account. I
don't even know what the age restrictions are, but I
know that some companies have a minimum age, so they
are aware that there is some kind of responsibility there.
But really, though, can we legally say to these companies,
(38:38):
you can't go after this target demo, you can't go
after these types of people?
S4 (38:45):
Well, Justin, you brought up a good point about the
freedom of choice, if you will, between adults and kids.
What's interesting from this perspective, if I was going to
play devil's advocate again and was working for Facebook or
whatever big company it is, is that I'd push it back onto
the parents. Yeah. You know, did you allow your child
(39:06):
to have access to a phone or a laptop or
an iPad? I think that's an interesting perspective, too, because
even though the kids might not be able to exercise
free will like an adult, the adult is still putting
these mechanisms into their hands, and then they're availing themselves
of the programs. So I think that could be an
important point as well.
S2 (39:27):
I think one of the things about this, though, is
that with, say, the tobacco companies,
there are demonstrable effects from smoking tobacco. Now, there might
not be demonstrable effects necessarily from overusing social media, but
maybe there are. And then we get to that
point where, well, can we limit what social media is
allowed to do with children, just like we did with alcohol and tobacco commercials?
(39:51):
One of these articles made a couple of good points
that are kind of the opposite of each other. One said, well,
with the opioid crisis, what did we do? We stopped the
flood of opioids out there; meaning, stop the social
media companies, right? And then another person said, well, casinos
aren't necessarily held liable for gambling addicts, so why should
(40:14):
social media companies be? So those are two opposite viewpoints, and
I think they both are kind of right, except
that we're dealing with children, a very vulnerable population. So
it's a little bit different from both of those examples,
because of their ability to be influenced, especially in that
age range. So maybe these companies should be held to a
different standard.
S3 (40:35):
Yeah, I don't know. But it's hard, you know,
talking about children and their mental health and mental
state, because we still don't know a lot about how
our brains are developing at what age and what's
going to affect us in the future.
S8 (40:47):
Or what the...
S2 (40:48):
Effects of social media will be long term. Because we
don't know; it's still a relatively new thing.
So I think we need more information in order to
know the long term effects of overuse of social media.
And even that's very subjective: what is overuse? How
much is too much?
S5 (41:04):
I might be dating myself, but I don't know if you
guys remember Facebook when it first came out: you had
to have a college email address in order to
sign up. And now I'm looking at the complaint right
now from this lawsuit, and it says 90% of
children ages 13 to 17 use social media. One study
(41:27):
reported that 38% of children ages 8 to 12 used social
media in 2021, and as high as 49% of children
ages 10 to 12 used social media, and 32% of
children ages 7 to 9. So, wow. I mean, there's
definitely a large number of children using
these platforms. But like you said, assuming they all have
(41:52):
minimum ages that you're supposed to meet to sign up. Now,
what kind of checks and balances are they using for that?
I don't know.
S9 (42:00):
But so it's...
S2 (42:01):
Interesting, because if you think about one of the solutions being, oh,
let's set a timer on it or something like that:
now you're telling a private business,
or a public business, I guess you could say, because
most of them are publicly traded, how to restrict a
major part of their usage. The majority of their usage is from minors, I'd imagine.
(42:25):
So you're severely limiting that, and then the ad revenue;
it's going to be so detrimental to their business. So
they have to find a way to squash this lawsuit.
I can't imagine that there's any good that would
come out of this for them in any way.
S3 (42:39):
Well, I think the only good would be just building
a bigger discussion on mental health. That's it.
S8 (42:45):
Just, yeah, bringing...
S2 (42:46):
That into the conversation. Yeah. I forgot
what that Netflix show was called. Oh, I know, the Netflix
documentary about social media. Oh yeah, The Social
Dilemma, that's what it was. Yeah.
And that was a really good documentary, especially since one
(43:07):
of the guys was from, I think, Instagram or one
of the major social media companies, talking about how they
are trying to create a
better algorithm that's going to keep you involved, keep you
constantly paying attention. So that's going to be an interesting
lawsuit as well. All these things are coming up
that we're going to have to keep on top of.
S3 (43:26):
Shortening our attention spans, right?
S2 (43:30):
Yeah. Whenever my phone tells me how much I
looked at it the week before, it makes me
go nuts, because I'm like, how did I look
at my phone that much each day?
S8 (43:39):
Yeah, but you don't even have TikTok. No, I don't.
S2 (43:43):
Absolutely not. I heard way too many bad
things about TikTok and how they steal your data; they
can monitor your keystrokes and your passwords and your credit
cards and phone number. They steal all your data,
from what I've been told, and I want nothing to
do with that. So thank you, everyone, for joining us
on our first episode of the Lawyer in Blue Jeans
podcast. Today we talked about the Brett Favre defamation case.
(44:04):
We talked about whether Lady Gaga should have to pay
$500,000 to the lady who returned her dogs but actually
participated in the dognapping. We talked about AI culpability: whether or not
the employer who employs these AI systems is responsible if
they do something that might have violated some kind of
law or code. And the last thing we talked about
(44:26):
was a Seattle school district suing social media companies and
saying that they are a detriment to students, and whether
or not that would actually hold up, and what responsibility
these social media companies have to minors. Thank you, James,
Christina, and Adam. We've had some pretty good topics
out there, and we're going to be talking about some
(44:47):
more interesting things coming up in the future. So this
has been the Lawyer in Blue Jeans podcast. Thank you, everyone,
for listening.
S1 (44:55):
Take a break from the news and join us at
Lawyer in Blue Jeans! If you're curious about the latest
wacky cases or have a specific legal inquiry, drop us
an email at Justin@lawyerinbluejeans.com. Follow us and subscribe to stay
up-to-date with our latest episodes.
S10 (45:16):
Olas Media.