Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Why, it's Brittany and Windsor, and you're listening to Thanks
I Hate It, a weekly social commentary podcast where
two friends shoot the shit about social issues and rotating, unsuspecting targets.
Speaker 2 (00:11):
And fight with electronics. You would think we're some boomers.
Speaker 1 (00:15):
After how long it took us to get on a
meeting that's actually gonna theoretically work. Fingers crossed.
Speaker 2 (00:21):
We started this almost an hour ago and we're twenty seconds
into this episode. So, Brittany, anything new with you this week?
Speaker 1 (00:29):
You know what? These last two weeks have been a
super wild ride. This week in particular. You know, not a
bad wild ride, just a wild, great ride. Yeah, there's just
like stuff happening left and right. Though right now I'm
fully in the "I don't know what my tech issues
are" mode. So what happened to me this week is
(00:49):
that for two days we tried to record this episode,
and here we are.
Speaker 2 (00:54):
It's gonna happen.
Speaker 1 (00:56):
It has to.
Speaker 2 (00:57):
It is what it is. If we have to sit
here and go to a McDonald's parking lot, it will happen.
Speaker 1 (01:03):
Well, we will steal your WiFi to make the product,
you know what.
Speaker 2 (01:07):
We're gonna go to Starbucks to steal their WiFi. Did
you hear that, like, somebody went in asking for water,
like some kind of, I want to say, disaster
relief people went in and asked for water and
they told them no.
Speaker 1 (01:21):
Of course they did, because they're all like, it's water,
it's of the earth.
Speaker 2 (01:26):
Come on. And we're gonna talk about water today. We're
gonna talk about water. But first, before we talk about water,
we're gonna talk about Labubus.
Speaker 1 (01:34):
Yeah, tell me about Labubus.
Speaker 2 (01:35):
Those ugly little toothed monster things. I didn't even
know what the hell a Labubu was, like,
a month ago. Livy's like, I want a Labubu.
I'm like, what the fuck is a Labubu?
She's like, it's this thing. Okay, sure, yeah, no problem.
Whenever I get some money, I'll go buy one. LOLs,
because they're sold out everywhere. Of course they are. They
(01:57):
are being resold for upwards of four hundred dollars
apiece for the ultra rare ones.
Speaker 1 (02:03):
In my day, we just had to wait.
Speaker 2 (02:05):
Yeah, so in the set that they're re-releasing now,
it's like a happiness, loyalty, like, vibes... I'm
gonna call it a Vibes line. They have a little
gray one with these little teeth, and it's like
the teeth are rainbows.
Speaker 1 (02:20):
Okay, because I was going to ask you some more
questions about that. I needed more information, Thank you very much.
Speaker 2 (02:26):
I love it, but it's a one in seventy-two
chance of getting that, and they're blind boxes.
Speaker 1 (02:30):
So I hate that.
Speaker 2 (02:31):
Well, regardless, I saw this whole thing on consumerism. I
don't care. I don't care, I'm going to spend
my adult money how I want to spend my adult money.
But I was like, listen, a presale's coming up.
That's perfect. I can get some on a presale.
I only need two. Twenty minutes I sat here refreshing
the page every thirty seconds, like I'm sitting here on
Instacart looking for another order, like...
Speaker 1 (02:53):
It's like you're waiting for them Coachella tickets. I remember waiting.
Speaker 2 (02:57):
For those Coachella tickets, when we had... I had
my work computer on, I had my laptop up, I
had my tablet on, I had my phone on, and
we all did. There were, like, there were literally twenty
electronic items in queue to get these tickets. But I refresh,
I refresh, I refresh, they're live. As soon as I
get my fat little finger to press add to cart,
(03:18):
they're gone. In fifteen seconds, they were gone.
Speaker 1 (03:21):
Some kids do not have arthritis.
Speaker 2 (03:23):
I am so sick and tired of these, like, computer-spam,
freaking reseller bots. Like, what about us? It's just
us and our kids. Us? Fuck them, them kids. That
is exactly what they said. Honestly, that can also, honestly,
be relevant to this conversation today. Fuck them kids,
because guess what, we're talking about artificial intelligence today. For
(03:46):
those who are nasty, it's Miss AI if you're nasty.
But so, we're going to talk about some ramifications
in the entertainment industry, we're going to talk about climate ramifications. But
just to really start off this conversation, I just want
to implore, and just kind of, like, drill it into
(04:06):
your minds: I know there's a lot of anti-AI stuff,
and I get it. I get it. However, that being said, comma,
it's not going anywhere. This is part of our lives,
and it's already a part of our daily lives in
ways that we don't even realize. And it's going to
be even further integrated into our lives, into our healthcare,
(04:26):
into our schooling. Every major thing that we have, if
we still have a country in ten years, is
going to directly deal with AI every day. Don't
give me that.
Speaker 1 (04:37):
That look? Oh, it happened. If we even have a country.
Speaker 2 (04:41):
Well, the country we're going to end up in, they're
going to have AI too. Whichever one we end up in,
they'll have it too, if you're smart. Well, I mean,
there's not really much of a choice. It's the latest technology.
And what happens with the latest technology is you have
to keep up. Like just a year ago, if you
were to go generate an AI image, the human would
(05:02):
have fifteen fingers and the words would be nonsense. Guess what.
It's not like that anymore. It's not like that anymore.
So, very briefly, there are two main types of AI:
generative and assistive. Assistive we use every single day,
every day, without thinking about it. Those are our Siris,
our Alexas. Those are those text messages that you get
(05:22):
when your Discover card says that you normally don't spend
this much money, you good? Mind your business. Right. Your
personalized recommendations in your Spotify, your Apple Music, Amazon, anything
where there's a personal recommendation, that is AI. Any algorithm,
your TikTok algorithm, your Instagram Reels algorithm, that is all
(05:43):
artificial intelligence. Your facial recognition in your photo app. So,
like, you know, if you go into your Apple Photos
and all of a sudden, it's like pictures of...
Speaker 1 (05:52):
Brittany Galen in these pics.
Speaker 2 (05:56):
Beyond pictures of, like, all these randos. Listen, I have
celebrities in my phone that were named before my kids
were named, because of this personalized, like, recommended...
Like, it knows that that's Beyoncé, but it doesn't know
my kid's name.
Speaker 1 (06:11):
It thinks my kid is me.
Speaker 2 (06:12):
Yep. Well, I mean, that's fair though. That is mini-you.
Your navigation systems, your Waze, your Apple Maps, your
Google Maps, that's all AI. Your spam filters in your email,
and those website chatbots. And it is also really important
to note, because we are going to talk about energy,
(06:33):
that these use significantly less energy and have
a significantly smaller carbon footprint than generative AI work, specifically
because these are programs that are designed to do that
one thing. It's like, that's what it does. When you
are using these, like those really good programs
to get those really, like, those weird AI videos of
(06:55):
like, those old people swearing... one picture generated by those
types of engines and models uses as much energy
as it does to charge your iPhone. One picture. So
imagine how much one of those videos takes.
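As a rough back-of-envelope on that comparison: the per-image figure below is the episode's own claim (one generated image, roughly one full phone charge); the battery size, frame rate, and clip length are assumptions added here purely for illustration.

```python
# Rough sketch only. The image-to-phone-charge comparison is the claim above;
# the phone battery size, frame rate, and clip length are assumed numbers.
PHONE_CHARGE_KWH = 0.013   # assumed: a full smartphone charge is roughly 13 Wh
FRAMES_PER_SECOND = 24     # assumed frame rate for a short AI-generated clip
CLIP_SECONDS = 5           # assumed clip length

energy_per_image_kwh = PHONE_CHARGE_KWH        # per the comparison above
frames = FRAMES_PER_SECOND * CLIP_SECONDS      # 120 frames

# If each frame cost roughly as much as one still image, a crude estimate:
clip_energy_kwh = frames * energy_per_image_kwh
print(f"{frames} frames ~ {clip_energy_kwh:.2f} kWh, about {frames} phone charges")
```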
And you know what, it is what it is. Unfortunately, AI is
in its infancy and we just got to buckle up
(07:16):
for the ride. And the sooner we all come to
recognize the fact that it's not going anywhere, the quicker
that we can be like, listen, governments, all of them:
we need to start regulating this, because there's next to
no regulation.
Speaker 1 (07:28):
Well, now we can't, because that was part of that
bill that just got passed. You can't regulate any AI
for, like, ten years. By then we're gonna be dead.
Speaker 2 (07:38):
Yes and no, because it is still... it's complicated, because,
with anything, half of the shit he's doing is illegal anyway.
But with technology that's rapidly evolving, that's rapidly growing,
as it grows, the regulations around it change, the need
for the regulations changes. So AI as it is today
(08:00):
is not going to be the AI as it is
in a year. It's not something that is static. Still,
another thing I'm gonna touch on is some ethical concerns.
Like I said, we're gonna touch on the entertainment industry.
But before I give something over to Brittany, let's talk
about energy. Woof. Woof is right, okay. So, an article
(08:21):
from stanford.edu, oh yeah, so this is academic, states
that a recent report found that for the sixty data centers
in Phoenix alone, just Phoenix, the demand for water is
approximately one hundred seventy-seven million gallons a day.
Speaker 1 (08:41):
Oh, we're all gonna die and and this.
Speaker 2 (08:44):
Oh, but wait, but wait. Did you know, they were
sure to note, like, where they got this from, they were
sure to note that they use more than that in agriculture.
Speaker 1 (08:52):
I want them to use much more than that in agriculture.
Speaker 2 (08:55):
Like, that's what we need. Okay, but let's... are
you ready to absolutely monitor every drop of water that
comes out of everything in your house? Just
to note, the average US family of four uses about
one hundred and forty-six thousand gallons a year. And
(09:15):
that's per the EPA. That was back in twenty twenty-four,
when we still had an EPA.
Speaker 1 (09:20):
Damn, that sounds terrible compared to that million.
Speaker 2 (09:24):
Oh, wait. It will take the average family of four
one thousand, two hundred and twelve years to use as much
water as one city uses in one day to cool
their data centers.
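For anyone who wants to see where that 1,212-year figure lands, here's the division using only the two numbers quoted above (177 million gallons a day for Phoenix's data centers, 146,000 gallons a year for a family of four); a quick sketch, nothing more.

```python
# Quick check of the comparison, using only the figures cited in the episode.
PHOENIX_DATA_CENTERS_GALLONS_PER_DAY = 177_000_000  # ~60 Phoenix data centers, per the cited report
FAMILY_OF_FOUR_GALLONS_PER_YEAR = 146_000           # EPA figure cited above

years_to_match_one_day = (
    PHOENIX_DATA_CENTERS_GALLONS_PER_DAY / FAMILY_OF_FOUR_GALLONS_PER_YEAR
)
print(f"A family of four needs about {years_to_match_one_day:,.0f} years "
      "to use what those data centers use in a single day.")
# -> about 1,212 years, the figure quoted above
```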
Speaker 1 (09:36):
So this is probably what that singular term it about.
Speaker 2 (09:41):
To sixteen generations' worth of a family of four. So
it would be my great-great-great-great-great-great-great-great-great-great-great-grandkids
who might finally use that much water. By then we're not
even gonna have water, or a planet. We're gonna be gone. We're not
even gonna have great-great-great-great-great-great-great
(10:02):
grandchildren by then. So, but are you ready for more?
Let's talk about electricity. So, global data center consumption
is estimated to jump from four hundred and sixty terawatt-hours,
I think it's terawatt, I only wrote "TWh,"
in twenty twenty-two to a projected six hundred and
(10:26):
twenty to one thousand fifty TWh by twenty twenty-six. And
obviously that's going to add an additional strain to our
already crumbling electric grid. And we just had the president
talking about how there's nothing they could do about it.
There is something you could do about it, dipshit. Then please,
Viola Davis, grab my bag and walk up out of here.
(10:47):
All right. So let's talk about the average American home. So,
the average American home uses ten thousand, seven hundred and
eighty-eight kilowatt-hours a year. Just so we can have
this in perspective, one billion kilowatt-hours is one... I think
it's a terawatt-hour. Well, a billion of them. Remembering that this
(11:08):
is the average, because, you know, people feel like, oh,
Americans use so much electricity. We have a very broad climate,
from the coldest of the cold to the hottest of
the hot.
Speaker 1 (11:17):
The hot girl, she hot.
Speaker 2 (11:19):
Yes, and listen, there's places like Wisconsin that will get, like,
negative one twenty and then be, like, one hundred and
fifteen degrees. We use a lot of electricity because otherwise we
will die, plain and simple. But so, the entire world
in twenty twenty-two: six hundred and twenty TWh.
The average American home uses
(11:43):
about 0.000010788 TWh a year.
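To put those terawatt-hours next to the household number, here's the unit conversion using only the figures quoted in this episode (1 TWh = 1 billion kWh, and 10,788 kWh per home per year); a rough sketch for scale, not a precise model.

```python
# Rough scale check using the episode's own figures.
KWH_PER_TWH = 1_000_000_000   # 1 terawatt-hour = 1 billion kilowatt-hours
HOME_KWH_PER_YEAR = 10_788    # average US home, per the figure above

print(f"One home ~ {HOME_KWH_PER_YEAR / KWH_PER_TWH:.10f} TWh per year")

# Global data center demand cited above: 460 TWh (2022), projected 620-1,050 TWh (2026).
for label, twh in [("2022", 460), ("2026 low", 620), ("2026 high", 1050)]:
    homes = twh * KWH_PER_TWH / HOME_KWH_PER_YEAR
    print(f"{label}: {twh} TWh is roughly {homes / 1e6:.1f} million homes' worth of electricity")
```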
But wait, there's more. That sounds like... So, they are building...
there's a few hot spots in the US where they're
putting these data centers, right? Vegas, Reno, the Pacific Northwest, obviously up in,
(12:06):
like, the Sacramento, San Francisco area, not so much in
Los Angeles, there's really not room for them there,
Salt Lake City, Denver, Dallas-Fort Worth, Austin, San Antonio, Atlanta,
New York, it looks like Memphis, somewhere up in the
New England area, Chicago, and Northern Virginia. Northern Virginia is
going to be hit the worst. Let me tell you why.
Speaker 1 (12:25):
I see this graph, I do, but I can't read it.
You'll read it? Thanks.
Speaker 2 (12:31):
The largest projected growth is in Las Vegas and Reno,
where they're estimating nine hundred and fifty-three of these farms. Now,
the number of equivalent households that could be powered in
that area is over three million. You know, they're actually
looking to only do one hundred and forty-two in
(12:52):
the Northern Virginia area, which is the equivalent of nine million,
one hundred fourteen thousand, seven hundred and eighty-nine homes in
Northern Virginia.
Speaker 1 (13:04):
Are they putting, like, chlorine into the air?
Speaker 2 (13:06):
And another thing is, that's not a prosperous area. We
have a very, very big problem in the areas surrounding
these data farms already, where the neighbors are not getting
any water, they can't even flush their toilets, and guess what,
there's no recourse, because there are no regulations.
Speaker 1 (13:24):
That's literally what's happening to the people in Memphis.
Speaker 2 (13:27):
Yeah, it is so bad, and it's going to get worse.
Those are the type of regulations we need. Yes, they
are building them. One idea from our yearly staff development
training at work: they did an AI course, and they
actually are making these things smaller, so they will eventually
(13:50):
take up less room and therefore use fewer resources. However,
these data centers are already built. If it's not broken,
they're not going to fix it by putting these smaller
ones in there, because they don't have to. They're much
more expensive. But quick...
Speaker 1 (14:06):
That is one of the things... So, cities will
make these contracts, and I know Atlanta is trying to
fight against one of the data centers
that might be coming to them. But they make these
contracts where they offer incentives so that these
companies come to their area, because theoretically it's going to
be good. But not only are they using water,
(14:27):
they are absolutely putting, like, terrible shit in the air.
Speaker 2 (14:31):
And so, they're expecting four hundred and eighty-four
of them.
Speaker 1 (14:35):
Awesome where where where?
Speaker 2 (14:40):
So, pretty sure that last year they tried to put
one in our town, and the residents were like, absolutely
fucking not. Absolutely fucking not, you got us twisted. And
they ended up not doing it, thank gosh. There's one
thing that's going to happen when you have a town
like mine, a suburban town with wealthy white folks in it,
(15:01):
and you're going to bring it into an area where
it's half wealthy, half farmland. It's mixed income. But
that area particularly is a marshland. Yeah, I live on
some marshlands, so that's already an issue here. That's
already an issue. But you have the farmers that are
(15:21):
protected in our area, and then you have the rich
folk down...
Speaker 1 (15:24):
The street, who are absolutely not going to allow these things.
Speaker 2 (15:28):
The thing is, if you don't have rich folk in your area
that care... if you don't
have that, you're going to get them.
Speaker 1 (15:33):
And that's why they spend so much money and
so much time and energy putting these things in disadvantaged spaces.
Speaker 2 (15:40):
We're redlining. It's twenty twenty-five and we're redlining again, Brittany.
So, besides Bashir et al. saying that there's going to
be accelerated depletion of natural resources, what do you
have? Before I get to ethical concerns and plagiarism in
the entertainment industry...
Speaker 1 (15:58):
Actually wanted to talk very quickly mine will not be
taking up your time today, but about AI and surveillance.
So one of the major concerns that people have with
AI is that it poses crazy surveillance issues because AI
is reviewing all of your different all of you for it.
(16:20):
That's why they're taking pictures of you, exactly. And so,
I mean, if you think about it, we are a
heavily surveilled population of people that are, you know, living
here on Earth right now. We are a group of
people, all eight billion of us, all eight billion
of us, are part of a group of people that
are traditionally used to having our pictures taken as we're
(16:42):
walking into stores, as we're walking out of stores, as
we are existing in time and space outside of our homes,
and even sometimes in those spaces. If we've got interior cameras,
if we've got Ring cameras, if we've got any of
these things, we're used to having our faces, our likenesses
viewed through technology. And so one of the things that
(17:02):
local law enforcement is doing in a variety of places
around the country is they are using an AI called
predictive policing.
Speaker 2 (17:11):
Regular policing, right.
Speaker 1 (17:12):
They think this is gonna help them do it better.
Their rationale is they would like to use AI to
assist in predicting crime so that they can then use
what resources they have as if law enforcement resources are
somehow finite.
Speaker 2 (17:29):
I know what they're gonna do. They're going to racially profile.
Speaker 1 (17:33):
Oh my gosh, it's like I don't even have to
read the rest of my notes, because, Father, at this point...
Speaker 2 (17:37):
We know. You are as subtle as
a bull in a fucking china shop, bro, because they're
after the Black and brown people.
Speaker 1 (17:47):
We want all of these people to be under our
constant watch. And there's a whole thing about, you know,
basically state surveillance, where if you're on, like, probation or
parole, something like that, whatever. That is not what we
are talking about in this moment, specifically: predictive policing.
And I got most of my information from an NAACP
policy brief, which, if you don't know what a
(18:08):
policy brief is... this is for the listeners, not for
you, Windsor, because I know you know. But it's basically
where an advocacy group, an advocate, whatever, will give you
basically a one-pager or a short understanding of what
is happening, why we don't like it, and what can be done.
Easy peasy lemon squeezy. But basically, in this policy brief,
they're talking about a variety of jurisdictions, particularly in heavily
(18:33):
"ethnic" areas, and I'm using that with quotation marks even
though you guys can't see me. In these areas, they
are being subjected to this AI surveillance, where there are cameras
put up. I know in Jacksonville they've got this kind of...
I want to call it a box, but it's
not actually a box. It's this thing with some type
of camera that sits on top of a pole, and
it detects what direction gunshots come from. But they use
(18:57):
all of this data: the data on who is in
their area, the data on what is happening in these areas,
who is doing what action, who's just standing versus who
is driving, you know, speeding down the street in a car.
They're using all of this, and previous accounts in
the area, so previous police reports, previous radio calls. They're
using all this data to try to predict what crimes
(19:20):
will happen, when they will happen, and where they will happen,
so that they can flood that area with their "resources,"
again, quotation marks, prior to that occurring. Because in their mind,
if they get to the person, place, or thing that
is occurring first and it doesn't happen, then they have prevented it.
Speaker 2 (19:39):
Like TSA prevents all these terrorist attacks.
Speaker 1 (19:42):
Just like they prevent all of these things. But you can...
I've gotten through TSA...
Speaker 2 (19:49):
Gotten through TSA with a box cutter, a box cutter
multiple times.
Speaker 1 (19:53):
What are you guys doing? Why do I see that
when I get to a hotel and go, what the fuck?
Oh my god, how did I get here with this?
Speaker 2 (20:00):
You know, on their own, those things aren't necessarily bad,
because I think having an echolocator that tells you which
direction a gunshot came from... because if you've ever heard
a gunshot, I have, I don't know if I've heard
it up close and personal, you don't know where they're
coming from.
Speaker 3 (20:18):
They echo. Great, but when it's used in an already
biased system with biased results, you're going to create bias,
which is another ethical concern with AI, because these things
have bias.
Speaker 2 (20:33):
It's actually in my notes: one of the biggest
inherent ethical problems with AI is the bias,
because it is still data sets that are input by people.
And it's the same with research, with scientific research: if the
researcher is biased, they're going to put in biased results.
(20:54):
And that's not even necessarily a bad thing, because bias
can go both ways.
Speaker 1 (20:59):
It's literally just and the effect is.
Speaker 2 (21:03):
You can't get every single data point ever made. Humans have
confirmation bias: oh, this fits my hypothesis,
put it in here. So, all in all, if they
were not being used in an already biased structure, an
already racist and targeted structure, they wouldn't necessarily be bad.
But the fact is that we already have this existing
Speaker 1 (21:26):
issue with the human beings that are literally part
of this. Absolutely, this is what the NAACP brought up.
This is their concern.
Speaker 2 (21:35):
Because, listen, everybody would love to live in a safer neighborhood.
Everybody would. But it's not the people there making it dangerous,
because data has proven that when areas are over-policed, more
violence happens, and the violence is at the hands of
the police. Because what do you have now? You have
this prediction thing: it is a Saturday, we have more
(21:57):
things like... Now, mind you, nobody likes to talk about
how both Chicago and Baltimore, the two most notoriously
dangerous cities in this country, now both have Black mayors,
and the crimes are at all-time, unprecedented lows. We
don't want to talk about that. But yet they're
gonna use these things in those areas, and they're gonna
go into Black and brown neighborhoods and say, oh,
(22:17):
on Saturdays y'all don't know how to act, so now
we're gonna be here, and our hands are already
on our guns.
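To make that feedback loop concrete, here is a deliberately toy sketch (not any department's actual system; every name and number in it is made up) of how a predictor trained only on past report counts can feed on itself: the area with more historical reports gets more patrols, more patrols generate more recorded incidents, and the prediction keeps "confirming" itself even when the underlying rates are identical.

```python
# Toy illustration only: invented areas, invented numbers, not a real system.
import random

random.seed(0)

# Two hypothetical areas with the SAME true underlying incident rate.
true_rate = {"Area A": 0.10, "Area B": 0.10}

# Historical report counts are skewed because Area A was patrolled more in the past.
reports = {"Area A": 40, "Area B": 10}

def predict_hotspot(report_history):
    """The 'model': rank areas purely by past recorded reports."""
    return max(report_history, key=report_history.get)

for week in range(1, 11):
    hotspot = predict_hotspot(reports)
    for area, rate in true_rate.items():
        # More patrols in the predicted hotspot -> more chances to record an incident.
        patrol_checks = 30 if area == hotspot else 5
        reports[area] += sum(random.random() < rate for _ in range(patrol_checks))
    print(f"week {week}: hotspot={hotspot}, recorded reports={reports}")

# The data ends up measuring where the policing happened, not where the crime is.
```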
Speaker 1 (22:23):
Correct. And the other issue that the NAACP brought up,
because the machine learning and the fact that it's human
beings that have bias, that is their first concern. The
next is the lack of transparency, because in these spaces
there are no people reviewing this data outside of that
law enforcement blue brotherhood. And I'm gonna call it that
(22:45):
because, you guys know how... but with that in mind,
they call it a brotherhood, and they do, they really
do that thin blue line thing.
Speaker 2 (22:53):
That is... so they're brothers in blue. The only ones in blue I
care about are the mailmen and women.
Speaker 1 (23:00):
What about the trans cops and intersex cops? Yeah, you guys...
Speaker 2 (23:04):
Sorry, sucks to suck. The only cop I'd give a
fuck about, his name is Rufus. Did he get a choice? He didn't.
He didn't get a choice. You had every right not to put on
your blue uniform; Rufus didn't have that choice. Anyway, we
will get... that's a different tangent.
Speaker 1 (23:20):
It's transparency. And I actually did a paper on this
exact topic, and the fact that so many of these
spaces that were largely white-populated but had very dense
poverty areas, or areas of Black and brown people, were
experiencing this additional concern for their safety at the hands
of law enforcement because of AI and a lack of oversight. If
(23:43):
this is something that is happening in your area, if
you've heard about it, if you've never heard about it,
definitely look it up. But if it's something that... all...
Speaker 2 (23:49):
Meetings, if you can. And, like, listen, and demand transparency.
Speaker 1 (23:53):
All meetings. And when you demand transparency, ensure that you
are bringing up the fact that it should be citizens
that are reviewing these things, that should have access, and...
Speaker 2 (24:05):
They should have access to all the data that you
have, that you have, like, okay? And they want
to know, too. Like, if there's some shit going on
in my neighborhood that I don't know about but you
know about, I would like to know if I'm gonna get broken into.
That's a big problem here. Like, be transparent with it. If
you're not doing anything wrong, why aren't you
(24:28):
telling us what you're doing? And that's the problem.
Speaker 1 (24:30):
She speaks the truth, y'all. But yeah, that's what I had.
It's the preventative policing. There's nothing wrong with policing when
it's done correctly... that is not my personal feeling, well,
police in general started as slave catchers, but this is like an added
militarization of our police forces. We're already dealing with the
(24:50):
stress, we as citizens are already dealing with our oppressive government.
God damn it, can we not go home without being
concerned that we're going to be predictively policed?
Speaker 2 (25:00):
Also, where are you getting this data from? Say I just
moved into this house, and your data says that
this house was a crime house, but they moved out.
Are you going to come raid my house with my
children sleeping? Are you going to come in guns blazing
into my house when I just moved here?
Speaker 1 (25:16):
Like, actually, I would like that to happen, because here's why.
Because here's why: not because it'll traumatize your family, but
that payout would be solid. Oh, listen. Okay, Crumpton. But
I live in...
Speaker 2 (25:30):
A town... our budgets aren't that great. I would need
that to happen in, like, New York City, where they
have an unlimited lawsuit budget.
Speaker 1 (25:36):
Yeah, your budget might not be that great, but your
police department will have insurance.
Speaker 2 (25:41):
That insurance can pay for it.
Speaker 1 (25:45):
But that's not what we're talking about.
Speaker 2 (25:49):
All right. Now I have to, like, reset my brain,
because we kind of, like... So, I'm mainly going to
talk about generative AI, as far as ethical concerns
go, mainly with plagiarism. That's a big thing with the WGA,
the Writers Guild of America, and one of the reasons
they had their strike. Right. So, for example, plagiarism
(26:12):
is a big problem in feeding these AIs, because they're
scraping websites. Like, AO3 has been scraped multiple times,
feeding this fan fiction into these AI models. Now, mind you,
fan fiction is already a very gray legal area, and
(26:33):
so now you're using this... it's technically double copyrighted, because
the characters are copyrighted, but then the individual ideas that
are written in these belong to the person who wrote them.
There's a reason why AO3 actually spends a lot
of money on lawyers, and I fucking love that they do,
and it's really just to protect, you know, fan fiction
(26:53):
authors and make sure that they don't get caught up
in some nonsense. But they're using that. And did you
know that any web-based documents, stuff you put in...
you've noticed in the past year they've all updated their
terms of service. Your Google Docs, your Microsoft Word Online.
They have access to every single word you put into
them, and they feed them to their AIs, every single
(27:16):
one: your Google Docs, your Google Forms, your Google Sheets.
So if you're using any of these web-based programs,
they are taking that information. And people put privileged information
in these things. They have all that, and that is
being fed into these models.
Speaker 1 (27:34):
I would hope that AI would steal my identity and
make it better.
Speaker 2 (27:37):
Honestly, how about you steal my credit?
Speaker 1 (27:39):
That's part of that identity I was talking about.
Speaker 2 (27:41):
So, a little word to the wise: do not
put any privileged information into any of these web-based documents.
And the thing is, now, in colleges, right, all
these colleges have these AI policies. Mind you, at almost every
college you go to, included in your tuition will be
an
(28:04):
online, web-based Microsoft package. Your papers that you're putting in there are
being scraped to feed AI models. Now, the bigger problem
is you can't use AI. So what are we doing?
We're going to use these AI detectors and plagiarism detectors.
The plagiarism detectors are also AI, to say whether or
not you stole this or you used ChatGPT to
(28:25):
write this. Right. Students are losing their scholarships, they are
being expelled, because these things are falsely flagging it as AI.
I saw one thing where this girl had to go
through multiple hearings to get her scholarship back. Now,
mind you, these plagiarism and AI detectors say that they're
not one hundred percent accurate. They say this is only
(28:48):
a tool. Right? So why are you taking it as gospel?
Speaker 3 (28:51):
Now?
Speaker 2 (28:51):
Mind you, she wrote her thesis, or her paper, in
these web-based docs, where you could see every
change that was made. There was no copying and pasting
of chunks. You could see where she wrote it, where
she took out a word, where she did that. And
the thing is, these are smart people. The smarter
you are, the more likely your paper is to be
flagged as AI. Why? Dashes. That is one of the
(29:13):
things they flag, and they can take those out of my
cold, goddamn hands.
Speaker 1 (29:18):
Right, I'm smart, and that's what.
Speaker 2 (29:20):
It is now, mind you?
Speaker 1 (29:22):
Like?
Speaker 2 (29:22):
So I actually put one of my papers
through this. It came up as fifteen percent plagiarism. You
want to know what? Now, mind you, this is an
AI model that detects plagiarism, so it runs it against
the internet. Right? The fifteen percent was my school's name
and my sources. It flagged my sources, my quotes that
(29:48):
are quoted and properly sourced, as plagiarism.
Speaker 1 (29:53):
I've had that happen to me before, and it is
so... you know what I did?
Speaker 2 (29:56):
With those professors that make you run it
through before you give it to them? I run it through,
I just copy and paste without my resources and my
title page, and it's like one percent, two percent, because
it picks up that stuff. But it's like, really? Right. Now,
before I get into entertainment real quick: a lot of
people are going to be like, well, I don't know
how to draw, or I'm
(30:18):
not really creative enough to write, so I have to
use generative AI. Knowing what you know now about how
much energy this takes, right, and how it's taking
and stealing from other people: you don't have to use it.
You don't. There are several options. And especially, listen,
if you're not profiting off it, I honestly don't give
a shit personally. If you're writing a fan fiction and
(30:40):
you use AI to generate yourself a little picture, I
don't care. A lot of people do and feel very
strongly about it. Listen, I'm not putting effort from my
soul into caring about that. But here's the thing. I
can't draw to save my life, but if I felt
that strongly about having original art, I'd pay
somebody to do it who can draw. You got this.
(31:03):
No one's an expert at something right away. It takes
something like ten thousand hours to become an expert at something.
If you spend ten thousand hours drawing and you still
can't draw, there's no hope for you, but you can
still pay somebody else to do it. And you know what,
or simply just write or draw anyway. Yeah, just... anyway.
Are you writing a book, writing a fic? Everybody's first
draft is ass. It is a mess. All that matters is
(31:26):
you get it down on paper. Or, guess what, you
can't do that? Just buy someone else's book. It's
not original anyway, I hate to break it to you.
Speaker 1 (31:35):
Yeah, that's so, that's so true.
Speaker 2 (31:39):
So also with audiobooks, you know what they're doing.
Speaker 3 (31:43):
Now.
Speaker 2 (31:43):
If you are an independent self publisher on Amazon, when
you upload your book, you're given the option of having
an AI audiobook.
Speaker 1 (31:51):
So I actually did that, not with an audiobook, but
when we went out of the country, I forced all
of my advocates that work for me to create PowerPoints,
and then I used AI to voice all of
them, because, like myself, most...
Speaker 2 (32:07):
Of my advocates sound like freaking hillbillies. But you know,
to me, that's one thing because you are not using
it for profit. You're just making a PowerPoint at work.
Work better, right.
Speaker 1 (32:19):
For sure?
Speaker 2 (32:20):
So this author can choose to have AI, for nominal or
low cost. I don't know how much it costs or
if it costs them anything, right? So they can have it.
But AI doesn't have inflection. That's one thing
they have not gotten yet: how to add human emotionality.
So you're gonna be reading something and it's gonna be like,
(32:40):
"Don went into the woods and he was scared. Oh
my god, he saw something. Something went boom. He peeped,
so he ran away." Like, who wants to listen to
that for twelve hours? It will not be... And you know,
I love me an audiobook, and once you find, like,
a narrator that you love... Anyway, also, a little
(33:05):
quick side note: if you don't want that annoying Google AI
when you google something, just swear in your search. Just
say, "what the fuck are green beans?" and it will
not show up. You'll just get your results. Yes. And
you know, before all that, I really
didn't put that much energy or thought into how
(33:26):
much our internet use impacts the environment.
Speaker 1 (33:29):
I can't. Please, don't do that to me right now.
Speaker 2 (33:31):
I'm not gonna stop using the Internet, but, you know,
it is what it is. So, both the Writers Guild
of America and the Screen Actors Guild had strikes in
twenty twenty-three. We all remember it. Our shows were all
delayed till the spring. It was a mess. Movies
were postponed. We're just getting Wednesday next month, after a
year and a half, you know. And a big part
(33:54):
of both of these was AI and AI regulations. For the WGA:
are they going to be using AI in the writers' room?
Are we not going to be getting
paid for our work? And, more, using AI as a
tool rather than as a replacement. So, the creativity. One
major problem, like I said before, is bias. And one
(34:14):
thing that the Screen Actors Guild is really working hard
towards is more equality, diversity, and actual representation in the
stories they're telling. But if it is being fed biased material,
they're going to get biased results, and it's just going
to be a hamster wheel over and over, which is
one of the reasons why a, well, a very diverse
writers' room is paramount. You need to have Black people
(34:38):
in the room, you need to have Hispanic people in
the room, Asian people in the room, queer people in
the room. Any demographic that you are writing
for needs to have a representative of that community in
the room, because that is more important than anything. Like,
just for example, with 9-1-1: you
could tell when Athena and Hen have their conversations, you
could tell that there's a Black person in the room,
(35:01):
and you could tell when there's not. You can
tell when there is a scene in any show or
movie with Black characters that was written by white people.
Speaker 1 (35:11):
With that, it's just it's a matter.
Speaker 2 (35:13):
Of lived experience. If you haven't lived that experience, how
do you know how to write it?
Speaker 1 (35:18):
How do you know what.
Speaker 2 (35:19):
Wrong with that? There's nothing wrong with being who you
are and knowing that you don't have all the experience
necessary and get to get a multitude of voices. I mean,
that's all it is. At least at minimum, get some
sensitivity writers in there. Please also plagiarism, copyright infringement. They
are being fed copy written material. They're going to start
(35:42):
repeating stories and plots that are copywritten, so that's another problem.
They did come to an agreement obviously, to make sure
that it is not replacing people and that they are
just tools that are used as an additional help rather
than a hindrance. It gets a little bit more detailed
because SAG is really about replicas. So AI, what if
(36:07):
an actress doesn't want to get naked, I could just
make them naked. What if they're not comfortable doing sex
scenes or listen? An actor? A person has autonomy, plain
and people they have autonomy, and whether or not you
agree with their why, there are reasons why. Like Candice
DJ whatever the fuck her name is, she's ultra conservative,
(36:30):
right and whatever. That's her right to be. But if
her values don't have her doing something and you use
AI to have her image and her likeness do that,
that is a major ethical violation. And I don't care
the reason why she wouldn't be uncomfortable with that situation,
The fact is that she didn't signed up to do that.
That is not okay.
Speaker 1 (36:51):
Not okay.
Speaker 2 (36:52):
So there are two types of replicas. The first
is an employment-based digital replica, and that's basically: we're
filming Abbott Elementary and we're just going... we actually have
this scene we want to do, it's not really safe,
so we're going to do it using AI. So we're going to
go ahead and we're going to make a replica of you.
What the SAG agreement says is that, for the actor, that
(37:16):
is a separate stipulation, either on a separate contract or
as an addition to your regular contract with an additional
signature line, and it has to be for each instance
that it is used. So they have to sign on
knowing exactly what their image is going to be used for.
So if they're like, well, this is going to be
a sex scene, and you're like, well, I'm not comfortable
with that: no. And they can't, they can't pull back
(37:39):
a contract for that, because these are separate contracts. So
they can't just say, well, you're not going to let
us use your image? Guess what, you already have
your original contract. It is separate. They have forty-eight
hours minimum notice per instance. The bigger problem comes with
hours minimum notice per instant. The bigger problem comes with
independently created replicas. So this is just you using existing materials.
(38:03):
Let's say we're I think this was an issue with
the Final Destination. Let's just say so they have their
images from these previous movies and they're like, you know what,
we're gonna We're not gonna pay you, but we're gonna
add your image to this new movie and you're gonna
be starring in it. But you're not gonna get any
money because guess what you didn't do any work, and
(38:25):
that was That would have been the problem before this
new thing. So with the entertained, with the entertaining employment based,
they have to actually get paid even if they're not
for their image and their replica. Say that scene would
have taken two days to film, they would have to
get to daily pays at their regular rate for that.
(38:45):
So they're still being paid for this and they have
to get any time it's used, they have to have
a new contract. The thing with the independent ones because
it's tricky because that is copywritten material, copyrighted, copyright in whatever,
and owned by the studio. It's a little bit more
wishy washy. The only big difference is that they have
(39:06):
to obtain a separate consent each time it's used. They
can make them, but they have to have the consent.
Only their pay is not regulated. The pay is negotiated
between the studio and a representation of the talent. Listen,
it could go either way. Do I am not an
entertainment lawyer, I don't know, but listen, ethics on this
(39:30):
generative AI actually with how good it is now, like
we have deep fakes, we'd half the time we don't
even know what's real anymore. And that's crazy, and that
is such a big problem. And I think maybe we
can do another episode to kind of touch on these
deep fakes and really go into that. But if you
could imagine what a studio, what a studio money can create.
(39:53):
Just one episode of these network shows cost millions of
dollars just in payroll. This is a very They have money.
If they wanted to, they could. So I leave you with a
call to action. Whenever we get a government back,
we need to start demanding actual legislation, even if
it's just in your local government. Because, you know what,
(40:15):
maybe we can't do anything federally, but you sure can
go to your local town halls. You can go to
your congresspeople. Write those letters. Listen, I love me
a congressional letter. I write lots of them, and I
hear back from every single one I write. And just
learn how to talk to people. So if you're talking
to your Republican representatives, just use their buzzwords, like
(40:37):
"red-blooded" and "American" and "hardworking." You know, you've got to
use those words that they like to hear. And say,
you know, we're talking about a data center coming into
your neighborhood? No, no, no. Tell them, you know, you
are a red-blooded, you are a blue-collar worker, you
risked your life in 'Nam, I don't care what it is.
But you have to talk to them using words that
(40:59):
they like to hear, and just say, listen, this is
not good for our state, for our environment. Do
some research, find out in your state how much
this is going to negatively affect things and how much money
it is going to cost the state and local municipalities.
That's what they listen to. So they're like, oh, we're
going to get taxes from this. But then you're like, well,
(41:21):
actually, you're going to get X, Y, Z taxes, but
it's going to cost you twenty times more. They're going
to rethink that. Start with your local municipalities, work your
way up to your state. It's a lot easier in
a smaller state, but there are people actively keeping these
places out of their neighborhoods, and you can too, and
you should. You should. Listen, they don't need to be
(41:42):
in bumfuck Connecticut, I assure you, they really don't
need to be. Listen, we don't
have a lot going right now, but we at least
want to flush our toilets, at bare minimum. I want
to open my tap to fill my Brita filter up
so I can drink some water, and people who live near
these data centers can't even do that. So we've tried
to keep this to a half an hour, but it
(42:03):
wasn't a half an hour. But it was a good conversation.
Speaker 1 (42:06):
I think this is fantastic.
Speaker 2 (42:07):
I forgot to tell you about my AI thing. So
we're gonna pull up this document, right.
Speaker 1 (42:12):
I've got it ready to open, all.
Speaker 2 (42:14):
Right. So we're talking about AI writing fan fiction.
AI writes fan fictions now. The fuck? That's one thing:
I need you to not write fan fiction with AI.
Speaker 1 (42:25):
Just do it.
Speaker 2 (42:26):
I will read your shitty fan fiction, okay? As long
as you know what periods are and what paragraphs are,
I will read it. A...
Speaker 1 (42:34):
Speaker per paragraph, you guys. Just this one, just one. Please.
Speaker 2 (42:39):
If two people are talking... I don't care if
they interrupted each other: separate lines. But listen, listen.
Just write it yourself. It's fan fiction, okay? It's better than
you think it's gonna be. So, we also have AIs
writing books. That's another thing I was gonna read to you,
I completely forgot. So we're gonna wind this shit back.
(42:59):
It's gonna be an extra five minutes, okay. So, I
went and downloaded an AI book on Amazon Kindle.
Actually, the story's not bad, but it just kind of
feels clunky. So I'm gonna read you, like, a page
of it so you can see what I mean. Like,
it isn't bad, it's just... you can tell. Yeah,
there's no emotionality to it. Where the fuck
(43:20):
is my Kindle app? Okay, here we go. Okay. This
is from Chronicles of Altera by Josh Westerholme, AKA AI.
Chapter one, "The Crossing." The fluorescent lights of MIT's
Applied Quantum Physics Laboratory buzzed with an intensity that matched
Kai Nakamura's concentration. Three a.m., and he was still
adjusting the parameters of his experimental setup, a complex array
(43:43):
of superconducting equipment arranged in a pattern that resembled
a metallic mandala. His dark hair fell across his forehead
as he leaned in to make one final calibration, gray
eyes narrowed behind protective goggles. "Talking to yourself again, Nakamura?"
Doctor Santos asked from the doorway, her sudden appearance making
him straighten. "Just working through the lambda phase calculations,"
Kai said, brushing his hair back with an unconscious gesture.
(44:06):
"The quantum tunneling effect is showing some unexpected properties in
the higher energy levels." Doctor Santos checked her watch. "Most
people save their breakthroughs for normal business hours." "Most people
aren't three years into a five-year project with nothing
groundbreaking to show for it." Kai smiled to soften the edge
in his voice. His advisor meant well, but she didn't
understand the pressure he felt, the weight of family expectations
(44:27):
back in Japan, the competitive drive that had followed him
since childhood. "Just don't blow up my lab," she yawned,
"and remember the department meeting tomorrow. Today, actually. At eleven."
After she left, Kai returned to the experiment, a study in
quantum field interaction that he had been developing since his
second year of graduate school. Standard physics, if somewhat ambitious
for doctoral research; nothing that should open a pathway between worlds.
(44:48):
At three twenty-seven a.m., he initiated the sequence. The
superconducting magnets hummed to life, conducting a containment field
around the central chamber. Kai watched data scroll across multiple monitors,
nodding as readings matched predictions. Everything proceeded exactly as calculated,
until it wasn't. So, it's clunky.
Speaker 1 (45:05):
So it's clunky, but you've got great inflection there. Here's
the thing. It's clunky and it's extremely detailed.
Speaker 2 (45:15):
This is literally not even one page, like... it is very...
that's... it's a hard open. You know, the
first line, the first paragraph in a book is meant
to capture your attention, and it just... it read fan-fiction-y
to me. Fan fiction doesn't have to do that,
because you already know the world, you can go right
(45:37):
into it and explain your AU in your first paragraph.
Speaker 1 (45:42):
And it was also so... Stereotypes can be good or bad.
Speaker 2 (45:46):
A little... yeah, a little stereotypey. I'm not really comfortable
with the Asian stereotyping from Westerholme, I'm sorry, what the fuck? Hello?
I'm not muted. No, we're almost... no, I can hear you.
So now we have books on Amazon that are just
spamming everything, and there's plagiarism. Here's one, too: a steamy,
(46:11):
emotional firefighter romance. "Firefighting is about more than just running
into burning buildings. It's about loyalty, trust, and knowing who
has your back when everything is on the line. Meet
a tough, confident rookie at Station one-eighteen, determined to
prove himself. But beneath the adrenaline and bravado, there's something
deeper smoldering inside: the job and the complexities of relationships.
(46:32):
He finds himself drawn into a journey of self-discovery,
passion, and love in unexpected places. From high-stakes rescues
to late nights, it's the people who challenge, shape, and
change him forever. In the end, family isn't just
the ones you're born into, it's the ones you choose.
And sometimes the greatest risks lead to the most rewarding rewards." Gosh,
that wasn't written by ChatGPT at all. Now, this is a real thing that
(46:52):
is on Amazon, and the picture is obviously AI. It is
using their logo, their trademarked logo. Evan "Buck"... Ethan "Buk"
Buckland, at Station...
Speaker 1 (47:04):
This is... because I don't know if people will be
able to see it, but it is a book that
they sell on Amazon that is very AI-generated.
Speaker 2 (47:14):
Yeah, yeah, we'll find a way to put this on socials,
but it will be... it will be... So, welcome to
AI plagiarism. This is just a trifle, at least, of
the lot. 9-1-1 is not trademarked, nine one
one is the emergency number; 9-1-1 with the
little dashes and that button with the circle around it,
that's trademarked. And Buck? It's spelled B-U-K rather than
(47:36):
B-U-C-K. That is enough for tonight, because, you know what,
I'm gonna play that song about not doing drugs. Don't do
drugs, and go to sleep. Amen, and amen. Amen? All right,
so, as always, remember that you are that bitch. You'll
forever be that bitch. Think about your environmental footprint
when you're using that AI. Drink your water, might as
well drink it, because...
Speaker 1 (47:55):
We're not gonna have it too much longer.
Speaker 2 (47:58):
And with that, just remember it is okay to smile, it
is okay to laugh. We encourage you to dance and
just enjoy the small things, because the big things suck.
Have a great week. Good night, y'all.