Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
EJ (00:02):
Hello and welcome back, our fabulous listeners. We're continuing on our bookish discussion in Tales from the Orc Den today. May be broken up into several recordings. We'll see how deep we get into this. Um, I am one of your co-hostesses, EJ, and with me is
(00:23):
Stacy.
Say hi.
Stacy (00:25):
Hi, Stacy.
EJ (00:29):
And Amy.
Hello,
Amy (00:32):
everyone.
EJ (00:33):
I was always talking to
people named Stacy
Stacy (00:34):
per EJ's request
EJ (00:37):
So today we're going, we're getting into a topic that I have been, like, chomping at the bit to get into for, like, months now, and that is AI. And specifically, we're going to be talking about AI in the context of the indie romance world.
(00:58):
If we were to just talk about AI in general, then suddenly we would be a tech podcast only. And that's not what y'all subscribe to.
Stacy (01:07):
Boring
EJ (01:07):
as
Stacy (01:07):
fuck.
EJ (01:08):
So, so yeah, it's, but AI still, it's been kind of a big topic, and I think we're a nifty bunch to come together to talk about it. For me, I have been a total news hound about AI, in part because I pay my bills with my day job of being in the tech world,
(01:32):
I have had experience working with AI algorithms. The point is, I'm a bit of a tech nut in this realm. And, and Stacy and Amy are less so, which is huge, because fuck what Silicon Valley is saying. How is it affecting the rest of us?
Stacy (01:50):
Plus, let's be honest,
we're more interesting than
Silicon Valley.
And it's true.
So true.
EJ (01:55):
It absolutely is.
And I, it's here.
It's growing.
I have, you know, whether we like it or not. And so we're going to have to deal with the consequences, I figure.
So let's.
Let's, let's just talk about it
Stacy (02:10):
really fast.
I would like to interject a complete non sequitur. I'm looking at the latest picture that Haley, hey, Haley, what are you doing, uh, submitted on Finley Fenn's not safe for work channel in the forum.
And I'm really impressed with how this angel guy that's fucking the pink haired girl, how they animated his ball sack swinging.
(02:30):
I just. I really, like, "AI can't do swinging nutsacks," that was a, I just pulled that literally out of my ass, but I'm just really impressed that his nutsack is mid-flap, like, well done to you, Iona,
EJ (02:45):
Now I feel obligated to ask for that link so I can put it in the show notes, Stacy.
Stacy (02:51):
I'll send you the link to
their Twitter.
Iona is awesome though.
She is
Amy (02:57):
also excellent plus size
girl rep.
Yes,
Stacy (03:01):
yes, yes.
And as a plus size girl, fuck yeah.
EJ (03:06):
So how I'm looking at starting this discussion is with the really basic question: what is AI, anyway? Um, let's just lay that on the line. I'm going to start with, um, my, uh, very academically correct techie definition, and then we're going to go into what
(03:29):
everyone else thinks. Um, so AI, or artificial intelligence, is a branch of machine learning and computer science. It focuses on taking training data and building on that training data with computational variations with an algorithm to tackle complex operations typically required by humans.
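For the technically curious, EJ's definition (training data, plus an algorithm that iterates on it) can be sketched in a few lines of Python. This is an editor's toy illustration, not anything from the episode: a one-parameter model learns y = 2x from four example pairs by gradient descent.

```python
# Toy version of "training data plus an algorithm": learn y = 2x.
training_data = [(1, 2), (2, 4), (3, 6), (4, 8)]  # (input, answer) pairs

w = 0.0             # the whole "model" is one number: predict y as w * x
learning_rate = 0.01

for _ in range(1000):                    # repeat many passes over the data
    for x, y in training_data:
        error = w * x - y                # how wrong the current guess is
        w -= learning_rate * error * x   # nudge w to shrink squared error

print(round(w, 2))  # w has converged to roughly 2.0
```

Real systems have billions of parameters instead of one, but the loop is the same shape: guess, measure the error against the training data, adjust, repeat.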
(03:53):
Um, so whenever I think of AI, it is primarily, it's, it's the software. It's, there is nothing magical in my head when I, at this point, uh, like a decade into, in the, in the tech world. I did start as a humanity. Right.
Stacy (04:12):
Well, that's just it. You're tech. You're, you're someone who is interested in and exposed to technology. So you're going to know more than the average layman, i.e. me, is going to know about AI.
EJ (04:25):
There is a certain bit of, I, I will admit there's a certain bit of magic-box-ness when it comes to very complex algorithms. Mm-hmm. I do admit I am in the camp of, you know, those who argue for things like singularity or transhumanism. Mm-hmm. This idea of AI becoming sentient.
(04:47):
Our brains merging with AI seamlessly. Matrix. Yeah, very matrixy. I admit, I am in the camp that is skeptical, but I'm specifically skeptical that it could happen in a binary computational environment. What the hell is a binary computational environment? It is all the computers that we use, all of our software, all of
(05:10):
our hardware. We use binary, but the computers speak to each other, right?
Yeah, yeah, and how we talk to
Stacy (05:17):
computers.
EJ (05:18):
Yeah, so computers, when they are thinking at the very lowest, lowest level, it's all zeros and ones.
Amy (05:24):
Right?
Hence, binary.
That's the kind of stuff my husband is interested
EJ (05:28):
in.
One time I sent my, my husband a cross stitch and it was written in binary and he
Amy (05:37):
decoded
EJ (05:38):
it and it said send nudes.
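The cross stitch gag is also a tidy demo of what "all zeros and ones" means in practice. A quick Python sketch from your editor (not from the episode): each character becomes an 8-bit pattern, and the same mapping runs in reverse to decode it.

```python
def to_binary(text):
    # each character becomes its 8-bit ASCII code point pattern
    return " ".join(format(ord(c), "08b") for c in text)

def from_binary(bits):
    # split into 8-bit groups and turn each back into a character
    return "".join(chr(int(b, 2)) for b in bits.split())

stitched = to_binary("send nudes")
print(stitched)               # the rows you would actually cross stitch
print(from_binary(stitched))  # decodes back to the original message
```

Everything a binary computer stores, from smutty cross stitch patterns to this transcript, bottoms out in patterns like these.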
Stacy (05:40):
That's fucking sicker.
I love it.
Did you guys ever watchFuturama?
Amy (05:46):
Yes.
They were so good.
Stacy (05:48):
Where Bender inherits the castle, but they have to stay the night. And there's a thing written in binary in blood, and he goes, "All I saw were zeroes and ones, and I think I saw a two." And, that just popped into my head when you talked about binary.
Yeah.
EJ (06:07):
So, pretty much every computer that we, we interact with is talking to itself and each other in zeros and ones. Right. There are other computers out there. They are called quantum computers. Those are really only available to certain governments and some
(06:31):
high tech labs. They are a completely different beast entirely.
Stacy (06:39):
They're not binary?
EJ (06:40):
Okay,
Stacy (06:43):
So here's a question that has absolutely nothing to do with anything. Can these quantum computers talk to binary computers, or is this, like, a completely, like, there, there, there will be no communication between their people?
EJ (06:55):
So, uh, I don't know about binary computers being able to interact with quantum computers, but I, I'm under the impression that quantum computers can talk to binary.
Stacy (07:10):
Okay, which would make
sense.
That's a I guess it would
EJ (07:13):
be sort of
Amy (07:14):
backwards compatible,
unlike quantum computers, which
could not
EJ (07:18):
be
Amy (07:18):
forward compatible.
EJ (07:19):
And really what it just comes down to is, these quantum computers, one of the reasons, there are many reasons, why they are, they're currently in a stage where binary computers used to be, like, you know, in a galaxy far, far away, right? They, they take a crap ton of space. They.
(07:41):
They essentially need to use atoms to communicate. That's an overly simplified way of describing it, but it means that they are very delicate. They need the right environments to work with, but they're able to come up with results that go beyond zeros and ones. Um, so it's more.
(08:01):
It's more infinite possibilities. It's, uh, a lot more of how, like, our brains work. There's a lot. There's literally neurons hitting at
Stacy (08:11):
each other.
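EJ's "beyond zeros and ones" point can be faked on an ordinary laptop just enough to see the idea. An editor's sketch (real hardware uses atoms or ions, not a random number generator): model one qubit as two amplitudes, so it is not 0 or 1 until measured, and each measurement picks a side with probability equal to the amplitude squared.

```python
import random

random.seed(42)  # fixed seed so the demo is repeatable

# an equal superposition: the qubit holds both possibilities at once
amp0 = amp1 = 0.5 ** 0.5   # each amplitude squared gives probability 0.5

def measure():
    # measurement collapses the superposition to a plain old bit
    return 0 if random.random() < amp0 ** 2 else 1

results = [measure() for _ in range(10_000)]
print(results.count(1) / 10_000)  # hovers around 0.5
```

The spooky part the hosts gesture at is that a real machine keeps many such amplitudes interfering with each other before the collapse, which is where the extra computational power comes from.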
EJ (08:12):
So it's a wee bit spooky. Um, but cool. I could see that being a little. Spookity. Yeah, I agree. Yeah. Um, whenever I've heard people describe quantum computers, it always, for some reason, like, they're completely different things. I want to be very clear. It's not the Hadron Collider or anything in CERN.
(08:34):
Okay. But it does make me think of that, because it's, it's a similar thing where it's like you're, you're smashing atoms at each other. And, uh, you know, it just takes a very, you can't really, like, shove that into a tiny laptop and then go to Starbucks with it. So, it would
Amy (08:54):
take over the Starbucks,
wouldn't it?
Stacy (08:56):
Right, right. They're huge. Are these huge in the way that, like, people thought computers were in the 60s, where it was like a whole room?
EJ (09:03):
Yeah,
Stacy (09:04):
absolutely.
Yeah.
With real data and punch cards?
EJ (09:07):
Yeah.
So I could see maybe them becoming commercialized if they, if we could, like, adapt them to a cloud environment. However, there are other huge implications involved. One of them is they are so computation, computationally, efficient and effective. If you want to freak out a security researcher, AKA a
(09:31):
hacker, ask them what they think about quantum computers.
Stacy (09:35):
Oh,
EJ (09:35):
really?
Yeah.
Is this like
Stacy (09:37):
the boogeyman of the tech
world then?
EJ (09:39):
It's, it's, I don't know if it's so much of a boogeyman, it, but it would have giant implications on our commercial environment from a security standpoint. Just because, you know, we already have really good software programs out, on, like, the black market, and various
(09:59):
other places where you can purchase software to do things like hack into people's accounts. Those are already great as they are. If you get an even better computational mind doing that, you're fucked.
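The security panic EJ is describing comes down to math like this. A lot of today's encryption (RSA, for example) banks on the fact that splitting a big number back into its two prime factors is brutally slow on binary machines, while a quantum computer running Shor's algorithm could do it quickly. An editor's sketch of the slow classical way, with a toy-sized number rather than the several-hundred-digit ones real keys use:

```python
def smallest_factor(n):
    # trial division: check 2, 3, 4, ... up to the square root of n.
    # the work grows with sqrt(n), so adding digits to n blows up the
    # effort - fine for this toy, hopeless against real key sizes.
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

n = 2021          # a toy "RSA-style" modulus: the product of two primes
p = smallest_factor(n)
print(p, n // p)  # prints: 43 47
```

Defending with binary machines means betting on that slowness; a quantum attacker simply isn't bound by it, which is the knife-to-a-gunfight point below.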
Stacy (10:14):
Well, yeah. It would be like putting a tissue up to keep a burglar out is kind of what it sounds like. Right, right. Um, yeah.
EJ (10:22):
So it, in, and in those cases, you're using things, you, you would be using, uh, binary computers to protect against quantum computers, and it's just not going to happen. It is very much bringing a knife to a gunfight situation.
Right.
Stacy (10:38):
It's, it's like, it's like if binary computers can only go back and forth and side to side, but quantum can come at diagonals as well.
Right.
EJ (10:45):
Yeah.
Yeah.
It actually very much. Quantum does not give a fuck. Um, so, um, and, and also, whenever you're trying to look for more information on quantum computers, because it's usually, like, the US government who's been doing a lot of research on it, it tends to be really vague and cagey. I love going.
(11:06):
Of
Stacy (11:06):
course.
EJ (11:07):
Yeah.
Classified.
Yeah.
So like, you know, if you want to learn more about this sort of stuff, join the NSA. Uh, uh, don't, don't come to your friendly smut sluts for this, like
Stacy (11:23):
Fuck yeah.
Smut sluts.
EJ (11:25):
Yeah.
Stacy (11:27):
Yeah,
EJ (11:28):
your, your friendly neighborhood smut sluts are not going to, uh, know nearly as much as the goddamn NSA
Stacy (11:36):
Yeah.
Especially not a mo Yeah.
EJ (11:39):
So I, I'm over here like, I can appreciate where some people do kind of, like, go into that next realm of thinking of, like, you know, transhumanism and stuff. I'm like, that is cool. And I totally appreciate where you're coming from. I'm just of the feeling of, like, I, I mean, maybe, but I am just highly skeptical of it happening with a binary
(12:02):
computational environment.
Stacy (12:03):
Okay.
But, but quantum, potentially, potentially, potentially.
Right.
EJ (12:08):
And I suppose that's, that's as, like, imaginary as I get. And of course, I fricking love cyberpunk. Um, but you know, that's, say it again.
Stacy (12:16):
definition of what it has to be to actually be cyberpunk, because I had no idea about this until you told us this.
Oh, okay.
EJ (12:22):
We share that.
That's fun.
Um, so cyberpunk, we, we had this great discussion, listeners, on, like, what does it mean to write cyberpunk? I, and I stated, well, you need to be in a late stage capitalist environment. And I fell down this rabbit hole, because if you're on the
(12:44):
internet, you see late stage capitalism everywhere. And after a certain point, I wanted to ask myself, what the fuck is it? And I came across the definition, uh, well, several definitions. One of the hard things is, like, it's an economic theory, and one of the things you learn about economic theories is there's a
(13:06):
lot of argument on the nuance of it. It is an older... it's as old as Marx. Karl Marx did talk about late stage capitalism.
Stacy (13:16):
Well, yeah, that was part of the whole, like, if you, I had the misfortune to have to read the fucking Communist Manifesto, that's all laid out, because supposedly then we become post-history and time is no longer a thing and everything is just right now, and it gets very, like, the whole Communist Manifesto is weird, but it gets really fucking weird at the end.
EJ (13:37):
Right.
Late stage capitalism is a state of capitalism where the capitalist environment can get so there are continually growing, never limited expectations of profit growth. And it gets to a point where things are unstable, unrealistic.
(13:57):
Right.
Yeah, it's the
Stacy (13:59):
same.
It's the same.
It's the ideology of cancer.
EJ (14:02):
Yeah,
Stacy (14:02):
late stage, late stage capitalism is right before cancer takes over the whole body and kills the host.
EJ (14:07):
It is, it is very cancer-cell-esque. And I thought it was very interesting that people pointed out that one of the symptoms of that, how do you, how do you even keep consumers anyway? I, and kill, I, what's the word I'm looking for? Innovation,
(14:30):
progress, progressivism. Um, well, so innovation, progress, those, that takes money. So how are you trying to cut money to increase your profit margin? Um, and still have customers. You do that in late stage capitalism with marketing and PR, essentially propaganda.
Amy (14:52):
And
EJ (14:53):
Naturally, innovation also takes its toll, especially because you want to cut down as much cost as possible in your labor, because that just becomes a raw material that you need to streamline and cut down as much as anything else, any other
(15:14):
materials. So at that point, what these companies really are depending on is their marketing teams a lot more than their R&D, more than anything else.
All
Stacy (15:26):
funding is now going to
marketing rather than
innovation.
EJ (15:31):
And so that actually is kind of a big part of cyberpunk.
Yeah,
Stacy (15:37):
which again, as I said before off air, I would not have tracked this prior, but I believe that Tiffany Roberts's Infinite City, or The Infinite City series, qualifies as cyberpunk.
EJ (15:51):
Yeah, so you could have, like, kick ass AI, you could have, like, amazing medical innovations, but why are you having this story in the first place? It's because there's a deeper rot happening, and a lot of... Yeah, and that, and that is, and that
Stacy (16:07):
is the rot.
EJ (16:08):
Late stage capitalism is the vibe, whether you mean to or not. Mm hmm. And I always thought that was, when I came across that notion, I was like, holy crap. And I need to find it. I need to find more commentary on it, because I'm like, oh man, I could write an essay on this now.
(16:29):
Right. So, yeah, so you could have, like, really cool cyberpunk-esque tech in your story. But if it's not in a late stage capitalist environment, it ain't cyberpunk. Gotcha.
It's something else,
Stacy (16:45):
sci fi. Sci fi, right, or Star Trek, you know? Kind of, you know, classifier. Well, because dieselpunk is a thing. Oh, yeah. That's what Star Wars is, is considered, at least the first trilogy I know, is dieselpunk. And I want to say dieselpunk, isn't the definition of dieselpunk "the future that the past imagined"?
(17:06):
Oh. That's the technical definition of dieselpunk, but don't quote me on that. Um, but yeah, like, I'm sure that there are, like, like cyberpunk is a tenet of sci-fi. I'm sure you could find a thousand other very specific descriptors to describe, like, whatever, you know, your particular flavor is.
EJ (17:27):
Right. Getting back to what the hell is AI? Like, yeah. If you, if you love different cyberpunk stories, yeah, you've probably come across AI in some
Stacy (17:40):
way.
Right.
Well, and so my first, like, AI was like a vague term that I can remember, that I probably gleaned from different books and stuff like that that I've read over the years. But my first experience with AI, where I registered it as a concrete idea, was, of all things, did you guys ever watch Reboot, the cartoon? Yes.
(18:01):
It was the first CGI cartoon. AndrAIa, in AndrAIa's name, the A and the I are capitalized, she was an AI game sprite, which meant she was capable of learning, and she made it out of the game with, when she and Enzo were lost in season three, the end of season three.
(18:21):
And so for the longest time, whenever somebody said AI, I assumed it was like AndrAIa, where it was a computer program, essentially, that had gained sentience and had become a person, basically.
That was always my interpretation of it until, and I know too that prior to the AI nightmare that we're currently living in, there were, or I'm sure still are, very, very rich people that are attempting to teach computers how to be alive. Because there's, the one I can think of specifically is, there's a very rich person, they're transgender, went from male to female. The whole story is fucking fascinating. Because. This person is, I can't remember their names, unfortunately.
(19:05):
It's a husband and wife duo. The husband, I think, who became, who, who originally was a husband is now the wife's wife, because the couple stayed together, even though the person who was the husband transitioned from male to female. And they had had kids, like, it sounds like their relationship was very established before they realized that they were
(19:27):
transgender and had a couple of kids. And like, like on top of it, they're a mixed race couple. And I want to say they've been together since, like, the 60s or the 70s or something. And the spouse who is transgender is some kind of, like, epic computer nerd. And, like, Stephen Hawking level of, you know, into, like, the
(19:49):
language of all of this. And it has been for the last several years attempting to create an AI of their wife, who, not the transgender portion of the relationship, is a black woman. And so they've created a sort of automaton of the wife's head, and
(20:12):
over the years have been trying to teach this computer just by talking to it, essentially. So it's, it's some kind of software that can learn, at least in a rudimentary fashion. It's not sentient, but you can hold conversations with it where you would swear it was, basically. Because so much time and work have gone into this. I'm going to look that up and see if I can find that couple.
I'm going to look that up andsee if I can find that couple.
(20:33):
Because I read a whole thing onit.
It was just this fascinatingarticle.
Like, their life started,Unusual simply because they were
a mixed race couple in a timewhen it might not even have been
legal for them to be together insome states.
And, and then it's just, I mean,one of their children got very
thick and almost died.
And there was all this stuffthat they had to do because one
(20:54):
of the, the transgender partneris such a computer maven that
that's how they've made all oftheir money and why they can
afford to do this, this AI Youknow, attempting to create like
a cyborg, I guess.
although I guess no, it wouldn'tbe a cyborg, an android, I guess
but it was because of thepartner, the transgender
partner's extreme knowledge ofcomputers and stuff that that's
(21:18):
how they made their money.
So they were able to save theirkids life because they had some
extremely rare Something,something, something syndrome
that, like, two people in therecorded history have had, or
something to that effect.
And so it's just like, it's likeeach chapter of their life is
just like, like, you know, you,you hear, like, when you read a
(21:38):
book, right?
Like a romance novel.
And it encompasses a couple, andmaybe it encompasses them over,
like, And when you read thestory, the way the story is set
up, it's always like there's thebig hill that they have to get
over.
And once they're over the hill,it tends to be pretty smooth
sailing for the rest of theseries, or the rest of the book,
at least for the most part.
This couple, I don't thinkthey've hit the peak of their
(21:58):
hill yet.
Like, it just keeps getting,like, each chapter is just like,
wait, what?
So, really, really fascinatingstuff.
EJ (22:07):
And, uh, this really sounds like the premise of some sort of, like, Pulitzer prize winning, like, sci fi novel,
Stacy (22:14):
right?
Right.
Especially the way I'm describing it, because God knows I'm not making hash out of this, but it's, it's really fascinating. I'll, I'll see if I can find it. So you guys, we can at least know their names, and maybe that's something that you could put into the show notes, if you feel so moved.
EJ (22:29):
Oh
Stacy (22:30):
yeah.
But that was always my concept of AI until, you know, the current, extremely problematic definition of it. The definition is not problematic. The behaviors of people using it are problematic.
Amy (22:43):
So my examples of AI
actually were definitely
divergent.
The first example was Skynet.
Oh yeah.
Yeah.
I didn't think of that.
Yeah.
That's a good point.
Because that's the terror, the terror part of it. But then you also had parts that are similar to, like, the Jetsons stuff that you saw in old, not, not old cartoons, but, you know,
(23:06):
cartoons where basically AI would be, you know, assisting with the drudgery type of things.
Stacy (23:11):
Star Trek, basically.
Amy (23:12):
Yeah.
Yeah.
Yeah, the sad thing is, is that we've got the complete opposite right now, which... But, I mean, I wish I had more on that, but sci fi has always been my thing, and AI is definitely part of that element a lot. Yeah. And what I can see, I can see the usefulness of AI, because technically, at the end of the day, it is supposed to be a
(23:33):
tool.
Stacy (23:34):
Yeah.
Amy (23:34):
Problem is, is that this
tool is being used for the
completely wrong things.
Exactly.
Well,
Stacy (23:39):
it's like a knife.
You know, I can use a knife to slice my food and make a delicious meal, or I can use a knife to stab somebody to death. Right. It's not the fault of the knife, it's the fault of the person who's holding it.
No,
EJ (23:50):
not at all.
Yeah, and that actually brings me very smoothly into another part of the discussion I wanted to bring in. So, like, with, with our concepts of AI established for our listeners: why is it, why is it, in our view, AI is so damn controversial? And again, we're going to, like, focus specifically
(24:12):
on the indie romance book world. And my first point is money. Yeah, it's always the money. And I actually have a very specific question for you both. What about AI do you see as cutting into how the indie book
(24:35):
world makes money? And also, why is that bad?
Stacy (24:41):
Well, for starters, people are talking about, we can get AI to write our books, and the way, and I mean, I can see the appeal of it from just, like, a purely hypothetical situation, right? Like, say I want to read a book where there's a seven foot tall teal vampire. And he falls in love with a woman who can turn into a cow, right? Like, odds are high I'm probably not going to find that on Amazon
(25:03):
no matter how hard I look, but theoretically, in AI, I could type in what I'm looking for. AI could generate the story specifically for me. The problem with that, of course, is that this didn't come out of nowhere. Like, it's not, it's not made up out of whole cloth, basically. Anything that comes out of AI as we currently know it is stolen from other people. And as somebody on the internet put it, if you can't be bothered
(25:25):
to write it, why the fuck would I be bothered to read it?
Amy (25:28):
Mm hmm.
Maybe I
Stacy (25:29):
kind
EJ (25:29):
of did it for you, though.
I concur on, on that exact, like, phrase. Like, if you didn't bother to write it, why should I bother to read it? Right. I also throw in, too, getting back to late stage capitalism, uh, it is looking at labor as something to cut for the sake of profit.
(25:52):
Well, right, and
Stacy (25:54):
the thing that it's being used to do, it's not doing anything that makes anybody's life better. Right. In fact, it's stealing the quality of life from a select group of people. Uh, of humanity, basically. Nobody has to go into the fanart mines against their will and take a pickaxe until we can find that streak of hentai, so that
(26:17):
we can sell it to all the perverts out there. Like, no. People create art because they want to create art, not because somebody has a fucking gun to their head. So AI is, AI "art," and I'm using that term extraordinarily loosely and sardonically, isn't, it's not filling a need anywhere. It's taking away from people, and in this case, it's taking
(26:37):
away from people who make a living doing things like that. And that's bullshit. And then not only is it taking away from them, but it's stealing, uh, their own output to teach the algorithm how to do it in the first fucking place. So it's like a one-two slap,
EJ (26:54):
Right? You know, we'll go further into the whole, like, data stealing part. I, I, I will also throw in, because I know there's going to be some folks who are like, well, all these other big companies are using AI to cut their labor costs. And I agree that there is right now a huge surge, in especially
(27:17):
the tech world, of where can we use AI instead of a person to do this. And it's funny, because earlier during brunch, I was talking about this exact same thing with my husband. We're quite the little duo, because I am a data person and he is a software person; together, we make a whole data science startup.
(27:37):
Uh, that's kind of our joke. Which is only funny if you're in the tech world, where you can be like, oh, that's so cute. Everyone else is like, whatever.
Those are words.
Stacy (27:50):
Nerd on nerd
impregnation.
Right.
That has to be a book genre out there, and if it's not, I'm trademarking it right here, right now.
EJ (28:01):
So like we were talking about this exact same thing, and it actually reminded us a whole lot of a different cost cutting wave that happened not that long ago, I would say roughly a decade ago, when there was a huge surge of offshoring tech. Yeah, yeah. And that still happens, but not in the same way that it used to.
(28:25):
In fact, actually, my software developer husband likes to point out, he got his start as a junior software developer fixing the fucking problems left over from the offshore wave.
Amy (28:41):
And
EJ (28:42):
part of it comes down to, it's not merely when you are cutting labor costs. If you're trying to, if you see yourself as a writer who's like, I can't afford an actual artist right now, I need to, you know, work on a budget: one, I actually do empathize with you, but I also cannot
(29:05):
emphasize enough, you're not the first one who has found yourself with these sorts of stresses. Um, and there are more issues ahead of you than you may realize. For a lot of these big tech companies, issues that they will inevitably run into are things like, oh gosh, I need to
(29:28):
find that article. But there was a, uh, airline that has been using AI, uh, for more of their customer service. Well, that AI has been straight up lying in some cases on, uh, company policy or with refunds that the leadership is not cool
(29:49):
with whatsoever.
There will be repercussions downthe line, especially I
Stacy (29:53):
agree with the lady.
What a fucking bummer, right?
EJ (29:56):
And it's not like you can fire the AI. And even then, like, you can, yeah, you can make the argument of, I can just shut down the AI. But also, this is a hell of a way to fix all of the issues that the AI gave you, right? There's, there's still a comeuppance that needs to happen. And for some companies, they've already put a lot of time and
(30:17):
money into AI. For, you know, for the creatives out there very specifically, I can appreciate where, again, you're over here, like, I don't see any of this happening. Well, I could, I, you know, take it out of the abstract. You do have, quite frankly, some shit AI out there.
Oh God, yeah.
There, there is.
(30:37):
I swear some people are going to come up with an AI art aesthetic because even used that, no,
Stacy (30:44):
it's already there, and it's, yeah. It's horrible, because there are three authors who are, tend to be releasing books together, like they're writing in the same series kind of thing, and I'm not going to name names, but if you want to find out, DM me, I'll fucking tell you. But, all three of them, and I really liked one of the writers,
(31:07):
and I've quit reading her because they're using AI covers, but every fucking main female character who's on the cover is, it's interchangeable with any other. It's always a waifey blonde, but usually so blonde that it's, like, white hair, and who looks like a stiff breeze will legitimately snap their, their spine, and they're always making the weird,
(31:28):
almost duck face. Have you noticed that? Or like a quasi pout, and it's the same, it's the same fucking character over and over and over again. Like, I could literally pluck the main character off of one author's book and put it on another author's, another author's book in this same little, little trio, and I don't
(31:48):
think any of them would notice the difference. They're the same fucking character over and over and over again, and they're ostensibly different stories with different characters. And the thing that really sucks, too, is at least one of those authors I know has been pirated. And was really angry about it, in a super against-AI-book-writing way,
(32:09):
but apparently it's okay to fuck over an author, or an artist. Hypocritical. So nice. Nice. Yeah. Nice hypocrisy there, asshole.
EJ (32:17):
And it's a really nice segue into my other point. Besides it's a money thing, it's an IP, copyright thing. So I, I want to make it very clear for those in the indie world very tempted by using generative AI. I want you to stop and think for a really hard second why the big
(32:40):
publishing houses are strangely not doing anything with generative AI right now. I want you to, eh, with,
Amy (32:52):
I'll point out, at least, I'll point out at least one, no, two technically: the UK editions of a certain series were AI generated. It is a popular series. Popular because the author is extremely popular. And I don't want an army of people coming after me. That's Harry Potter, isn't it?
(33:13):
No, it's not.
That's Harry Potter.
It's not.
I have nothing
Stacy (33:17):
to back this up.
I'm just saying Harry Potter because I think it's funny.
Amy (33:20):
Anyways, no.
It's a very popular romantasy author.
Stacy (33:25):
Oh, is it, uh, Yes.
Who we were discussing the other day?
That's
Amy (33:28):
Sh Sh Sh
Stacy (33:29):
I don't want her army
coming after us.
No, I get it.
I don't, I mean, they can, they can click and start it. I'll finish it. But no, I, we don't need to bring IR down on our head.
Amy (33:37):
Yeah, that's a prime
example.
The other example, I don't recall which, the publisher, but that was, that was Bloomsbury. There's your other hit, people. Bloomsbury using that on the UK paperback editions. Anyways, the other one, I'm gonna message you. If, Paolo Baccalut, I'm gonna call him out, cuz he knows, he should know better. Paolo Beccabellucci, whatever the hell.
(33:59):
Dude who wrote the Paragon series. One of his more recent titles had an AI generated cover, and I'm just like, why? You clearly have the funds for this author. Why are you doing this?
EJ (34:12):
I wonder what publishing house he used. So here's, here's where I'm coming from. It's specifically, it's a U.S. thing. So I won't speak for the UK on this. But the U.S. IP law setup does not recognize AI generated content as ownable by the person who generated it.
(34:35):
And that is a huge reason why you're not seeing, in the U.S., a whole lot of companies dipping their toe into the water.
I see,
Stacy (34:46):
because theoretically
they don't own it.
They rip it off and they
EJ (34:49):
don't have... They do not own it. Why would Penguin put out things that they cannot own?
Yes.
Right.
Amy (34:55):
And I,
EJ (34:55):
and I should very
specifically, like, you know,
acknowledge, like, yeah, youknow, they've got classics and
stuff, but at the same time, ifthey're trying to create, quote
unquote, new stuff, new art, newtext, um, but then like any of
their rivals could go in andtake it.
It reduces the overall value ofit, which is totally purpose.
(35:19):
There are many things about theU.
S.
justice system that I find to bevery behind the times on when it
comes, yes, when it comes totechnology.
This is a weird, archaic.
Issue that is weirdly helpful,um, only on accident.
It wasn't deliberate.
(35:40):
We promise, totally an accident, right?
So, but I think it's really important, because I don't, I worry about not seeing this enough in the indie world, especially publicly, because I think we are really stuck on the moral issues
Stacy (35:58):
of
EJ (35:59):
AI.
And that also makes sense.
By the way, it's another symptom of being in a late stage capitalist environment, when you're, right, when your media consumption, because again, this comes down to, the real innovation is in the content, the marketing that's put out there.
One of the few ways as a consumer you do have power in a
(36:19):
late stage capitalist environment is your media consumption.
So your media consumption becomes the tribe that you associate with, the totem that you build up for yourself.
So if you are very clear of, like, my tribe is not AI.
Naturally, people can get very, uh, defensive about it, in part
(36:41):
because that's one of the few places where they have power as consumers to do anything.
Anyway, fun, fun fact, even when you're like, I'm not in late stage capitalism, you are participating in it, my friend, we are all participating in
Stacy (36:55):
it.
EJ (36:56):
Go on.
Anyway, like, so, but this, we, we focus a lot on the, this is, this is not what I associate with.
This is not part of my morality.
This is not part of my ethics.
When there is a very serious legal implication out there, when you are an indie author and you're using an AI generated cover,
(37:17):
you could put yourself in a lot of danger.
Again, like Stacey had mentioned, you could get pirated, but legally, at least in the U.S.,
no one cares.
You didn't own it to begin with.
Stacy (37:32):
Which, I mean, you stole
it from somebody else, so I'm
not terribly concerned about your feelings.
EJ (37:37):
And to be clear, part of it
comes down to, it's more of,
they don't recognize all that training data being regenerated into something quote unquote new.
The US IP setup just has no slot for that whatsoever, and if they
(38:00):
don't have a slot, I guarantee a US lawyer is going to be like, I don't know, maybe if you take it to court, man, and then you create a precedent.
Stacy (38:11):
It's that whole thing
that, you know, I was discussing
with a co worker yesterday, where this is, this is, so what we were discussing has nothing to do with AI, but I think the concept here of what I'm going to get around to will make sense in a minute, where he was asking, because we had a kid who
(38:31):
has a cut on his hand, and he was asking, because we can put a band aid on it, but are we allowed to put Neosporin on it?
And I said, I don't think so, because technically that would be operating outside of our scope of practice, because none of us are medical doctors.
And, and I, and he said, well, what if we got like verbal
(38:52):
confirmation from the parents to do so?
I said, the problem with a verbal confirmation is that you have no way to prove that you have that verbal confirmation.
Now, if you had it written down, you might have a leg to stand on, but if it's not written down, like basically in anything that comes with CYA, which for those not in the know means cover your ass, is if it's not written down, it didn't happen,
(39:15):
even if it happened.
And so, I feel like that's kind of the same thing that's applying here, where we don't have a law for it, which means it didn't exist in the, like, the law didn't exist in the first place to be broken, because we don't have a law for it and we don't have a slot to create a law for it, at least not yet.
EJ (39:34):
Right.
No, I think it's actually, that is a great way to describe how the U.S. legal system generally works, and the reason I'm really focusing on the U.S. is one, all three of us are Americans, and we reside in America, it's what we know.
I would also throw in the argument, it's still relevant to the global, uh, romance indie world, because a good chunk of
(39:56):
readers are stationed in America, right?
Yeah, here, here we are.
And I'm not saying that's awesome.
I'm just observing facts.
Stacy (40:10):
Well, right.
And I mean, I would love to be able to say like, you know, this compared to UK law or this compared to the law in Nairobi.
But I mean, I don't know shit about international law.
I don't know shit about domestic law.
EJ (40:23):
I mean, and if our listeners
have some, uh, some commentary
to throw in
on that, like, my DMs on Instagram are open.
My email is open.
Feel free to contact us.
And if you're on the
Stacy (40:37):
Discord channels with us,
feel free to contact us.
EJ (40:40):
Right.
Yep.
I shout out to pretty much all of Finley Fenn's Discord server.
Y'all know who we
Stacy (40:48):
are.
Hell yeah.
And we love y'all.
We spun from there, so.
Finley Fenn, for when you want your orc dick big and juicy.
Amy (40:59):
Not to mention a lovely
rollercoaster of emotions.
Oh god, if you want to cry
Stacy (41:05):
every
Amy (41:05):
time.
Stacy (41:06):
You
Amy (41:06):
will
Stacy (41:06):
cry,
Amy (41:07):
but then you will be happy.
Big dicks, copious semen, you're gonna cry.
Now, I will say, with regards to the AI cover, AR, we're just going to say AI cover.
I cannot say AI art without wanting to.
Well, no.
It's not art.
(41:27):
No, it's not.
Anyways.
With regards to that, it's like people who trace shit and act
like they came up
Stacy (41:33):
with it.
Amy (41:34):
Yeah, but my issue is, and
this is me personally, I don't
know if anyone else feels that way, I'm sure you guys might understand how I feel.
Basically, if you're making shortcuts for your cover, what would make you not take a shortcut with your own writing?
EJ (41:53):
That's a good point.
That is a top.
That is a good point.
And there are
Stacy (41:56):
a couple of people, like
a couple of authors out there.
It's not anybody I am particularly familiar with, but I know there's been at least one author, fuck, I can't remember their name now, who somebody ran it through one of the AI detectors and like, big chunks
(42:16):
of what they had written came back as AI.
But now I can't remember it.
It was somebody in the indie book world, but I can't remember who the hell it was now.
Amy (42:23):
There's apparently been a
slew of AI published works on
the Overlord site, Amazon.
Of course, because Amazon doesn't give a flying fuck.
And of course, well, I mean, they're getting backlash for it now, but that's because you've got consumers that are extremely upset by the, it's like, you're paying, you're asking $2.99 for this 80 page trash pile.
Stacy (42:45):
Drivel, right?
Amy (42:48):
And it's like, no, I'm
sorry, it's not worth it.
I mean, what, you're like, no, God, no, are you kidding?
If you can't be bothered to actually write it, why should I be bothered to pay for it and read it?
Stacy (42:59):
Exactly.
It's not a real story, why should I pay real money?
EJ (43:04):
I will throw in, I, I'm on
Reddit a crap ton. I'm a lurker
more than anything.
Don't bother trying to find my comments or anything.
I ain't involved like that.
But a subreddit that I think is actually great for readers and writers alike is the subreddit for erotica writers.
(43:26):
Oh, really?
Yes.
Those people, they get really into the nitty and gritty about the business of writing erotica.
Well, they have to
Stacy (43:35):
because erotica, if you
have anything identified as
erotica on Amazon, you're sent to what's called the erotica dungeon.
Right.
And if your shit doesn't get advertised, like, people can only find your shit if they're specifically looking for your name.
Like, you can't even link jump to get to it.
And it's fucking puritanical,
EJ (43:54):
which is not a real English
word, but y'all know what I
mean.
Stacy (43:58):
Yes, it is.
Puritanical is absolutely a word.
Oh, it is?
Yeah, it is.
Yeah.
Okay.
Oh yeah, puritanical is a word.
EJ (44:04):
Oh, it's a good word.
Puritans
Stacy (44:06):
fucking ruin everything.
Yes, they did.
EJ (44:08):
But like, uh, in, erotica is
like, a huge moneymaker.
And Amazon would know this.
But of course, they have to.
But Amazon treats it like it's, ew, it's dirty.
Stacy (44:21):
It's dirty.
You're
Amy (44:22):
a family oriented website.
Yeah,
Stacy (44:24):
sure.
Which is not why you, you, you fuck over
the employees that you claim are members of your family, right?
Right.
Um, because you're so family oriented,
EJ (44:35):
right?
Before, before I go into a rant on Amazon warehouses, I, I will
point out, back with this subreddit.
It's very interesting because they do get very straightforward and down to brass tacks about what it really takes to put in
(44:55):
the time, the effort, the research
into writing and making a job out of it.
Well, and
Stacy (45:04):
the thing too is, you
know, cause, cause we've all had
that one dipshit friend who's like, I could write a romance novel.
I bet it's really easy.
And then like, okay, so where's your romance novel?
It's been five months.
I thought you said that this was really easy.
And, and I think people feel even like, like doubly so in regards to erotica, but the simple truth of the matter is,
(45:25):
is it's like, yeah, anybody could write something that is arguably dirty.
But that doesn't make it titillating.
It has to be titillating
EJ (45:34):
to
Stacy (45:34):
qualify as erotica,
otherwise it's just Chuck
Tingle.
I have nothing but mad respect for Chuck Tingle.
Like, that man spun literally nothing into a career.
Totally.
But I've read several of his quote unquote books, and they're, they are neither books nor are they erotica, and yet they're listed as erotica.
And that versus something that you can read that genuinely
(45:57):
turns you on, that genuinely titillates you, are two wildly different beasts.
Oh, for
EJ (46:01):
sure.
For sure.
So I appreciate some, like, there is that acknowledgement, I feel like, in that subreddit community.
I appreciate that they're very real about the story crafting part of writing a story.
Because there is, like, there is feels.
(46:22):
It is art, but there is also a lot of just crafting discipline when it comes to any sort of good writing.
Period.
Amy (46:32):
Yes.
EJ (46:32):
and so it, it makes it a
very interesting world to talk
about AI.
People have been very upfront of like, hey, I've been using AI.
I want to talk to you guys about my experiences with it.
And they generally find themselves with not an enthusiastic audience, but more of a neutral audience.
(46:53):
Like, okay, give it to us.
How'd that go, using generative AI specifically, right?
And usually the, the successful stories are, I can do it for small shorts that I throw on, like, Wattpad, and they will get some.
(47:13):
So it's not like I'm not getting nothing, but at the same time, they point out it's also super formulaic and there lacks a lot of novelty.
So, they will admit, I could probably make more money if I wrote more of this myself.
Stacy (47:34):
Right, if you actually
took the time to put, like,
initiative and imagination to it.
Like Amy said, like, you're still creating something, even if what you're creating is something to titillate, which there's nothing wrong with creating something to titillate,
by the way, I'm
Amy (47:51):
in no
Stacy (47:52):
way, shape or form, kink
or slut shaming anybody, like,
you do you, boo.
Mm
EJ (47:59):
hmm.
So, but at the same time, because it is an algorithm using training, the same training data for you as anyone else, it will, it's less likely to create something that is going to be marketably novel compared to everyone else.
Stacy (48:25):
Right.
Well, yeah, because if you're, if you're scraping from something, what you get from it is going to be the common denominator right now.
Like it's not going to be the outlier
EJ (48:37):
and I certainly, I mentioned
the IP issues of this generative
AI stuff, and I've, I have harped on about how I'm a data person.
I, in this training data.
We should
Stacy (48:51):
probably specify too that
by IP we mean intellectual
property, not internet protocol.
Right.
EJ (48:56):
Yes.
So intellectual property, the stuff that allows you to go out to the internet and say, hey, you're stealing from me.
Right, exactly.
Get off that pirating site.
That sort of thing.
Right.
Bringing it into the indie community.
Amy (49:12):
Actually, can I bring up
something that happened
recently?
Oh, yeah, sure.
Oh, yeah.
So, I don't know where this came from, but a prominent writer on Tumblr, who posts not just on Tumblr, but on other, um, sites such as Wattpad, AO3, apparently someone took it upon themselves
(49:34):
to say that the collection of stories they had logged, either on Wattpad or AO3, was a book, and it was put onto Goodreads.
Stacy (49:46):
Oh my god, so they just
ganked somebody's book and then
put it up as their own work?
Well I don't know
Amy (49:51):
if they did that because
they created an author page that
had said person's actual name, and then also it said this person was based in the United Kingdom.
I'm like, oh hell no.
So yeah, so that sounds like piracy.
Yeah, and something's happening there, right.
(50:11):
There was no actual book for sale.
It was just listed on Goodreads.
And I'm just like, that's, what's going on here?
So I, I had to make an entire post for it in the librarians group on Goodreads.
And I'm just like, and they're, they're like, oh, but do they have anything listed on their profile not to post it on Goodreads?
Why specifically Goodreads?
(50:33):
Because it's, it's online stories.
Goodreads is about books.
Right.
You're not supposed to put fanfiction or whatnot on here.
Stacy (50:43):
So, so, so, okay, hang
on.
I might be looking at this the wrong way.
Is this something that they're trying to claim their stories are a book in order to maybe drive up their readership through Goodreads?
Amy (50:57):
I do not know because the
person who actually wrote the
stories was completely confused.
Okay,
Stacy (51:05):
it's just straight up
piracy.
It must be somebody pirated their stories and are trying to claim it's a book to maybe generate ad revenue or something on their own site.
Amy (51:15):
I, but that's the thing
though, Stacey.
I don't know if they were actually pirating the stories.
They just created a listing for it on Goodreads.
Right, but they're saying, but there's no actual way to get to, there's no way to purchase anything from that, from the page.
You can't like, it
Stacy (51:32):
doesn't like, like link
to a website or anything like
that?
No.
Oh, that's fucking weird.
Yes.
EJ (51:37):
Yeah,
Stacy (51:37):
that is super fucking
weird.
EJ (51:39):
I, I can also throw in, I,
at least under the US IP and
patent system, fan fiction is also not recognized either.
Correct.
Well, it can't be, like,
Stacy (51:52):
you're using somebody
else's intellectual property.
Right.
Amy (51:55):
Exactly.
Even if it's, even if it's technically an original character that was not depicted, well,
Stacy (52:03):
Mm hmm.
But if you set it in that world.
Yes, but they're
Amy (52:06):
interacting.
They're interacting with characters from that world.
That is still considered fanfiction, whether you use an original character in that world or not.
Stacy (52:16):
So, well yeah, because
that's why, that's like the
whole thing with like Fifty Shades of Grey and whatever the hell Cassandra Clare wrote.
Like, I know those started off as fanfictions and then were turned into their own.
Say that again, I'm sorry.
It was Dark Hunters for Cassandra Clare, wasn't it?
Maybe, I think it was originally like a Harry Potter fan, I don't
(52:37):
know that that's, maybe I may have that completely wrong.
I have no clue.
EJ (52:41):
But, yeah, similar thing,
just, just throwing it out
there, mostly for our listeners.
But, yeah, so with the data training, there, there's a couple of issues I've got with that, again, getting back to the U.S. legal system.
Oh my goodness.
Our friend, the U.S.
(53:01):
legal system.
A big issue to consider right now for most large AI companies right now, notably, I think OpenAI, they're the ones who have created ChatGPT.
So whenever you think of ChatGPT, think OpenAI.
When you think of OpenAI, think of, right now there's a crap ton
(53:23):
of US litigation that's happening right now.
Right.
Amy (53:26):
Yep.
EJ (53:26):
I will absolutely bring that
up in the show notes, the nature
of those, because these AI companies know that there are IP issues with, with the data that they are feeding their algorithms.
They know it, so at the moment, there is nothing in U.S.
(53:47):
books, in the U.S. law books, there's no regulation for these companies to create a system of transparency or governance or quality checking or anything related to that, that can assure folks, hey, we have only been scraping things that, you know, are IP free.
(54:12):
The closest we have that I have personally come across, at least as far as U.S. companies go, which is pretty much all the big players, like, Google's got nothing.
Microsoft's got none of that. Face-, Meta slash Facebook, they have not touched that. Definitely OpenAI, who has gotten money from all of these places, including and especially
(54:35):
Microsoft.
There is concern already about Microsoft's involvement with OpenAI because they have been collaborating so damn closely.
All these guys are under some form of litigation right now in the U.S., because there are people who are very certain that they have
(54:56):
scraped their data, text, images, which is a yes.
They currently have litigation right now that is coming on them saying, hey, you've stolen from us.
This is how we know.
I hope they get hammered.
The only exception I have found, and they have come into issues
(55:17):
with this, is Adobe.
Adobe has tried to help their own customers get around being worried about using AI.
Because again, we've talked about the whole IP thing.
One of the, uh, big selling points for using Adobe AI in the past has been, hey, if you use our AI work for generative
(55:45):
art, that's okay, because our training data only uses our images that we have a legal right to, to do that.
So you are fine when it comes to US IP law.
Now, you know, there's something else that I'm going to add to that.
Of course.
(56:06):
I, and for someone who's worked in data quality and governance, I was like, oh, crap.
Oh, um, so they never, they did not lie.
The good news is they have not lied, that we know of, on what their training data source was, but it is, no, it has been found
(56:26):
out that they did not vet that training data.
Of course.
Enough.
Oh, of course.
Amy (56:34):
Human
EJ (56:35):
element.
Recently, because mind you, all of these image files that Adobe has been using to train their AI algorithm, that's been user generated.
And this has been kind of a point of contention for Adobe users, because they know that, like, some of the images that they put into certain Adobe libraries, they have to agree,
(57:00):
yeah, this is going to be used for training purposes, you got to be cool with that.
So they did not.
Keep in mind that some of this user generated data that they've been training on, it might be AI generated itself, which totally screws up their whole marketing premise.
(57:21):
And specifically, they have found Midjourney generated products.
And I do find Midjourney to be especially kind of hilarious, because they have been, like, in a spat, straight up spat, with a UK based company, because, I want to say it is Stable, has,
(57:44):
Midjourney has accused Stable of stealing, uh, data from Midjourney to train.
Oh, no.
Somebody stole from the algorithm.
That's a riot.
I have the links for that.
I will put that in the show notes so people can see it for themselves.
(58:05):
It is hilarious, especially because, again, any sort of argument on these AI companies' behalf of like, oh, we're just, we're just doing innovation.
This is how the Internet works now.
No, it's not.
And, you know, right.
Otherwise you wouldn't get into a fucking hissy fit.
(58:26):
It's a way to, fuck, people are calling out at me.
Right.
So, you know, it's again, it's, it's kind of fun following the tech news for someone like me, because a lot of these tech bros, they, they're, they're so damn fragile, and such obvious hypocrites sometimes.
Amy (58:46):
I
EJ (58:50):
could go on.
But yeah, so there's, there's multiple things happening there.
So there's the litigation aspect.
And I also threw in, like, there is that governing aspect as well.
Do you actually trust, can you trust MidJourney to look at their, their training data?
Like, first, you're, you're screwed on IP for your American
(59:12):
market, to, like, you could be, for all you know, you could be using training data that is UCP.
Explain what UCP is, please.
No, CP.
Child pornography.
Stacy (59:31):
Oh, God, sorry, right,
right, right.
I was thinking of a computerterm.
No, you're right.
You're absolutely right.
I didn't even think of that.
I did not even.
Oh, my God.
So, yeah.
What a horrible thing torealize.
EJ (59:43):
Yeah.
Seriously.
Yeah, um, because there is literally nothing on U.S. books or anyone's books right now to regulate what training data you use, where you source it from, who is monitoring that, what are, what is even the policy and protocol?
I mean, for a lot of U.S. laws, they don't even ask for inspection.
(01:00:04):
That's an E.U. thing.
If anyone's going to check shit, it's going to be someone like China or the E.
Amy (01:00:11):
U.
EJ (01:00:11):
who actually put in the
time, resources and give a shit
about inspecting this stuff.
U.S. laws.
We got no
Amy (01:00:18):
time for that.
We're trying to make some.
Right.
EJ (01:00:21):
Exactly.
U.S. law has a hilariously low bar, usually, when it comes to that sort of thing.
At most, they're like, you should have a process for this.
Otherwise, the best thing you can do for keeping people in check for, you know, keeping regulated processes going in the
(01:00:42):
U.S.
is usually some sort of reporting system.
And occasionally understaffed inspectors.
Stacy (01:00:52):
Yeah, really understaffed
inspectors, I'm sure.
Amy (01:00:55):
Okay, so my thought is,
before they even rolled out all
this shit, all this crappy AI, whatever, software, blah de blah de blah, whatever, rather than, you know, just scraping whatever the fuck they could get their grimy little hands on, why not approach, say, something like DeviantArt?
(01:01:19):
They have chosen violence, by the way.
Yes.
Who?
I'm sorry?
DeviantArt.
EJ (01:01:25):
Yeah, I feel really bad for
DeviantArt, uh, artists right
now.
Amy (01:01:30):
If they could have
approached, if they could have
approached basically those that are, oh, whatever, the overseers of a website like DeviantArt and asked, hey, could you maybe get us some art or artists that are willing to loan us something that we could use for training data, you know, and maybe
Stacy (01:01:52):
we'll, you know,
Amy (01:01:53):
we will compensate said
individuals.
Because one, there's a lack of transparency. Two, there's also the lack of compensation.
Three, there's the theft, just outright theft.
And honestly, that's just, that is really what gets the artists so up in arms about it.
Because one, their livelihoods are being threatened.
(01:02:16):
They're not receiving any compensation.
And it's, they're seeing their, their work being transformed into something else.
Utterly shit tastic.
Stacy (01:02:27):
Yeah, 100%.
Amy (01:02:30):
And it's baffling.
It's the same also with regards to the writing, whatever, I guess with ChatGPT or whatever, I don't know.
I mean, they tried to do this on AO3 of all places.
And of course, the writers of AO3 were not having that shit.
Stacy (01:02:51):
No, of course not.
That sounds like a really great way to make sure that your website's bold.
Pretty much.
Amy (01:02:57):
Pretty much.
And it's just, why?
Why must you go out and, and not do your due process?
It just, it, it baffles me, the laziness.
Well, and it's also like, why is it,
Stacy (01:03:11):
why, why are you okay
with fucking people over if the
opportunity presents itself?
Amy (01:03:15):
Mm hmm.
Stacy (01:03:16):
Like, what is wrong with
you that you think that, like,
the whole world can't be fuck you, got mine?
Amy (01:03:23):
No.
Stacy (01:03:24):
Like, none of us are
going to get anything if that's
the tack we take, basically.
I almost guarantee, once again, it comes down to money.
EJ (01:03:31):
Of course.
Stacy (01:03:32):
It always does come down
to money.
Right, but it's not just that it comes down to money.
It doesn't, but so many people, I really think fucking somebody over is like, it's a feature, not a bug.
You know what I mean?
Yeah, that's true.
It's not a, it's not the sort of callous, whoops, you know, who gives a shit?
Let's move on.
But almost more like, can we fuck these people over?
(01:03:54):
Because I would enjoy that.
EJ (01:03:56):
Right.
It gets further exacerbated when you're in, I would say, the mainstream corporate environment does treat people like just another raw material.
100%.
So, this is, this is not me trying to be dramatic and grim
(01:04:18):
or anything, I, this is just.
Well, right, I mean, you're
Stacy (01:04:20):
cogs, you're cogs in a
fucked up machine, and I say
that as somebody who, I was working in the mental health sector, but I got caught up in a big machine, and, excuse me, that, that had found a way to profit off of mental health.
And you are an insignificant cog in that machine too.
And they do not, your ass will be replaced the minute you are
(01:04:42):
not there to serve the machine.
EJ (01:04:43):
Yeah.
Amy (01:04:44):
I really
Stacy (01:04:45):
loved helping people, but
the machine was set up less so
that I could help people, and in many ways blocked me from helping people, because it interfered with their ability to make profit.
And I don't want to be a part of that, that fucking system again.
EJ (01:05:03):
It's all, it comes down to
those KPIs.
I say that as someone who has actually set KPIs.
Key performance indicators.
Stacy (01:05:13):
There we go.
Which is not something that should be applied towards mental health.
Or, you can, yeah, I get it.
Oh, yeah, you're right.
And yet it so very frequently is,
EJ (01:05:23):
the whole point of KPIs is
to create objective metrics to
show progress or the accomplishment of a goal.
Right, which
Stacy (01:05:33):
is not how anything
that's helping somebody should
be structured, and yet it alwaysis.
EJ (01:05:37):
And, and, when you're, when
you're trying to apply it to
human well being, you, you go, you get into a real shit show.
Oh yeah.
And, you know, you run into that, actually, whenever you're talking, when you do talk about ethics.
I rarely come across a business leader who really takes that
(01:05:57):
caution seriously.
Right.
No, I, if you're in
Stacy (01:06:01):
charge of a major
corporation, you're scum, right?
I just accepted it.
That's the truth of it.
There's no such thing as a good CEO.
EJ (01:06:09):
Well, yeah, and I think
like, one of the hard things is,
by the time you're a CEO, and you're part of a global corporation, I'm talking like, you're, you're part of an IPO, you are, your company has its own cute three letters on that stock market ticker tape.
Um, your ultimate master is not the market.
(01:06:31):
It's your stakeholders.
Those are the only people you care about.
It's not your employees.
It's not even your customers.
I can never take a CEO seriously who's like, I care about my customers.
My dude, if you sincerely feel that, you're not doing your job.
(01:06:51):
Which sucks for you if you really believe that, but everyone else in your company, I guarantee you, is very aware that your real customers are your stockholders.
Yeah.
Everything else can go fucking out the door.
Yeah.
I don't believe you can be a CEO and possess empathy.
I don't believe you can be a CEOand possess empathy.
(01:07:13):
No, not in this currentenvironment.
We would have to have a
Stacy (01:07:16):
successful CEO.
The first thing that you do iskill your empathy and that's not
a sign of a good person to be inwith.
Amy (01:07:21):
Cut empathy out.
CEO.
EJ (01:07:23):
Yep, you're right.
I mean, in order to survive in the current system that the global economy works with, once again, I will call up, late stage capitalism requires unfettered profit, profit growth.
And that is exactly how our capitalist system across the
(01:07:43):
globe.
Even if you are part of China or something.
Congratulations, you're part of the global economy.
You're part of unfettered profit growth.
And there's some great nonfiction books that, you know, talk about the environmental implications of that.
And I would also say, or the delightful
Stacy (01:08:01):
human rights violations.
EJ (01:08:03):
It really makes our little
area, our little niche of, of,
like, romance book world fucking quaint, man.
Mm hmm.
But it's our world, dammit.
Stacy (01:08:16):
Well, that's just it.
Like, I think a big part of the appeal of reading indie romance is we don't fucking want CEOs and publishing houses and shit like that involved, because they're gonna fucking ruin it.
EJ (01:08:30):
It's, uh, a real rebellious
way of going about creativity.
Especially in an economic world where we have so much mainstream media that is quite formulaic.
I, I think of like Disney and like, dear Lord, they're not
(01:08:51):
going through a good creative time right now.
A lot of everything is very recycled.
Well, it
Stacy (01:08:55):
depends on where you're
looking though.
Yeah, not necessarily.
I wouldn't necessarily agree with that.
It depends on where you're looking.
I think in terms of some of their storytelling, they're finally moving away from prince and princess and living happily ever after, like in Encanto and Moana, and you know, and we're finally seeing representation of people of color.
EJ (01:09:14):
Yeah, that's more.
So
Stacy (01:09:17):
in regards to that, I
would agree.
But in regards to like, like a lot of the Star Wars stuff that they're putting, putting out, it really does seem very like, wash, rinse, repeat.
EJ (01:09:27):
Yeah, Star Wars ain't doing
great.
Marvel ain't doing great.
Marvel is mixed.
I like some bits of Marvel.
The overall universe, I feel, has not been well.
I'm over
Stacy (01:09:37):
it.
It's, it's been too long.
I am
Amy (01:09:39):
completely marveled
EJ (01:09:39):
out.
Yeah, we, we have
Stacy (01:09:41):
sustained an unnatural
lifespan and it's to the point
now where I don't have the attention span for it anymore.
I
Amy (01:09:47):
don't, I think phase four
could have been their final
phase, and yet we're in phase five.
Yeah, but they won't
Stacy (01:09:52):
though because they're
going to drive it into the
ground, because the minute that there's, you know, uh, a penny to be made.
They have to squeeze that property until it pulps in their hand, regardless of the damage that they actually do to the creative process of that property.
EJ (01:10:07):
All right.
And they, they are, I, I pick on Disney because they are in this very weird space where they are a creative company.
Everything about what they do actually revolves around creativity and art.
But they are a publicly traded corporation who must make profit
(01:10:32):
at all cost.
Stacy (01:10:34):
I see them more as what
you were talking about as a
representation of late stage capitalism.
They're not a company that's focused on creative output.
I think creative output happens accidentally now.
Amy (01:10:45):
The
Stacy (01:10:45):
main focus of Disney is
profit.
And Disney's been problematically money grubbing for, I mean, since I was a child.
They're just more monopolistic about it now.
EJ (01:11:00):
And this is not to say that
Indie authors are, they are
above making a profit.
That would be like saying they're above paying their bills.
Amy (01:11:11):
It's very rare to see them
able to quit a day job and just
completely write and write.
Well, I think it's doable, but I think it's,
Stacy (01:11:20):
you have to put in the
work.
Cause like Tiffany Roberts, theydo that.
EJ (01:11:23):
Yes.
they
Stacy (01:11:24):
clearly do
EJ (01:11:25):
a crap ton of work,
Stacy (01:11:27):
and Ruby also has, a
publishing name, like a name
that she publishes, like she has what she refers to as her New York name.
She won't tell anybody who it is.
Amy (01:11:39):
But
Stacy (01:11:40):
she was a writer before she became an indie writer. The indie thing happened just because she wrote Ice Planet Barbarians and wanted to write a story for herself, basically, because she couldn't find what she wanted to read. And then it turned out everybody was like, what a novel idea. And then, you know, with as much as I hate to give Amazon any
(01:12:00):
credit ever, the fact is, is they did break big publishing houses' death grip on, you know, people being able to independently publish. Because prior to that, if you were an indie, if you, like, self published something, it was called, it was considered a vanity press. And no one in the traditional publishing world would have
(01:12:23):
touched you with a 10 foot pole. And that's as recently as, like, 20 years ago.
EJ (01:12:29):
So, but comparing even a big name in our world, like Ruby Dixon, to Disney, that's a disingenuous comparison.
Stacy (01:12:37):
Oh yeah. They ain't the same.
EJ (01:12:38):
Oh God no. Ruby Dixon, I, I'm, she can be as ambitious as she wants, but I would, I would bet damn good money that she does not have the resources nor the ambition to have the unfettered profit grabbing need.
Stacy (01:13:01):
Ruby Dixon is doing exactly what Ruby Dixon wants to be doing. She's not out to take over the publishing.
Amy (01:13:06):
She just wants to release books that she finds fun. She wants to write her stories.
Stacy (01:13:11):
And she wants to enjoy the books. That's all she wants. And God bless her. Godspeed.
Amy (01:13:16):
Yes, indeed. So I love Ruby Dixon. With regard to Disney, though, I mean, for some, for some bizarre reason, they will sometimes produce something that is just really good. And granted, I don't want to jinx anything, because supposedly there are two movies in this franchise that are being
(01:13:36):
made. And it's because of Predator. Really? Yes, there's going to be a sequel to Predator. I had no idea. There's going to be a sequel. That makes sense. Supposedly. And Prey was super, was super
Stacy (01:13:51):
successful.
Amy (01:13:52):
I tell you, it also has to do with having a great woman of color in a leading, in a semi-leading role. All of the Predator movies that have had a prominent woman of color in their movies. But anyways, moving on. Also, I'm sorry, but the first Alien versus Predator is fun, but it's
Stacy (01:14:10):
not a good movie.
Amy (01:14:14):
I'll give you that. I still enjoyed it a lot because I love Lex.
Stacy (01:14:17):
No, it's fun. It's super fun. Yeah. And she was great. I just thought that the actual AVP
Amy (01:14:24):
portion of it was pretty weak. Yeah, that didn't really need to happen, but anyways. But the other movie is supposedly called Badlands, and it takes place in the future. Oh, cool.
Stacy (01:14:36):
So, well, Predator 2 kind of took place, Predator 2 was, like, the near future. Now it's the past, because I think it was 1997, I think, was the year. I think you're right.
Amy (01:14:46):
It came out in 1990, but it was set in 1997. Yes.
Stacy (01:14:51):
Although, you know, Predator 2, I think, would be another movie that we could describe as cyberpunk-ish. Yeah. Yeah.
There was a lot of hell going on there. Well, and it's definitely late stage capitalism, like you see that.
Amy (01:15:04):
Mm hmm. Because of the gang wars and all that fun stuff.
Stacy (01:15:07):
Well, right, and, like, the cops are, you know, like, what they're able to do is extremely limited. People are being murdered and nobody cares until cops start getting murdered. What does that sound like?
Amy (01:15:18):
Anyways, so yeah, sometimes it has to do with the, the, what, what are the sub houses or whatever the heck, whatever the film houses that are, they're under the massive Disney umbrella, but it's not, right. Right, how it's, like, subcontracted out kind of thing. Yeah, something like that.
EJ (01:15:39):
Yeah, I wanna, I could also, wanna call it subsidiary, but I don't know if that's actually, there you go.
Stacy (01:15:44):
No, I think, I think it's, I think you're right, EJ. I think that's right. It's a subsi, yeah, I think it's a subsidiary. Yeah. Where it's held by Disney, but it's not officially Disney, it's not under the umbrella term of Disney, kind.
Amy (01:15:57):
It's Fox. It's Fox. When Disney acquired Fox, I guess Fox owned both of those franchises. Like how Buena Vista was in the, the eighties, where,
Stacy (01:16:05):
you know, stuff could get released that was, like, PG-13 or even R under Buena Vista. And at the time, even though everybody knew Disney owned it, it wasn't affiliated with Disney.
Amy (01:16:15):
Yeah.
EJ (01:16:16):
Sorry, I got off on a tangent. No, I think it's cool. We went down a Predator tangent and, you know, it's gonna happen, man. You know,
Stacy (01:16:23):
Amy, actually, there's a really good book that I think you would like, and it's called Broken Earth by S. J. Sanders.
Amy (01:16:29):
Oh my goodness, what could this possibly be? Could it be? Let me see. A trilogy called Broken Earth by S. J.
Stacy (01:16:34):
Sanders. By S. J. Sanders, and I think you'd really enjoy it. I don't know if I've ever mentioned it before. Not at all. This is a completely new series. Each year, this is all new. S. J. Sanders, you say? I've never heard of that name. You know, I was hoping I'd get the opportunity to teach you about that. And you, you brought it to me. So thank you.
Amy (01:16:53):
Thank
Stacy (01:16:54):
you for being complicit in your own harassment.
Amy (01:16:58):
I know, it's, the harassment comes out of love, and because you really like to read the books.
Stacy (01:17:03):
Same with EJ. Well, I do want you to read the books, but I also really like to tease you, so.
EJ (01:17:08):
So at the risk of this conversation being just all, like, boo, AI, because we did acknowledge, like, AI, it's a, it's a tool.
Stacy (01:17:18):
AI is problematic. It's extremely problematic. It's extremely problematic.
EJ (01:17:21):
In the context that it's being used in. Yes. At present. Let's talk about some positive things. Potentially positive things about AI. When is it actually useful? When does it actually appeal? I will straight up say that I see there's a lot of unharnessed
(01:17:44):
potential in how it could make the internet more accessible in the future.
Stacy (01:17:52):
So, it has the potential to be a great, like, learning tool. Yes. You know, like, because we were talking about how Grammarly uses AI, but I feel like that's a more, but at the same time, you also have to be careful with that too, because, like, a lot of people are even getting fucked over in that regard. Like Duolingo, mm-hmm, fired all of their translators in favor of AI.
(01:18:12):
What? So, yeah. It used to be, like, they had, like, six people for each language, and now they have one. And it's just to, to read the AI to make sure that it's accurate.
Amy (01:18:21):
Oh, it's
Stacy (01:18:21):
weird. So that's actually why I quit using Duolingo. Mm-hmm.
Amy (01:18:25):
Oh, I, I can't go back to using them. I was using them, but now, yeah, I was too. I fell off, I fell off the wagon. Now I gotta get back on it.
Stacy (01:18:34):
Yeah, I'm going to go. So there's another one, I can't remember what it's called. I've heard it advertised on podcasts, and I'm going to look into that as soon as I remember what the hell it's called.
Amy (01:18:43):
Obviously, if you're wanting to do language learning and you want to do it for free, check with your local library and see if they have access to either Mango Languages or Transparency Languages.
Stacy (01:18:53):
I know Rosetta Stone is supposed to be really good too, but I don't know. I know that's super expensive.
Amy (01:18:58):
That's expensive, and I don't know of any libraries that offer that. But those other two resources, either Transparency Languages or Mango Languages. Yeah, you can also find podcasts.
Stacy (01:19:09):
You can also find podcasts where native speakers will teach you. And you can also find stuff on YouTube. Mm hmm. With native speakers, native speakers teaching people, and that's also great, because you can get the pronunciation right there.
Amy (01:19:23):
Sometimes you can find funny ones, like, you know, hearing different words in different languages.
I had mentioned, basically, in our conversation before we recorded, that you could, you could probably build, with permissions, and also complete and utter transparency, an AI that could serve as an editing service, which I will call, actually, a pre-editing service. Basically, it's like the, yeah, it's like what we were talking about, Grammarly. It's, it's kind of like Grammarly, but in the extreme there, and hopefully no one's taking advantage of anything. But, again, with what should be the pre-edit, you send it to the human editor to catch any of the issues that the pre-editor had missed or even generated themself.
(01:20:07):
Itself. Cough, cough, I'm looking at you, autocorrect, cough, cough. See, I swear, in one of my papers, I, I had typed "as duck approached" when I meant, as duck, as dus
Stacy (01:20:22):
approached. Oh, as dusk approached.
Amy (01:20:24):
Yes. And of course I didn't catch it. Autocorrect didn't catch it, 'cause duck is a complete, is, it's a word,
EJ (01:20:31):
Right? Right. And no. And that, that is where, like, another, another check is really useful. And to be very clear, I use Grammarly for editing, because it does, it does a really good job pointing out, oh, hey, here is some weird punctuation crap that
(01:20:52):
you should probably change, because I'm really bad at that.
Amy (01:20:55):
Um,
EJ (01:20:56):
similar thing with, you know, it does some really good spell check. It also, I like personally how it tries to simplify my sentences, specifically because my first drafts are flowery as fuck, and ain't nobody got time to read that.
(01:21:16):
And I know that for a fact. You know, I, I say, I use Grammarly having had the privilege of time and money to have been in undergrad and had several years' mentorship, one on one, with an, with an actual writing mentor who has now passed, you know, may he rest in peace.
(01:21:37):
He was amazing for me. I still fondly remember getting up for our 8 a.m. appointments. Every single goddamn week when school was in session, regardless of the weather, I would trudge to the English department, and he would make me read whatever the hell I wrote
(01:21:58):
that week to him aloud.
Amy (01:22:01):
And
EJ (01:22:02):
it was so fucking intense. I would spend hours in that man's office, and he was, he was not cruel, to be clear, but he was not flowery at all. That man had an editor's mind, and had no problem telling a 19 year old me, the fuck you mean by that?
(01:22:27):
Which I really needed, and so he helped me figure out where I am actually quite weak in my writing, regardless of genre or whatever. And one of my things is I am flowery as fuck, and Grammarly's great on handling that. Now, I want to be very, very clear. Grammarly does have this thing called Grammarly Go, and that is
(01:22:49):
generative text. They use ChatGPT as the backbone for it.
Stacy (01:22:54):
Oh, I don't like that in the slightest.
EJ (01:22:56):
Uh, Grammarly, like, OG, that was made in house. They, once they got into the, that, their generative AI products, which are separate, but they, they, they use ChatGPT, in part because ChatGPT is
(01:23:16):
free to use. It's free as in beer, as we say in the world. So, heads up, especially for those who use Grammarly out there. I don't know about things like AutoCrit or ProWritingAid. I don't know what engines they use for that, because both of those are meant to be smart editors as well.
(01:23:37):
I would need to look further into those. I have found them, they appeal to me as potential, like, proto editing tools. But to be clear, like, there is nothing like an editor. Period. Like, a human editor. I say this not as someone who's trying to be curmudgeonly.
(01:23:59):
Hopefully our listeners can appreciate that. I have a lot more reservations about AI than simply it's coming for our jobs, right? It's like, no, it's, it's coming to wreck us. In its current form.
Stacy (01:24:15):
I think it's trying to replace humanity without any humanity.
EJ (01:24:17):
Yeah. Because, Amy, I think you bring up a really good point with that idea. And, like, you're such a damn librarian for it. I love it. I think, because I also have a master's in library information science, like, that's probably why I found myself doing data quality and governance. Because, regardless of anything I was
(01:24:42):
doing in data science, it always came down to, well, was your initial data trash to begin with? Well, that's probably why you have a trash product to end with. Trash in, trash out.
Stacy (01:24:53):
Yeah, exactly.
EJ (01:24:55):
No amount of fancy algorithm is going to make trash not trash. Exactly.
Stacy (01:25:01):
You know, it's funny, I used this euphemism just last night with a friend of mine on a phone call: if you have one pound of shit and ten pounds of ice cream and you mix them together, you have eleven pounds of shit.
EJ (01:25:11):
Oh, pretty much. And, and, in a very simplified way, that is exactly how it still works, in, in the computer world. I know there are some people who are like, but my algorithm is so good, and I'm over here like, no, it's not. And it never will be. Yeah.
Stacy (01:25:29):
Pretty much. Exactly. Because it's all,
EJ (01:25:31):
especially when your, your algorithm depends upon the data itself to learn from, like all AI algorithms do, congratulations, your algorithm is only as good as the data you run through it, right? Yep. So, until you come up with an algorithm that can come up with anything without any data whatsoever, until then, you
(01:25:57):
still need to worry about your data quality. I'm getting off my soapbox. I'm, like, imagining some sort of, like, obstinate Silicon Valley bro in front of me right now.
Stacy (01:26:08):
No, I mean, well, tech bros are the fucking worst.
Amy (01:26:11):
Don't worry, we'd be right there with you, EJ. Yep. Stacey and I are pretty damn tall. Yep. And the tech bros are probably just, you know, you sneeze and they blow away.
Stacy (01:26:21):
Yes, they all have raging short man syndrome. Even if they're not short, they have short man syndrome.
EJ (01:26:27):
And, like, and I do find it really too bad. I think there is a lot of potential out there to, to shoo away tediousness, even in the creative arts. Like, you know, again, it's, like, really basic proofreading and, and such.
(01:26:49):
Like, this tedium, I emphasize the tedium, of, of the story crafting process, I could see potential there, could see potential there.
Amy (01:27:01):
That makes sense. But my husband did bring up a good point with regards, it actually involves the gaming industry, um, with, with, with art as well. And the way he put it was, basically, now I wish I had paid further attention because it was yesterday, but basically, the idea is, um, to, What?
(01:27:23):
You want to come in? Okay, come on. Okay, he's coming. Oh my. Okay.
Husband is coming to assist with regards to the barrel business. I promise not to try and bite him.
Stacy (01:27:35):
I appreciate that. Okay.
Amy (01:27:38):
This time, you know, we're talking AI. This is my husband, JC. EJ, husband.
Amy's Husband (01:27:47):
Nice to meet you.
Amy (01:27:48):
Hello, husband. All right. So go ahead. Basically, how AI can actually help in the gaming art industry, having to do with that barrel thing you were telling me about yesterday. Okay,
Amy's Husband (01:28:01):
Yeah, so that's one of the examples, um, where, so right now, for example, in a video game, you shoot a barrel. So you have, on the one hand, you have the regular barrel, and then a designer, a 3D artist, made that one. And then the same 3D artist is gonna sit there and make an exploded version of the barrel with all the little pieces.
(01:28:22):
So anytime you shoot a barrel, it gets replaced with the pieces, and then it just gets thrown around to make it look cool and all. But let's say you want that to look even more realistic, and make that exactly where you shot it and how that explodes. You want that to be precise. Well then, of course, you're gonna have to make a physics
(01:28:43):
engine. You have to spend a whole lot of effort and time on making that happen. And that's wasteful in terms of gaming. That's, you know, wasting a lot of processing power on just that one part. But let's go even further. Let's say you're making a game and, uh, you're getting to an area of the game where the developer didn't think you'd go.
(01:29:05):
And you're trying to destroy a thing that the developer didn't think you'd want, want to destroy. Like, say, the phone on the desk. Okay. That's one of the most common things in video games. Phones tend to be, like, invincible, but now you're trying to destroy it. And no developer thought of it, like, oh, I wanted you to show
(01:29:26):
me how that, how the phone explodes. So that's one of the things where the, the, they could have AI tools, but basically it thinks ahead. And it would then actually have, you know, uh, an idea of what that would look like, to, to explode that thing for you. And save, you know, 10 developers, uh, sleepless nights
(01:29:50):
where they had to stop their game from releasing because, hey guys, we didn't make the phone break. And it's, it's this give and take thing where you want to make something really, really cool for players, but then, you know, that takes time and effort, and then players are going to do the exact opposite thing of what you thought.
(01:30:11):
And they want to shoot a phone.
Stacy (01:30:15):
Now I want to shoot a phone.
Amy's Husband (01:30:18):
Yeah.
EJ (01:30:20):
It's super hilariously common, mostly because I'm one of those jerks in the video games. Let's fuck around and find out, shall we? Absolutely. If you can't do that in a video game, where can you do that?
Amy (01:30:33):
That is a good question. What's the point otherwise? Thank you for my, my, my guest speaker. Of course.
Stacy (01:30:40):
No, I appreciate that. That actually kind of makes sense, in a, basically what he's saying is that something that can be used as a shortcut for an already existing item that was created by people to make, in this case, the gameplay. It enriches the gameplay rather than deciding that AI can create
(01:31:01):
the entire game.
Amy (01:31:03):
Mm hmm.
Stacy (01:31:04):
Right. And so it's like what I was saying before, it's a spice. It's not the whole meal.
Amy (01:31:08):
Correct. You don't want to eat a spoonful of curry powder.
Stacy (01:31:12):
Exactly.
EJ (01:31:13):
I'm thinking of, actually, the cinnamon challenge, you know. Oh my god. Everyone loves cinnamon, good old cinnamon roll, but then you actually put just straight up cinnamon in your mouth. Yeah, try that. It's not a good time. Yeah. I think about all of my favorite AI potential applications and AI
(01:31:34):
products that I am, I want to, like, mm-hmm, they're all assistive. Mm-hmm. Yes. Right. You are, you are. Right.
Stacy (01:31:43):
I feel like that's what they should be. It should be assistive.
Amy (01:31:46):
Well, I mean, like, in the medical field, there are some medical professionals that are turning to, this also is within the, the mental health area, Stacey, they're, they're using a form of AI transcription to help in recording their sessions with their patients. You still have the therapist or the doctor reading through the
(01:32:07):
transcription and making sure that everything makes sense. But the thing about those is that those are protected under the HIPAA law. At least in the U.S. In the U.S., yes, excuse me. They're protected under law, so they cannot be accessed by anyone.
Stacy (01:32:24):
Right, without a release of information filled out. Correct. Yeah. See, but that's, that's the whole point, is that's what it should be, is there should be something that assists in areas like that. I just don't see where AI, at least in its current incarnation, is truly applicable in the art, in the arts, essentially.
Amy (01:32:46):
No,
Stacy (01:32:48):
for whatever reason, that's the first place that everybody decided it should be.
Amy (01:32:51):
Yes. I don't know why, which is so weird. It's absolutely ass backwards. It truly is. When I thought of AI, I wanted it to be doing the tedious drudge work, housework, so I can be able to create things. I didn't want the, the opposite to happen. You got it. Exactly.
Stacy (01:33:12):
We have humans doing the drudgery and AI doing all of the creative.
Amy (01:33:16):
I don't want to do the drudgery. I want it to do the drudgery, so I don't have to do the laundry. Right.
EJ (01:33:23):
I want to do higher order thinking more, please and thank you.
Amy (01:33:28):
Thank you.
Great.
Stacy (01:33:30):
I want to lay in bed and think of new and increasingly unusual alien penises.
Amy (01:33:35):
I knew you were going to
Stacy (01:33:38):
say that. Hell yeah. I like wieners.
Amy (01:33:42):
It would not be a podcast episode without Stacey daydreaming about alien penises.
Stacy (01:33:48):
Yep. And I'm including fantasy in that, you know, like, I'm an equal opportunity wiener looker. Of course.
EJ (01:33:56):
So yeah, it's, I, I think that is really, like, kind of a pretty good summary altogether. Like, of course, we could go deeper into other areas that we have no business going into.
Stacy (01:34:10):
No, I mean, no, let's not. No, because, I mean, EJ, you've got a pretty comprehensive grasp on this, but Amy and I are probably at best dilettantes. I don't even, you know what? I'm not even gonna qualify as a dilettante. I'm, I'm probably just barely a dilettante. I would have to, like, I'm, I, I, I'm surface level.
(01:34:32):
I'm not even, like, shallow enough to be a dilettante. Like, I would have to apply myself to be that shallow.
EJ (01:34:40):
I, I think that is where I'm currently at. I know enough that I know that I can talk in an inaccessible way for folks. So it's been invaluable, I think, having the two of you to be like, hold on, can we translate this? Let's, let's dive a little deeper here or something.
Stacy (01:35:02):
Once more in the common tongue, please.
EJ (01:35:04):
All right. And that is, I think that is one of the biggest issues that tech actually has. It's not a progress thing. We're fine on that. It's really, to put it in the most abstract way, a communication thing. I think we use technology, but do we know what the fuck we're
(01:35:28):
even using? And I think for the average tech user, they don't, and I don't think that's their fault, quite frankly.
think that's their fault, quitefrankly.
Stacy (01:35:36):
It's not designed to be.
accessible to the average layperson.
And some of that is simplybecause it's a complex process,
but I really do believe likewith proprietary stuff, that's
again, it's a feature, not abug.
They don't want you tounderstand how it works.
Right.
If you understand how it works,you can do shit like fix it when
it breaks rather than buying anew one or paying 100 an hour to
(01:36:01):
a tech who's outsourced throughthem, you know, essentially
you'll cut off a revenue stream.
Amy (01:36:09):
All part of the hustle culture. Basically, you cannot have a hobby anymore. You cannot have a hobby anymore because, oh, are you making money on your stories? Yep. And why are you posting them? For fun?
Stacy (01:36:24):
Because I like writing.
EJ (01:36:25):
You're not exploiting your labor correctly. Apparently not. Even when your only labor is you. Right.
Stacy (01:36:33):
I am actually going to write that, like, dystopian, like, set. Now, what was I saying it was going to be? Groping at the Guillotine. I'm going to write that, that story is going to happen. And it's going to start just as everybody rises up and eats the billionaires.
EJ (01:36:52):
Nice. That being said, I feel like we can, we can wrap it up, because I, I, I feel like, um, y'all
Amy (01:37:00):
should let go of FOMO. And, and, yes, embrace, embrace the JOMO, or, what is it? Embrace, embrace the DNF.
Stacy (01:37:14):
Yes. Yes. Yes. Embrace the DNF, and life is too short for bad writing. Damn straight.