Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Erin Manning (00:02):
Welcome to the
Dead Pixels Society podcast, the
photo imaging industry's leading news source.
Here's your host, Gary Pageau.
The Dead Pixels Society podcast is brought to you by Mediaclip,
Advertek Printing, and Independent Photo Imagers.
Gary Pageau (00:18):
Hello again, and
welcome to the Dead Pixels
Society podcast.
I'm your host, Gary Pageau, and today we're joined by Dr. C. Daniel Miller, and he's the Copyright Detective.
Hi Dan, how are you today?
Dan Miller (00:30):
Good morning, I'm
great, thank you.
Gary Pageau (00:33):
So, Dan, tell me
about your journey into becoming
the Copyright Detective.
You've got a long history of being involved in copyright.
What made you decide that this is the thing you needed to focus on?
Dan Miller (00:45):
Well, I've kind of
changed focus a lot of times
during my career.
I spent a lot of time in higher education, spent a lot of time on funded projects.
I actually did a stint as the director of the NASA Classroom of the Future when the internet first came out.
So I've always been interested in technology, if you will, and I actually love change.
So to me, it's not a freight train that's going to run over
(01:08):
you.
It's a freight train to jump on board and see where it takes
you.
You may remember the housing collapse back in 2008.
In one of my stints, I was working with architects designing internet technologies for buildings and facilities, and the market crashed.
So my wife had been involved in copyright and copyright
(01:30):
clearances and all of those kind of things.
So I got involved with her when that happened and sort of redirected my focus.
So I've been doing copyright clearance projects since 2010, and been doing presentations on copyright and the basics of copyright and all that.
More recently, I was asked to do a presentation on copyright
(01:50):
and artificial intelligence, so that got me off on the tangent I'm currently on, which is trying to figure out AI and what the implications are if you use it, and there are some serious implications out there.
Gary Pageau (02:04):
So, before we get
into the implications of AI,
let's talk about some of the basics of copyright, because there's still a lot of misunderstanding, especially in the consumer world, not necessarily my audience, who are, you know, in the industry and really kind of understand copyright. But really, nothing's changed despite the change in technology.
Dan Miller (02:25):
No, basically
copyright law was designed, from the founding of the country, to encourage creativity, and the way the copyright law has encouraged that is to give the owner of the creative works a monopoly for a period of time, and now it's the life of the author plus 70 years, so, you know, your heirs or your estate can actually be afforded
(02:47):
the value of copyright protection.
It's trying to increase that. Now AI, of course, is changing the dynamics of all that, but it's all very unsettled.
It's all so new.
You know, some of the rights, the right to reproduce your work, the right to make a derivative of it, you know, that's where some of the challenges are coming in now when AI is getting involved in that.
Gary Pageau (03:07):
So yeah, let's talk
a little bit about the
derivative work piece, because that comes up a lot.
Yes, where you would have, okay, let's say, for example, I'm running a camera store, right, or a photo lab or retail, and somebody walks in and they want to put someone's school picture, and they've added some text to the picture, they've modified it slightly or they've cropped it or
(03:29):
whatever, and they want to put that on a t-shirt or a mug.
That's a copyright violation, right?
Dan Miller (03:35):
Yeah, because
they're making a derivative from
the original.
It's two levels of copyright infringement.
First of all, they didn't have the right to do anything with the photograph that someone else took; that belongs to the studio.
Or if you or I took the picture, we actually own the rights.
So possession of the photograph does not give you any rights, right?
And then it's a double whammy if I modify that.
(03:57):
So let's say I take a picture, you send me your picture, I paint on a beard and give you glasses and change the color of your shirt.
I made a derivative of that photograph.
So that's another level of infringement, if you will, because only the copyright owner has the right to make a derivative of the original.
Gary Pageau (04:15):
But what about people saying, I'm doing commentary or something like that?
That's a free speech application.
How does that come into play?
Dan Miller (04:26):
Well, there are some exceptions, and that's fair use; that's really what we're talking about. Doing a critique is one of the things in the language of the law that allows you to use a copyrighted work, if you will, to make a critique, news reporting, scholarly works and all those kinds of things.
That's an exception, really, and fair use is not a right.
(04:46):
It's actually a legal defense when somebody sues you for copyright infringement.
Gary Pageau (04:51):
Right.
So the example I used would not be considered fair use, even though I was saying I did something to the picture.
It really comes under the derivative works idea as opposed to the fair use idea.
Dan Miller (05:04):
Well, fair use is so
confusing that I hate to
generalize the statement.
Like I say, it's decided in the courts, but there are limits to fair use, and right now that's the defense that AI companies are actually using for all the stuff they have copied without compensation, without permission.
(05:26):
So they're using a fair use defense.
Unfortunately, it's going to be a decade or longer before these things work their way through the courts, and ultimately my prediction is the Supreme Court will make decisions, and there's going to be decisions both ways, because there are so many cases.
There are 40 cases right now on the books for copyright infringement, so hopefully I'll live long enough to see a few of
(05:48):
those play out.
Gary Pageau (05:50):
So let's talk a
little bit about that, about
where that infringement occurs.
So, in a generative AI application like ChatGPT or something, I would type into a prompt, polar bear surfing with a bottle of wine in his hand, and it creates it, and what it's doing is it's got a
(06:14):
language model that it's been trained on on the back end, and those images had to come from somewhere, right?
And is that where the infringement possibly could happen?
Dan Miller (06:26):
Well, let me back up.
It's real clear what the US Copyright Office's position is right now, with all its decisions, and that's: if you start with a prompt, and that's all the human involvement there is in generating an image, that image is not copyrightable, right, and AI cannot claim a copyright on it, because AI is a machine, it's not a human being, right?
(06:48):
So it's just out there.
So if you create the image of the polar bear and you publish it, so can I, because you didn't own it, the machine didn't own it.
It's just in the public domain, right.
So that's the first issue.
They've taken a very hard line.
Now, there's only been one exception.
I don't know if we'll have time to dig into it, but there has
(07:12):
been an image that was created only using AI platforms.
It's by the Invoke platform, if you will, but that's an exception.
That's out there.
Now, the infringement part: yeah, AI trained on images that it got everywhere.
Now, can I actually find the polar bear that someone produced that's showing up in your image?
That's difficult, because there are probably, you know, there
(07:33):
are thousands of pictures of polar bears that AI has trained on.
So the way I like to think of it is, all that stuff has been run through the wood chipper and there are many pieces out there, and AI has labeled all of those, and there's a code for polar bear, big polar bear, small polar bear, you know, whatever, walking, running, standing, swimming.
So finding the infringement piece is very difficult unless
(07:55):
you can actually find an exact image.
Now, Perplexity actually is doing something; the initials are RAG, retrieval-augmented generation, and they are actually going out in real time when you do a query, and they will pull an image off of a website.
So that's one place you really have to be careful, and in my
(08:16):
view, they have clearly committed copyright infringement, because I can go ahead and find that image. They have clearly infringed.
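To make that distinction concrete, here is a minimal, hypothetical Python sketch of what retrieval-style image handling amounts to. This is not Perplexity's actual code; the URL and function name are placeholders. The point is simply that the image bytes are fetched and reproduced verbatim at query time, which is why it reads as copying rather than generation.

# Hypothetical illustration only, not any platform's real implementation.
# Fetch a page and save an exact copy of the first image it contains.
# Assumes: pip install requests beautifulsoup4
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def grab_first_image(page_url: str, out_path: str = "retrieved.jpg") -> str:
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    img = soup.find("img")                   # first image on the page
    if img is None or not img.get("src"):
        raise ValueError("no image found on page")
    img_url = urljoin(page_url, img["src"])  # resolve relative links
    data = requests.get(img_url, timeout=10).content
    with open(out_path, "wb") as f:
        f.write(data)                        # byte-for-byte copy of someone's image
    return out_path

# grab_first_image("https://example.com/polar-bears")  # placeholder URL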
Gary Pageau (08:20):
It's not generating an image, it's just sourcing it from somewhere else and copying it.
Dan Miller (08:25):
They copied it and fed it back to you.
Gary Pageau (08:26):
That's infringement, clear and simple.
In the part of generative AI where you have the large language model, and we'll stick with the polar bear example, why are people upset about that?
Let's flip it on its ear.
Let's say you've been shooting stock photos of polar bears for
(08:46):
years.
Why do you even care if a model is being trained on your images?
Dan Miller (08:55):
Well, you don't know
what piece of your work is
actually showing up.
That's part of the problem.
Again, if you think about the quantity of stuff that's out there, Anthropic has been in the courts.
They actually got over 7 million books, for example, from pirate sources, and they're going to court for piracy now
(09:16):
for that.
So again, the problem from the creator's point of view is, how do I know that that piece that I see in what you generated is actually coming from me?
That's the difficulty, right? Now, if somehow the generation of that polar bear image infringes on my ability to make a living with that, that's the basis of a lot of these lawsuits, and that's one of
(09:38):
the things that the plaintiff is going to have to prove: did that somehow impact my livelihood?
Gary Pageau (09:44):
But is copyright
necessarily tied to livelihood?
Dan Miller (09:49):
It doesn't have to be, right?
That's what I'm saying.
Yeah, you can use it in marketing all day long if you're not trying to copyright whatever you produce.
What's the issue?
It's more a matter of, you know, you're taking work away from me, so that would be how I'd approach it.
Gary Pageau (10:04):
But the way the law
is written it's not necessarily
to protect the financial interests.
I mean it is, I guess, from the way you explained it earlier.
But I guess what I'm saying is, is the end goal of someone who's upset about their polar bear images being used in a language model that they're not getting that, that there could have been a time where their polar bear could have been used in
(10:25):
that picture, in that composition?
Dan Miller (10:29):
Yeah, the thing about loss of income is one of the fair use factors.
So this goes back to the whole fair use argument.
What right does an AI platform have to scan anybody's work that's out there without compensation and without acknowledgement?
Either one.
And again, part of it is the lack of acknowledgement.
So that's plagiarism.
I take your stuff, I use it, I don't get your permission and I
(10:52):
don't give you credit, you know.
That last part is plagiarism, and that's what I see coming down the pike, this whole thing about AI disclosure.
The Copyright Office requires that if you're claiming a copyright, if you're trying to register, and now Amazon KDP has a requirement for you to disclose AI-generated content, and if you don't do that and they discover it later, they can
(11:15):
pull your book off the shelf.
Okay, so you know that has some teeth in it in terms of requiring the disclosure.
And again, if you don't disclose, you're implying that you own it, you know.
So that's being dishonest.
So that gets to the heart of it on another plane, if you will.
Gary Pageau (11:32):
Okay, so let's talk
a little bit about enforcement
actions.
You said there's 40 lawsuits, give or take, happening at any given time, it seems like, on this, yeah.
So what is the point of a lawsuit, in the sense that, do you think you're going to be remunerated?
Do you just want to stop the behavior or get acknowledgement?
What's the objective of some of the folks with the lawsuits?
(11:55):
All of the above?
Dan Miller (11:57):
But copyright law has some serious teeth.
So the case against Anthropic, where they pirated 7 million books: if the damages are found at the maximum level, it's $150,000 per infringement, plus court costs if you do it properly.
So if you take seven million times $150,000, it's a trillion
(12:20):
dollars.
That's with a T.
So that's some serious change, I don't care who you are.
Now the judge has already said, well, it can't be that, that's ridiculous, but the conversation is left open to the billions of dollars in fines for that copyright infringement, if that's proven.
Disney and Universal have now sued Midjourney.
(12:41):
Midjourney is one of the big producers of images out there, and if you look at the court filing, they have 199 instances.
They have a Homer Simpson, and then you see Midjourney's Homer Simpson.
You can't tell the difference.
There's C-3PO from Star Wars in there.
One of them has longer legs than the other one; beside that, you can't tell the difference.
(13:01):
So Midjourney, they're up against some heavy hitters with Disney and Universal Studios, and it's the first time a studio has sued an AI platform.
That just happened in June.
So you know, there is some serious reproduction there.
If you do win a suit of copyright infringement, and an individual can bring a suit like that too, now you have to have
(13:22):
deep pockets to keep going after these big boys that are, you know, the AI platforms, because they have some serious backers like Microsoft as well, sure.
Gary Pageau (13:30):
So let's go to kind
of the liability on the output
side.
Let's say, for example, I go to Midjourney, I get a C-3PO, I put it on a coffee mug and I start selling it in my store.
What's my liability there?
Dan Miller (13:49):
I'm not an attorney.
I can't really get into that.
Gary Pageau (13:52):
Sounds to me like
there'd be a liability there,
though.
Dan Miller (13:55):
But if you look at
the terms and conditions of a
lot of these AI platforms, they say you are responsible for how you use the output.
Okay, they just put the monkey on your back, right?
So, you know, if you use AI, and that's part of what I try to do, I tell people, you know, these are the dangers, and here are the advantages and here are the disadvantages of using AI, and
(14:19):
you need to know you may be opening yourself up for liability as well.
Gary Pageau (14:23):
Yeah, because that's where I think, you know, the thing is, people think, oh, it's on the Internet, I can use it, and, you know, I can do something crazy with it and have fun, and maybe you can.
But if you start, you know, making products out of it, then that's a challenge, right?
Dan Miller (14:39):
Yeah, when you get
caught, that's when the
challenge is.
A lot of people get away with it for years or forever, but I'm very conservative.
I'd rather ask permission than spend one day in court trying to defend myself when it's indefensible.
Gary Pageau (14:55):
Yeah.
So I think that's one of the things people struggle with in general, you know, especially in the volume photography, the school photography space that I'm involved in.
What's getting me, I think, for a second, is just people, you know, screenshotting; consumers are screenshotting the pictures that
(15:16):
are on their screen, and then they're using them, they're violating the copyright.
Then they try and put watermarks on the picture, and then there's all these little Facebook groups that are removing watermarks, and it just seems like, well, why should I spend time even chasing that for a $5 picture?
Right, it's a challenge.
Dan Miller (15:37):
Yeah, and part of the problem with AI is there are no guardrails on AI yet, right?
Gary Pageau (15:42):
But even then, for even just basic copyright infringement, right, because of digital technology it's still a challenge.
Dan Miller (15:50):
Oh, it's easy.
It's easy to do.
It's too easy to do.
That's part of the problem.
Yeah, I see it all the time, even on LinkedIn.
People will respond back and they'll slap an image up there, or they'll put a news article in their LinkedIn post, for example, and they don't realize they just committed copyright infringement.
Gary Pageau (16:08):
So on the creator
side, what's been happening with
like copyright registration, and how does that protect you?
Dan Miller (16:16):
Okay.
So you basically have a 90-day window in which you need to apply.
You need to apply for a registration because you cannot bring a lawsuit until you actually get your certificate of registration, or actually apply for the registration; then, once you get your certificate, you can bring a lawsuit.
So if you do it within that 90 days, that's when you can get
(16:38):
statutory damages, and that's where the maximum of $150,000 per infringement comes in, plus reimbursement of your attorney's fees if you win the lawsuit.
So that's a big difference.
So you have to go through the application process to be able to actually take advantage of the protections that you have.
(16:59):
Your work is protected from the moment you put it in a tangible form, but you can't do anything about it until you've actually filed a registration.
Now, the procedure is messy.
If you try to register it on eCO, the US Copyright Office's online system, it looks like you're in the Pac-Man generation from the graphics and everything.
(17:19):
They're changing it, and hopefully it's going to be a lot better, but I don't know when it's going to be coming out, and the AI piece of that is very confusing as well.
You know, they want a simple explanation: I used AI to generate an image.
They want some simple language, and then you can add supplements that you want to actually explain it.
(17:40):
So when I did that PowerPoint presentation on copyright in the age of artificial intelligence, I actually filed an application to register that, and I actually spent a couple of hours with an examiner from the US Copyright Office, which astounded me that they would do that, and basically he was trying to pull out of me how I used AI. I didn't have the
(18:01):
language at that time to explain it simply; I do now.
So essentially my PowerPoint presentation is copyrighted, along with the verbiage that I used in the presentation.
So it can be a kettle of worms as you go through the process to actually give yourself as much protection as you can by registering the work.
Gary Pageau (18:19):
But you can, for images, I think you can do bulk registration, yes, absolutely, which I think is important.
It's a whole lot cheaper.
Dan Miller (18:29):
I did a single image at $65, and I don't know whether you can do 10 or 100.
I don't know what the quantity is, but you can do large batches, even things like blogs.
You can also register a group of those at one time.
So that's the way to go: at least give yourself something that you can prove that you own it.
That's another reason for doing the copyright registration:
(18:52):
you have legal documentation that you own that copyright.
Gary Pageau (18:57):
Well, yeah, that is
the challenge, because I think
someone can be found to have violated your copyright, but you can't collect damages if you haven't actually registered it.
Dan Miller (19:07):
You can collect
damages, but they're minimal.
You have to prove that, you know, I lost this much income because you did that.
Well, what's that going to be?
You know, 50 bucks, 100 bucks?
It's going to cost you more than that just to file the lawsuit, right?
Gary Pageau (19:19):
Where do you think
this is going in the near term
in terms of how people will be using it?
Do you think there's going to be, like, disclaimers on the front end of these platforms to coach people?
Or are they just taking the idea that, well, you know, we're here to generate revenue, and it's the Wild West, and we'll let
(19:40):
it go and we'll let the courts settle it out?
Dan Miller (19:49):
I'm afraid it's the
Wild West right now.
There are platforms out there that actually license the content that they use to train on, absolutely.
So I think trust is the big factor, and that shows up in one of the problems we have right now with AI.
It's a thing called AI hallucinations.
They will give you an output that looks feasible and everything else, but it's garbage.
They make stuff up.
So now, anytime I do an AI prompt in there, I'll get an
(20:12):
answer back, and I'll ask it what's its confidence level in what it output, and it'll actually break it down: well, I'm 73% confident that blah, blah, blah, but I'm only 40% confident in something else.
And a key to this whole thing is the whole Reagan thing, he said it: trust but verify.
And now I say don't trust, and always verify anything you get
(20:40):
out of AI.
So I look at the sources, and I have one example.
I was trying to find the hourly rate of a CPA, and I put it in and it gave me an answer.
And I looked up one of the sources, and the best way I can describe this, it was a chat group from Bubba's Bait Shop in Biloxi, Mississippi.
I don't want to make any decisions based on a chat group in Biloxi, Mississippi, from a bait shop.
So check the sources of all the stuff that you get out.
(21:01):
AI hallucinations are getting worse, not better.
There's a study I found that for the big major platforms, 30% of the output is hallucination, and in new AI platforms it's as high as 79%.
So if you think about that, 79% of what they tell you is wrong.
So you know, be very careful if you're using AI content.
(21:23):
Now it's a tool.
I use it every day.
I use it to find, when I'm trying to get copyright permission or something, I'm trying to find somebody's contact information, and I'll go in and I'll use two or three platforms, starting with Google a lot of times, and I'll compare the answer of one platform versus another platform.
Gary Pageau (21:40):
Sure. From an imaging standpoint, is there a resource for AI engines that are using licensed content?
Dan Miller (21:52):
Again, I would put in an AI prompt and ask for it.
Gary Pageau (21:55):
That's true.
Why didn't I just ask an AI for that, right?
Dan Miller (21:58):
It would be 79%.
Gary Pageau (21:59):
Wrong though.
Dan Miller (22:01):
Yeah, well, that's why I use several of them.
I use Perplexity, I use ChatGPT, I use Gemini, I use Claude.
So I'll ask the same question to all four of those, and I'll ask them what's their confidence in each one, and then I'll sort, and then I'll go to the sources that they cite to try and find the right answer.
(22:23):
So again, don't trust, and always verify.
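As a rough illustration of that cross-checking habit, here is a minimal, hypothetical Python sketch. It is not Miller's tooling; the model names and the wording of the confidence request are assumptions, and you would swap in whichever platforms you actually use (the OpenAI and Anthropic Python clients are shown as examples).

# Hypothetical sketch: send one question to several assistants, ask each to
# self-report confidence and sources, then compare the answers yourself.
# Assumes: pip install openai anthropic, with OPENAI_API_KEY and ANTHROPIC_API_KEY set.

SUFFIX = ("\n\nAfter your answer, state your confidence as a percentage "
          "and list the sources you relied on.")

def ask_openai(question: str) -> str:
    from openai import OpenAI
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # model name is an assumption; use what you have access to
        messages=[{"role": "user", "content": question + SUFFIX}],
    )
    return resp.choices[0].message.content

def ask_anthropic(question: str) -> str:
    import anthropic
    client = anthropic.Anthropic()
    msg = client.messages.create(
        model="claude-3-5-sonnet-latest",  # model name is an assumption
        max_tokens=1024,
        messages=[{"role": "user", "content": question + SUFFIX}],
    )
    return msg.content[0].text

if __name__ == "__main__":
    question = "What is the typical hourly rate for a CPA in the United States?"
    for name, ask in [("OpenAI", ask_openai), ("Anthropic", ask_anthropic)]:
        print(f"--- {name} ---")
        print(ask(question))
    # The answers, confidence figures, and cited sources still need human verification.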
Gary Pageau (22:24):
It almost seems like that's almost more work, I think, than a straight search.
Dan Miller (22:30):
If you think about
the old Google search, I might
get 10 million hits.
Well, by the time I get down to page three on those, I'm out, forget it.
But at least with AI, they're going to take those 10 million and they're going to boil them down into 20.
Right, so if I'm using four platforms, I might have 80 different sources that I'm looking at instead of 10 billion.
It saves work in that regard, but I get a lot of very useful
(22:52):
information.
I use it to brainstorm ideas.
If I get stuck on something, I'll run something by it and get a reaction from it, and a lot of times it'll jar something loose in my head and let me move forward with that.
So it is useful.
It's a tool.
Tools can be abused, though.
It's like a hammer: you can build a house with a hammer, or you can tear a house down.
Gary Pageau (23:14):
So in that sense,
we're all carpenters trying to
learn how to use AI as a tool. You know, one of the things that I think is kind of causing our consternation is, of course, the way it's being used.
But you know, people in the photo world, people have been manipulating images for years, I mean for hundreds of years.
Actually, go back, you know, to either staging photos or
(23:36):
dodging and burning and cropping negatives.
And then when Photoshop came, that was a digital way to do it, and there were people who were going to say that, you know, photography was over because everything could be done in the computer.
And then, now that AI is here, they're saying it again, and, like you said, it's just a tool.
It's not going to replace, like, real people. It's an exciting world.
(23:57):
Yeah, stay tuned. So if people wanted to get more information about what you do and learn more about, you know, kind of the copyright advice that you have, where do they go?
Dan Miller (24:13):
It's easy to remember: thecopyrightdetective.com.
Gary Pageau (24:18):
Awesome. And you're available for consulting, and for people to just learn more about what you've researched.
Dan Miller (24:25):
And there's actually a page on there.
The workshops page has presentations that I've done, other podcasts.
It has recordings of some of those as well.
So if they want to find out, and I'm doing blogs, you can't keep up with all the changes in AI, so I started doing blogs.
So that's an easy way to find out what I'm up to, and just hit the contact us and pop me a note.
(24:47):
I'll reply back.
Gary Pageau (24:49):
Well, thank you much, Dan. Great to talk to you about AI.
I'm sure we're going to have this conversation again in the future when all this stuff continues to change.
So thank you so much for your time.
Dan Miller (24:59):
You're quite welcome, enjoyed it, thank you.
Erin Manning (25:01):
Thank you for
listening to the Dead Pixels
Society podcast.
Read more great stories and sign up for the newsletter at
www.thedeadpixelssociety.com.