Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:04):
Hello, and welcome to another edition of A Lot of
Help with James Jr. Who am I? I'm James Jr.,
and I have ideas, and I'm here to help. But
this week she is here to help you, because she
is actually a creative and AI specialist. She's out there
doing tech stuff in the AI space and AI stuff
in the tech space. You're doing both,
and we're becoming fast friends, and
(00:26):
I thought, I didn't want to get this wrong, because
I want to try an exercise with her, and I
want her to explain some of the stuff to you out there
that's going on. That's the truth, and that's
what's going on. What's going on? Help me welcome
Caroline Holden. Hey, girl.
Speaker 2 (00:39):
Hi, Hi James, thank you so much for having me.
I'm super excited to be here.
Speaker 1 (00:43):
Yes, okay, so I'm going to do this. You asked
me to do something beforehand, and so this is
part of the exercise. We're going to start this off:
I went to ChatGPT and asked who she was.
Speaker 2 (00:57):
Yes, yes, I did ask you to do that.
Speaker 1 (01:00):
I did it. It's a little bit of nothing; it's a paragraph.
Speaker 2 (01:02):
I made sure to include my middle name right.
Speaker 1 (01:04):
Yes I did, and this is what I came up with.
And so this is an exercise, folks. This is
directly related to our talk; she's gonna
dissect her whole thing. So here we go, kids.
So I said, can you tell me about Caroline Swift
Holden, who lives in Minnesota? It said: Caroline Swift Holden lives in Minnesota,
(01:30):
but she works in the fast-moving world of artificial
intelligence. In her early thirties (it's telling all your business,
in your early thirties), Caroline is sharp, curious, and a
little rebellious. I see that. Yeah, I see it. I
told you. See what she said? Like, oh yeah, I
see that. Someone who's just as comfortable in a sleek
corporate meeting as she is coding late at night at a
(01:53):
cozy coffee shop.
Speaker 2 (01:56):
Okay, sure, Chat. Sure. What's next?
Speaker 1 (02:00):
Here is the next one. She grew up in a
smaller town outside of Minneapolis. No? So we talked about that.
The daughter of a librarian... no? No. And a mechanical engineer. No.
So it says your middle name, Swift, came from her grandmother.
Speaker 2 (02:23):
Almost, but not quite. Not quite.
Speaker 1 (02:27):
Who always said Caroline would move fast and think faster. No?
And she did.
Speaker 2 (02:35):
That.
Speaker 1 (02:35):
It said you won state science fairs.
Speaker 2 (02:41):
Okay, it made up a lot of things. This is funny.
Speaker 1 (02:44):
Graduating early.
Speaker 2 (02:46):
No. Okay, I see what happened here. So it just
made things up. I mean, maybe there are some things
in there that are real. But I think if
you asked it to, like, search the internet for what
it can find about you...
Speaker 1 (03:00):
Let me just read this first, and we're gonna get into
that whole thing. This is kind of fun. That's why I
wanted to read it. It's hilarious, man. It says she launched
herself into the tech world. Caroline specializes in ethical AI development.
Speaker 2 (03:16):
Yeah. Sure right.
Speaker 1 (03:19):
She's deeply passionate about making sure the future of AI
is transparent, fair and accessible.
Speaker 2 (03:24):
Yeah.
Speaker 1 (03:26):
She often speaks at local universities.
Speaker 2 (03:31):
I'm giving my first talk at the university on Monday.
Speaker 1 (03:34):
Sure. I love your reactions. I love your reaction.
Back to this coding thing: it says she volunteers to teach coding
workshops to girls.
Speaker 2 (03:44):
No, I do not teach coding workshops. I don't
know how to code, but...
Speaker 1 (03:49):
It says she occasionally writes thoughts about the human side of technology.
Speaker 2 (03:54):
I do that a lot.
Speaker 1 (03:57):
That you do, that you do. Right. That's kind of funny. In her personal life...
Speaker 2 (04:01):
My personal life? What's going on in my personal life
in this alter ego version of me?
Speaker 1 (04:05):
Well, you have a dry sense of humor. Which you do, yeah.
And you love kayaking on the lakes.
Speaker 2 (04:14):
I like kayaking. I don't kayak often. That's not like
a regular thing. I think more accurate would be sailing.
I'm a bigger... did it have sailing too?
Speaker 1 (04:26):
I don't know how to sail. When I was thirteen
years old, I had a big boat... Anyway, it says she's a big book person
and has a soft spot for old vinyl records.
Speaker 2 (04:39):
I wish I did. That would make me so much cooler.
Speaker 1 (04:42):
Yes. The last thing it says is she's currently working
on a project that combines AI and storytelling. Kind of right,
trying to build an engine that helps people create interactive narratives.
Speaker 2 (05:00):
No, unless you disagree, I don't know.
Speaker 1 (05:05):
So I read it as is, because now
I want you to break it down. Because folks are like,
AI is just too smart. Clearly it isn't. But there's
a reason why you have to prompt, and so I
want you to talk about that, because people are like,
what do you mean, it can work for you? I
did this basically as, tell me about who she is. But
how could I have made this better and more accurate?
Speaker 2 (05:27):
Okay. So what was the prompt that you used again?
What was it?
Speaker 1 (05:31):
Tell me about Caroline Swift Holden, who lives in Minnesota.
Speaker 2 (05:34):
Gotcha. Okay. And it got the AI stuff, which is interesting,
so it must have known something about me. But
it made up stuff, and I get that; it probably
was like, oh, I want to know more about Carolyn's childhood, and
that's not on the internet, so it makes sense
that it would be confused about that. So I'm a...
(05:55):
I'm a little bit of a mystery. Nobody knows anything, especially not AI.
But yeah, I think the big thing is: sometimes I
will ask... I've done this before for my podcast, when
I'm trying to write a summary or a description
of a guest, I will ask ChatGPT, can
you tell me about this person? But usually I know
enough where I'm like, okay, this is made up. But
(06:17):
sometimes if I just ask, oh, what can you tell me about...
so in this case, Carolyn Swift Holden... it just kind
of... it might say something it knows, or it'll
make something up. If you say, can you search online
and tell me what you find about Carolyn Swift Holden,
that would have given you a lot more, because
(06:39):
it's... yeah.
Speaker 1 (06:41):
We're gonna do this. Give me the exact prompt.
And I've noticed this before; I'm like, that's
why I was so excited for you to be on
the show.
Speaker 2 (06:48):
Yeah. And also, I noticed... so I don't know how
much ChatGPT knows about me being in Minnesota. It's possible
it does, but I would actually just cut that out,
because I wonder... a lot of the things that
it put in there were very Minnesota-coded things, right?
Like loves to go kayaking on the lakes; that's
a very Minnesota activity. And then there were a
(07:09):
few others in there, like the vinyl records; people
are really into vinyl here. So I think there
were some things where it was like, oh, that's a
stereotypical Minnesota thing. So just cut that out. Just
ask it: search the internet, tell
me what you find about Carolyn Swift Holden.
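If you are doing this through code rather than the ChatGPT app, the same trick, telling the model to search instead of answering from memory, maps onto enabling a search tool on the request. This is only a rough sketch assuming OpenAI's Python client and the web-search tool of its Responses API; the model name and tool name are placeholders taken from the documentation at one point in time and may differ in current versions, and other providers expose the same idea differently.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

# Without a search tool, the model answers from memory and may invent details.
# With a web-search tool enabled and a prompt that explicitly asks it to search,
# the answer is grounded in what it actually finds online.
response = client.responses.create(
    model="gpt-4o",  # placeholder model name
    tools=[{"type": "web_search_preview"}],  # tool name as documented at the time of writing
    input="Search the internet and tell me what you find about Carolyn Swift Holden.",
)
print(response.output_text)
```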
Speaker 1 (07:27):
Folks, we're doing this live... well, live for us, right now.
"Search the internet"... see, that's why I'm learning stuff from her,
I did that on purpose... "and tell me..." So literally,
what you're saying is, type how you talk, almost like
you would say it to people, right, on chat? Yeah. Because
people think it's some magical
(07:48):
thing. I'm like, no, it's like, just
tell it directly what you want, right? Yeah.
Speaker 2 (07:52):
One of my favorite things that I've been hearing from
big tech CEOs... a lot of big tech CEOs are saying, oh,
don't major in
computer science, you should major in English instead, because just
being an excellent communicator is much more important and it
works with natural language, right like, it wants to hear
(08:13):
how normal people speak, not like a Google search or
SEO where you're trying to be so particular.
Speaker 1 (08:20):
So this is... this is a paragraph. I'm
gonna read the whole thing. Let me read the paragraph. Yes: Caroline Swift
Holden is a multi-faceted creative technologist, entrepreneur,
and content creator with a diverse background spanning startups, venture capital, AI, comedy,
and filmmaking.
Speaker 2 (08:37):
There we go. Here we go.
Speaker 1 (08:39):
She has built products, strategies, and content to help startups
and VCs scale. They mention your career highlights,
comedy and filmmaking, your online presence and where you are: Substack, YouTube,
social media. This is way more, way more you. I
love it.
Speaker 2 (08:56):
Yeah, that is all accurate. All that I've heard so far,
that's all true.
Speaker 1 (09:02):
A lot of us don't know how to use this. Yeah.
What if they just stopped where I stopped? Well,
that doesn't sound like her, and I know that's not
her at all... I'm screwed. There's a valuable lesson for
you guys out there. They call it prompts, I guess, right?
They call it prompts.
Speaker 2 (09:20):
Yeah, AI prompting or whatever, Yeah, prompt whatever you want
to call it. But it doesn't matter. It doesn't matter.
Speaker 1 (09:29):
So actually, I know that because I talked
to her. I talked to her off camera, so even I knew.
I was like, oh yeah, that is her. We talked
about that, that, and that. So it's much more accurate.
So, like, it doesn't want to be
so formal, it wants to be informal. Is that one
way of thinking about it, that we should talk to it that way?
Speaker 2 (09:45):
Yeah, I mean the more important thing is to be direct, right?
Like, if you're ever prompting an AI like ChatGPT
or Perplexity or Claude or whatever it is that you're
using, there are definitely strategies that you can use,
but just being very direct and clear is the
most important thing. And then also, sometimes something that
(10:07):
will help is if you say exactly how you want
it to approach things, or what kind of a person
you want it to be. So you can tell
ChatGPT, you are an AI expert and you know everything
about AI, can you break down what AI prompting is
(10:28):
to me like I'm a five-year-old? And then
it will break it down at a much easier-to-understand
level. But you're also going in there with the
context of it being an AI expert, versus just,
tell me what AI is, right, or how to do
AI prompting.
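For anyone who wants to try this outside the chat window, those same two ideas, a role instruction plus a direct plain-language request, map onto one short API call. This is only a minimal sketch assuming OpenAI's Python client; the model name is a placeholder, and the same pattern works with other providers' SDKs.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name; swap in whatever you use
    messages=[
        # The role prompt Caroline describes: tell the model who it should be.
        {"role": "system", "content": "You are an AI expert and you know everything about AI."},
        # The direct, conversational request, including the audience level.
        {"role": "user", "content": "Can you break down what AI prompting is to me like I'm a five-year-old?"},
    ],
)
print(response.choices[0].message.content)
```

The system message is the "what kind of person you want it to be" part; the user message is the same plain request you would otherwise type into ChatGPT.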
Speaker 1 (10:45):
Oh, you said something when we were talking off camera, while
we were talking about this, that I
think is very fascinating, and I want to get
into it a little bit. Because you work,
and have worked, in the tech field, and they view
information a certain way, blah blah.
(11:06):
But there's a whole other side that the rest of
us are looking at, and I feel like you help
bridge the gap. Can you talk about that a little? Just
kind of like, they're viewing AI a certain way, we're
viewing AI a certain way, and you're like, here it
Speaker 2 (11:17):
Is, yeah, absolutely. So I like to describe myself, as
Chad GBT said, as a creative technologist. And that's because
I started my career in the arts and then pivoted
into tech and startups later. So I was a film
major in college. I got my first job in advertising
at McCann where I worked for some big mega brands there,
(11:39):
and then I quit to go be a comedian. And
so I was writing, performing, making web series things like that,
and I placed in various national competitions and all that
jazz right, And so growing up I was a highly
creative person working in the arts. I spent some time
in LA I was in New York for a long time.
All that jazz, right? And I ended up in, uh,
(12:03):
tech by accident. I turned
twenty-six, tragically, and had to get health insurance. I
was like, what am I going to do? And I
got a job at a venture capital firm, so a
financial firm that invests in startups of all places, and
I fell in love with entrepreneurship, and I think a
lot of that has to do with people who are
making art and people who are building startups are just
(12:25):
very similar kinds of people, and there's more money in it.
So that was exciting. But what I have been noticing
is there are at least two, if not more, completely
different conversations when it comes to AI. So in a
lot of the tech space, not all of the tech space,
but in a lot of the tech space, and people
(12:46):
who work in AI, they are so excited about AI.
They're talking about all the ways it's going to change
the world for the better, and how you can do
this and this and that and all of these really
incredible things like being able to improve research in medical
spaces right, or in scientific research, or being able to
(13:07):
get rid of a lot of administrative headaches that people
don't want to deal with that kind of stuff, right.
And then of course they're excited that they are able
to create images or maybe like a video or text
for the very first time, as these are not necessarily
the most you know, creative individuals. But then on the
creative side, what's really interesting to me is the conversation
(13:29):
is completely different. It's very, very negative, and I completely
understand it. Really why again, because I've been in both
of these worlds.
Speaker 1 (13:36):
Right.
Speaker 2 (13:36):
So, on the creative side, we have a lot of
people who are really scared that AI is going to
take their jobs. We have a lot of people
who are extremely concerned about the energy impacts that AI
has on our world, which makes a lot of sense.
And then there is the whole copyright aspect of it
(13:58):
all: that, yes, our data was stolen. It's true
that happened, and they view using any AI tool as
maybe not everybody, but there seems to be a movement
of if you use an AI tool, you are an
unethical person. And so I'm looking at this situation and
(14:19):
I'm kind of like, wow, Okay, We've got two completely
different conversations. And with social media, we're in our little bubbles, right,
people are not seeing eye to eye. So the tech
people are so confused about why creative individuals are so angry.
They don't get it. They're like, don't you see that
(14:39):
we're able to create personalized medicine. Don't you see that
we're able to make major impacts on fixing the climate crisis.
Don't you see that we're going to be able to
do all of these really cool, incredible things, and then
on the creative side, people are seeing lots of job
loss, right? And... yeah. So I think that those
(15:00):
are kind of like the spaces that I have been
trying to build bridges for because at the end of
the day, AI is just software. Like it's so funny
to me because depending on who I'm talking to, I
either will say that, you know, AI is the most
amazing thing that's ever happened, or it's really not that
big of a deal. Guys, it's really not that big
(15:21):
of a deal.
Speaker 1 (15:23):
Right. And everything you just said, you've said it perfectly,
because I'm in different spaces too, just from
being an interviewer, but I'm also a creative. I was
marching a couple of years ago out there because
we couldn't work in certain spaces. But then also I'm
reporting on it, and then my colleagues are using it, so it's like, I don't know what
(15:46):
you mean. I'm like, I'm caught in like the sandwich generation,
like we're somewhere in between, like we're sandwiched between
we're scared and oh my goodness. And I said, don't be scared.
It's one or the other, like, don't be scared... scared?
Scared, scared. Like you said, it's just
it's just software. So how do I, like... but some
(16:08):
folks think it really isn't just software. They're scared, they
really are scared. So what do you say? What more
do you say to them? Like you helped me last time
while we were on the phone. How do you, how do we talk to people about that,
creatives especially?
Speaker 2 (16:22):
Well, it depends on what people are focused on. And
I think, first things first, it's
explaining a little bit more about how the software works.
I'm not gonna... I don't know if I'm going to
dig into that too much today, maybe if we have time.
But let's address kind of like the biggest concerns that
we're seeing on the creative space. So talking about AI
(16:43):
taking jobs, I think that's the first one. I think
that's probably the biggest one that is really scaring people,
especially because a lot of the work that is in
AI that's getting the most publicity is on the creative side. Right,
People were astounded that Sora could make videos; people were
astounded that ChatGPT can make images, or write, or
(17:04):
do all of these things, and that you know, causes
a lot of fear for people. But I think there's
a few different things going on here, right, Like when
it comes to AI taking jobs, and especially from a
Hollywood perspective, which is the part of the industry that
I know best. We've been seeing the collapse of Hollywood
for the last ten to twenty years, all right. It's
(17:26):
been slow moving, it's been dying for a long time.
Sales have been going down, and the industry has tried
to adapt by making the same superhero movies over and
over and over and over again. And it's been incredibly
difficult for studios to exist, and so everyone's trying to
buy each other out all of this kind of stuff, right,
(17:50):
And so for me, what I see is the bigger
threat to creatives in particular losing jobs is not actually AI,
but it's the executives.
Speaker 1 (17:59):
Right.
Speaker 2 (17:59):
It's We've got these big companies that are trying to
pay a ton of people and they can't afford it
anymore because they are not making the same returns that
they used to. It's not keeping up with inflation all
that stuff. But what I think creatives are not seeing
that they need to realize, Like, now is that with
these tools, if you want to use them or not, Right,
(18:22):
it is now easier to build your own business. It
is now easier to make your own films. It is
now easier to make your own music, and to use
all of these incredible tools to not have to deal
with these assholes anymore. Right, you don't have to work
at Pixar to make an Oscar Award winning animated film.
(18:43):
You do not have to work for Warner Brothers. You
do not have to do that. You can now grab
your five most talented artist friends. Right. You can find
your favorite writers, your favorite animators, your favorite artists, the
people who've always wished you could work with, and you
can make your own project. And it might not be
(19:04):
able to go into the theaters though, you know, if
Taylor Swift can get around those gatekeepers, I'm sure we
can figure out a solution. But you can publish things
on social media. People don't seem to recognize that YouTube
is bigger than literally any other streamer. My god, guys,
the world has changed, and so like, imagine if we
(19:28):
could have, not, you know, two to three really big
animated films every year, but what if we had fifty? Can you imagine?
I think what people don't understand is that AI can
coach them on how to build a better business so
that they can learn the basics and figure out how
to get started before they're able to hire on real
business people who can help them really scale things right.
(19:51):
And they don't realize that just because Hollywood and these old systems,
these old gatekeepers, are collapsing, it doesn't mean
that your career is over. It means that it's just beginning,
all right. This is incredibly exciting stuff. And I have
been frankly devastated to see so much content on places
(20:12):
like TikTok of like animators sobbing in particular and being
like not necessarily sobbing, but you know, basically saying that
they feel like their life is over because they're not
going to be able to live their dreams. And I'm
just sitting here like yelling from the rooftops, guys, guys,
you don't need to work at Pixar. You can build
your own Pixar. This is what's happening right now, and
(20:34):
it's so exciting, and I just really want people to
hear that in particular.
Speaker 1 (20:39):
Well, you know, it's like a while back when I
remember my first start of my business, and it's a
long time with here. I was on Yahoo business. But
they had software where I can build my own website.
And that's what the first time we okay before you
to hire somebody, yeah for you. It was like they
had these two amazed. So I feel like this is
(20:59):
the same ideal. It's like now we just have again
more advanced technology. We just make they can do more
for you. Like before, you know what virtual assistance and stuff.
But when that came out, you know, somebody from far
away it can help you. Well, now you don't necessarily
need somebody from far away to help you. Like now
you can literally tell I what you need. And you know,
(21:19):
we already have Alexa and Siri. You know, I think
they've been around for a while now already.
Speaker 2 (21:24):
They just weren't very good.
Speaker 1 (21:26):
Right: turn the lights off, and they're like, the oven?
Speaker 2 (21:30):
No, turn the lights off... and it turns the lights blue?
Speaker 1 (21:34):
Right, exactly. They never recognize my voice, it never happens.
But the whole point is, we've always had these things.
Like, I remember when CDs
came out. I remember when DVDs came out. I remember all that stuff.
It was always scary at first.
I liked my Walkman, I didn't want to change.
I remember when iPods came out.
(21:55):
It was like two thousand and six, and I
was a grown man by then, but
I used to have a Discman that I would carry,
with CDs I'd play, my Discman and my headphones, and
I'm like, oh, do I want to... like, I love
my thing. And then it's all digital, you just put it
in the little... it's just a little thing and your
music's in there, like my music's in there. I remember
(22:16):
being just, like, amazed. But also I
think our natural tendency is, the robots are bad,
they're gonna turn on us. It's I, Robot, they're gonna
turn on us. But literally, this is something that
actually could be good, right? Yeah.
Speaker 2 (22:32):
And I think, uh, you know, every time that we
have some sort of a technological revolution, there's always fear.
And it's because change is scary. Not knowing what's going
to happen is scary, and not knowing what the future
is going to look like is scary. But at the
same time, when you are operating in a technological revolution
(22:53):
or a financial revolution, that also means there's a lot
of opportunities, right, So whenever there is new technology, if
you're able to be first, you're able to have really
big gains. Like I was thinking earlier about who was
it, like Greta Gerwig and Noah Baumbach? You know, they
were some of the first to be using digital cameras
(23:15):
and all of that stuff and doing these super super
low budget mumblecore dramas that nobody had made before and
that normal studios weren't necessarily funding. But they were able
to do that. And now Greta Gerwig just did Barbie
and is also like doing Narnia coming up like these massive,
(23:36):
big budget things. So I think we need to really
reframe how we are looking at this situation when it
comes to the job thing, and recognize this is not
the end of the world. This is actually opening doors
and opening opportunities for more art and better art
to be created. Because frankly, I don't know about you,
but I'm pretty sick of the gosh-darn superhero movies.
(23:58):
I would love to have more original work, and I
think that this type of technology is going to make
it so that we're going to have a creative and
artistic revolution. It's not the end of the world, right?
Speaker 1 (24:15):
To bring this up to you because you were, you're
you're also in the space, and actually the tech space
is the same thing. Competition. I feel like, see now
it's going to even the playing field, so to speak,
because I mean, and I think folks are scared of
that also too in the creative places, like well, no,
I'm the first, we're the first, all of the parts,
(24:35):
I can do it. But now it's like, well, if
James Junior and Inglewood can do it from his home then,
and I feel like it just means I think I
don't see competition anyway and say everybody's is more than
Merry or like you did more than Marria. But I
also think, well, maybe I'll make you think harder on
how you do your next project. Maybe I'll make you
maybe I'll help you elevate your game to you on
(24:56):
some level. But always, but I always, I always feel
like that's also a part of this too, is that
if everybody.
Speaker 2 (25:06):
Can do it... I know. I think a lot of this,
you know, also has to do with how these creative
industries have been structured, right, And so I was in comedy,
and the expectation was you were going to spend five
to ten years working for free and then maybe if
(25:27):
you were lucky, you would have a big break or
you'd have to give up, go home and do whatever
the other, you know, whatever, your backup plan was right,
and that was like kind of it, and you weren't
really able to build your own business or find other
ways to make money. And now we've got people who
are absolutely killing it on YouTube who were able to
(25:50):
build their own paths. I mean, I need
to keep asking around how many people are watching Dropout
TV, which is, you know, entirely just creative people
bopping around. That's not Hollywood, or maybe some people
consider it Hollywood, I don't really know what the full context was.
I think it was previously CollegeHumor or whatever
it was. But I can't tell you how many of
(26:12):
my friends who are very disconnected from Hollywood or
filmmaking or comedy or all of these things are watching
Dropout TV here in Minnesota. Like, it's becoming the
mainstream in that regard. And I think, I think when
we reframe it to this is an opportunity and like
(26:33):
do your best to like take a step back and
take the emotion away from it. Right, Because the reality is,
just because anybody can have the same idea as you,
just not mean that they're gonna have the same execution
as you. Right, just because Joe Schmall who is like, wow,
I can do a comedy set, I can use chatgbt
to make it, doesn't mean that's gonna be any good.
Because the other thing to know about these tools is
(26:55):
they are really really good at doing the same things
that have been made. They're really really bad at creating
completely new and novel ideas. Right? So Joe Schmo might
use ChatGPT to come up with an entire comedy set,
but that doesn't mean it's gonna be any good, you know.
Maybe he's gonna tweak it and make it so that
(27:16):
it is really good, but it doesn't mean it's going
to be really good. Whereas somebody who spent the last
ten years learning how to do comedy, learning how to write,
learning how to structure jokes and pacing and all of
these things, you might not want to use ChatGPT.
I frankly don't use ChatGPT for a lot of
my writing, because I prefer to do it myself. I
usually ask ChatGPT to check my work afterwards and tell
(27:37):
me where to shorten things, because I always write too
gosh-darn much. As you may have noticed by the
amount that I'm talking right now, right, it has a
habit of taking out my jokes, which is very rude.
I have been working on this. Eventually, hopefully it will
stop deleting my jokes. It just keeps all the
exposition, and I'm like, guys, you're missing the point.
You're missing the point. This is rude, right? But if
(27:58):
you are able to use it to learn, oh, how do
I create a successful business on YouTube? How do
I go and start touring? How do
I reach the right people, so I'm able to do
comedy in this regard? Right, that's where the opportunities are.
And just because Joe Schmo has some good ideas doesn't
mean that he's going to execute on them as well
(28:20):
as you.
Speaker 1 (28:20):
Right. I love that, that's a great answer, and
that's a very good point, folks out there. And I
just think, you know, when people started doing
self-publishing, not every book is bad. There are some great self-published,
some great self-published books out there, better than some of
the ones that get published. I've seen published
books that had errors, and I'm like, this is an editing mistake.
(28:43):
So I mean, I just think... I like
the even playing field. I like that. I feel
like when the pandemic hit and we were all on Zoom,
everybody was equal. It didn't matter if it was
Jimmy Kimmel or me, we were all doing the same thing.
We had the same glitches, we made the same, you know...
it was all the same. And I feel like I'm tired of this
kind of... and this happens in tech
(29:03):
too, but it happens in the creator space, where
it's always this hierarchy. Everybody's trying to position themselves, and
you know this too, and it's like, I'm here and
you're here, like that kind of thing. It's like, no,
why can't we all just be around here? And you know,
some folks are better than others, some people are
more natural. And with AI, I'm learning about it,
(29:24):
going, okay, this is actually just a tool, and
it's just helping me. But it still
comes down to me. I mean, I'm the human element still.
It's still... you don't know if it's you
or not when it comes to the creative stuff. It's like, this is me, and
maybe I can create a pitch technically quicker,
you know, than before, when it was just me. I used
(29:46):
to open my clips and clip them myself, like I could sit and clip.
I like the fact that now I tell it what I want clipped and
it clips it for me. It does save me time, and
it's still us talking. It's still my clips, no?
Speaker 2 (30:00):
And something I'm noticing among my artist friends too, is
that a lot of them seem to think that the
only things that AI people are working on is like
doing creative work, which couldn't be further from the truth.
Speaker 1 (30:13):
You know.
Speaker 2 (30:14):
That's kind of what ended up happening
with generative AI, but it wasn't really the full intention. Like,
and a lot of the ways that people are using
AI most successfully is to help them with administrative tasks,
right booking calendar meetings for you so that you don't
have to go and text your friend over and over
and over and over and over and over and over
and over and over and over and over again, to
(30:34):
try to actually get like this time that works on
the calendar, right, or making it so that it's easier
to I don't know, write pitch emails that are like
really technical and hard to write to different producers with
your favorite screenplay, right, Or to just streamline your taxes right.
(30:57):
I watched a podcast the other day where some authors
were saying, why can't AI do my taxes? And
it's like, that's what TurboTax is, you know,
that's what a lot of these different systems are, and
you can use them, and you can use all these
other different tools. Like I think a lot of people are,
(31:17):
especially in the creative space, are like, oh, they're trying
to get rid of everything that I do. And it's like, well,
it might be able to do some of the things
that you can do, but really the way that
you should be using AI is to help fill in
the gaps for the things that you don't know how
to do very well. Right? So just think of anything
that you don't know how to do particularly well, and
then ask ChatGPT, or ask Perplexity, or ask Claude, whatever
(31:37):
system you want to use. Right, there's a lot of them.
So if you don't like Sam Altman, you can avoid
Sam Altman. That is a possibility. Right? You're going
to be able to, uh, like, learn how to do
these things at least at a decent level, so that
you're able to move forward and do things that you've
never been able to do before. Right.
Speaker 1 (31:57):
Recently, for questions for a contract, I was entering a
space that I didn't know very much about. I just said,
can you give me seven questions that I should ask,
that should be included on a contract, blah blah blah,
this type of contract. And it was great,
and I was like, it just saved me trying
to call somebody, to find somebody, you know, I mean
(32:17):
all that stuff that you'd have to do, you tell me. I was like, okay,
and it saved me money. Let me tell you, trying to get
most lawyers, you can't just talk to them. So I was
just like... but it gave me, it said
seven things that I really needed to know that I
just didn't know about this type of contract. And
so now when I went to the meeting, I was prepared. Yeah,
I was like, oh, I said, and what about, you
(32:39):
know, and what about blah blah blah.
Speaker 2 (32:41):
You're like, oh, okay. Yeah. Like, you can put contracts
in and be like, tell me, what are the
most important things, or is there anything weird that's
standing out to you about this? I will
even use... I don't do it that often, but I
have used AI to help me put together extremely simple
contracts, because that's not something I'm particularly good at, but
(33:02):
I just need something that says, hi,
you will pay me this much, in this many days. Like,
at the end of the day, some of this legal
paperwork... yes, you want a lawyer sometimes; other times
it's just like, I just want an agreement on paper
that we, like, did this thing, right? And so being
able to go into ChatGPT and be like, hey,
I need a template for a contract that includes
(33:24):
this type of stuff, this type of stuff, this type
of stuff, and then it just makes it. And then
what you can do, if you're feeling extra fancy, is
you can copy and paste everything that's in that contract,
and then you either use the same AI tool or
a different one to check your work. So you can say, hey,
are there any ways that you would improve this contract?
And then it will tell you how to improve the contract,
(33:45):
even if it already made it. Because it's not very clever.
It's not a person, it doesn't,
like, have self-awareness. It just
wants to answer your question. The other thing to keep
in mind is, these systems only want to please you, right?
So they want to give you the answer that
they think you are going to want to hear. So
you do need to sometimes instruct it to be
(34:07):
self-critical, and exactly in which way. Otherwise it's just gonna,
you know, be like a really happy puppy dog, or
like an intern who's like, I did everything you said,
but I don't understand anything that actually exists in the world.
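As a rough illustration of that generate-then-check pattern for anyone scripting it: the sketch below asks for a draft, then feeds the draft back with an explicit instruction to be self-critical about a named concern, so the model does not just approve its own work. It assumes OpenAI's Python client; the model name, prompts, and contract terms are placeholder examples, and none of this is legal advice.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set


def ask(prompt: str) -> str:
    """Send one plain-language prompt and return the model's reply text."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content


# Step 1: generate a very simple agreement template (placeholder terms).
draft = ask(
    "I need a template for a simple freelance agreement that includes: "
    "a short scope of work, a flat fee of $500, and payment due within 30 days."
)

# Step 2: check the work. Spell out that you want self-critical, specific feedback,
# and name the angle you care about, so it doesn't just tell you it looks great.
review = ask(
    "Be self-critical and specific. Here is a draft contract:\n\n"
    + draft
    + "\n\nWhat is missing, unclear, or risky for the freelancer? List concrete fixes."
)

print(review)
```

The second prompt is doing the work Caroline describes: the model will cheerfully approve its own draft unless you tell it exactly how to push back.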
Speaker 1 (34:20):
Right. I love that. Yeah, I'm learning
to be very specific. And I think that's another thing, if
you guys didn't catch this the whole show: being very specific, in your
own language, will help you more and you'll get
more out of it. Because another person told me that
(34:40):
AI will always try to... it'll keep going forever.
It will keep trying to improve if you keep asking it
to improve. It won't stop. So at some point, the human,
you have to stop and go, this is good enough, or
this is right.
Speaker 2 (34:55):
Yeah. I will ask ChatGPT to, like, review my newsletters
these days, and it will review the newsletter and
give me like ten things to fix. I'm like, great,
and then it'll give me five things to fix, and
then it'll be like, great, and then it'll give me
five more things to fix. I'll be like, great, and
it'll give me five more things to fix, and I'm like, ugh,
chill out, we can be done now.
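If you ever automate that kind of review pass, the lesson here becomes a hard cap you choose yourself, because the model will keep producing new suggestions for as long as you keep asking. A minimal sketch, again assuming OpenAI's Python client; the model name, the draft file, and the two-round limit are arbitrary choices for illustration.

```python
# pip install openai
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

MAX_ROUNDS = 2  # the human decides up front when "good enough" is reached

# Hypothetical draft; in practice you would read your own newsletter text here.
draft = open("newsletter_draft.txt").read()

for round_number in range(1, MAX_ROUNDS + 1):
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{
            "role": "user",
            "content": (
                "Review this newsletter draft. Suggest at most five concrete ways "
                "to shorten it, and do not remove any of the jokes:\n\n" + draft
            ),
        }],
    )
    print(f"Round {round_number} suggestions:\n{reply.choices[0].message.content}\n")

# After MAX_ROUNDS the script simply stops asking; otherwise the model
# would happily keep inventing "five more things to fix" forever.
```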
Speaker 1 (35:15):
I love it. But let's just look at the
flip side: do you see any downsides to AI?
Speaker 2 (35:24):
Yeah, okay, I do want to address the two elephants
in the room that I mentioned earlier, which are the
copyright issues and the environmental impacts. For one of them I
have a relatively good answer; for the other one, you know,
not so good. So when it comes to copyright, as
I said before, yes, these guys did absolutely steal from
(35:47):
us, and they're continuing to do so. If you're using
Facebook or Meta, okay, Zuckerberg, whatever, they are using
all of the content on their site to improve
their system called Llama. Right? Google, it's not confirmed, but it
appears that Google is going into your Google Drive. So
(36:10):
if there's any...
Speaker 1 (36:13):
I believe it, yep.
Speaker 2 (36:15):
So if there's anything too sensitive in there, maybe
take it out, because that's not great. You know, both OpenAI,
so ChatGPT, and Google scoured all of YouTube.
Reddit has gotten a lot of money because, you know,
they took a lot of content from Reddit. But
so the question is, what do we do about this?
(36:37):
All right? The good news is there are several lawsuits
that are in play right now that are basically saying, hey,
AI companies, you stole data. So the one that comes
to mind to me most right now is the New
York Times. The New York Times is like, yeah, you guys,
you guys stole... this is ridiculous, right? So what's going
(36:58):
to happen is the courts are going to decide, all right,
and there will eventually be some sort of a process,
hopefully, where people are going to be compensated. And that's
where we're at, right. So as an individual, if that's
too much for you, I totally get it, I totally respect it:
you don't have to use AI tools. Me coming
on here is not telling you that you have to
(37:19):
use AI tools. You can use them if you want to,
but at the same time, there's not much that we
can do about that. But the pros of using these systems,
which are doing things like helping diagnose patients with ninety
percent accuracy compared to doctors who have seventy four percent accuracy,
and creating personalized medicine so that you know, women are
(37:41):
able to get better care rather than just assuming that
every single person is, you know, a straight white man. Okay.
What we're seeing in a lot of these different
pros that are coming out of the AI space is
just absolutely incredible stuff. All right? We've accelerated a lot
of the research that has been needed when it comes
(38:02):
to medications, pharmaceuticals, or disease studies; in some cases,
decades' to a hundred years' worth of research has now
been conducted in a few short months. The
pros out of this are really, really incredible, and, yes,
it sucks that data theft was involved. The
(38:26):
question is, are we going to hold it against them
forever, or are we gonna let the courts deal with it, right?
Is it really in your best interest to avoid this forever?
It might be, it might not be, right? That's a decision
that's up to you. Now, the other con, the other
elephant in the room is climate and how much energy
(38:47):
these systems are using. So, yes, that is bad news.
It is bad, and I don't
have a lot of easy answers for that. But I
do have a few reassurances. First, an individual prompting a chatbot
doesn't do that much when it comes to energy usage. Right?
(39:07):
You as an individual, the amount of impact that
you are having is not that big of a deal.
I don't have the numbers for that, but I know
that's the case. Training the models, so updating these new
systems to make them smarter or better, does have a
big effect, all right? So if you're worried about
every single prompt, it's probably going to be okay, all right?
(39:29):
But usage for images is going to be more. Usage
for videos is going to be more. So that's something
to keep in mind. However, and here's a major caveat:
we are now building these systems far better than we
used to, and they are becoming more energy efficient.
What do I mean by this? You may have seen
some headlines about a Chinese LLM called DeepSeek. DeepSeek
(39:51):
was able to build a very similar model to
ChatGPT for a fraction of the amount of compute,
which means a fraction of the amount of energy used.
So now all of these systems are trying to figure
out, how can we copy DeepSeek and build our
systems so that they are more sustainable and cheaper and
(40:13):
more effective? Okay. So although the numbers are going way
up right now, they will start to
come down. Yes, as more people use it, the numbers
are still going to go up. But this is something
that goes up and down. It's not something that's
growing exponentially yet. Okay. The other thing to keep in
mind is that data centers are some of
(40:36):
the biggest emitters, and a lot of it has to
do with needing to cool down all of these computer systems. Right?
In one of my projects, I am currently
doing research on how these companies are trying to cool
these centers. Many places are switching from air cooling, so
(40:58):
HVAC, like air-conditioned cooling of these machines, to liquid cooling,
so using water, usually in closed loops, or maybe near
an ocean, or in Minnesota maybe near Lake Superior, right,
and using more natural ways to pull out
more heat from these computers while using less energy. So
(41:22):
there is a lot of
good news in that space. It's just gonna
take a little bit more time to get there. So, yes,
it is a problem right now, it is getting worse,
but there are a lot of scientists making it better. In addition,
a lot of the tech community is very, you know...
(41:42):
they want to save the climate. That's something that some
of these oligarchs are very into. And a
more controversial thing to keep in mind, which
I don't think people are talking about enough, and it
is controversial, is that a lot of these companies are finding
ways to build more energy centers. They're trying to use
(42:03):
more clean energy. One of the clean energies that's being
pushed the most these days is nuclear. I know
a lot of people aren't comfortable with nuclear, but if
you actually look at the impacts that nuclear has, especially
the more modern-day systems, compared to a lot of
these other, you know, energy developers, it is so much
better for the environment and it's so much safer than
(42:25):
people realize. So, Cleo Abram is a creator I really
like to follow. She had a phenomenal video on this topic;
I highly advise that you watch it. It really shifted
my perspective. Like, yes, am I a bit nervous about
nuclear? Of course. But I was very much
more reassured after watching it, and I think it's definitely
(42:46):
worth checking out, because a lot of things that I
thought I knew about nuclear were not true. So just
something to keep in mind.
Speaker 1 (42:53):
That's big, you addressing those two things,
because those are two things we always talk about; people
talk about that. And you know, the carbon footprint is
something that half the people laugh at and the other half
take seriously. It's serious, and I agree it's serious.
I just think, when you start something... well, we had these giant TVs,
(43:16):
huge, you know, console TVs. Yeah, microwaves were
pretty huge. That's how things start. As they move along, hopefully
things get better, right? They get more efficient and
more friendly to the environment and
user-friendly. So I feel like that's how
it starts out, and then they kind of...
Speaker 2 (43:34):
And people forget that, in order to get the technology
that we have today, things were really
messy along the way, right? Like, it used to
be that to use electricity in your home, everyone would
have a single light bulb that was basically where
(43:55):
chandeliers go. And then if you wanted to use any
other machine in your home, you would get a
really big extension cord and plug it into the light
bulb socket so that you could run a washing machine,
and that was the only source of electricity in
your home. And, you know,
that was aggressively not safe, right? The house would
(44:16):
go on fire because people were trying to plug everything
into their chandelier that was way up on the ceiling
where they couldn't reach. It was a whole thing, right?
So things are often
messy at the beginning, but tend to get
better over time. And it is something that the industry
is taking very seriously and talks about often. But it
is something that I think creative people in particular don't
(44:40):
recognize: that people in tech are
absolutely focused on this, focused on finding solutions,
and that there are proven solutions that will make
things better. It just takes time to implement them all.
Speaker 1 (44:52):
Yeah, so folks, I mean, guys, this hour went
by so fast, you know. But the thing is,
if anything major happens in this space, you've got to
come back on.
Speaker 2 (45:06):
Shoot, put me on for like a monthly segment, a quarterly segment, like...
Speaker 1 (45:12):
Good. My girl, I'm down for that, because there's always,
there's always something going on. It's always... it's now.
It's not the future, it's now.
Speaker 2 (45:21):
And I think, like, the biggest thing is, again, the
job stuff, right? The combination of AI and social
media is making it so that... I suspect, because I know
you have a lot of people in Hollywood and New
York and whatever who are in the Hollywood space, right,
in the industry... there is so much
more opportunity than what you're hearing. And if your social
(45:46):
media feeds right now only talk about AI in a
really negative context, I really urge you to look
for more types of content. So if you want, you
can subscribe to my stuff. It's called Swift Start Go.
Speaker 1 (46:01):
Swift... I love that name. Go ahead, tell me
where I can find your Substack and all that.
Speaker 2 (46:08):
Yeah, so you can look up my Substack. I think...
actually, I haven't checked to make sure that I'm, like,
Google-friendly, but apparently I'm AI-friendly, as long as
you make sure to search the internet. But if you
go on Substack and search either my full name,
Carolyn Swift Holden, or Swift Start Go, I'm pretty positive
you can find me. I also have a website, which
is just carolynswiftholden dot com. Pretty simple. I also
post a lot on LinkedIn, so if that's your thing,
(46:30):
you can find me if you look up Carolyn Holden...
Carolyn Swift Holden. I will say, if you want to
Google me and you're trying to find me, you do
have to use my middle name. I'm sorry, I know
it's annoying, but if you just do Carolyn Holden, you're
just gonna get a lot of photos of Yasmine from Baywatch.
So if you want me, and I'm not a lovely lady in
(46:50):
a swimsuit, you need to look up Carolyn Swift Holden. Okay?
You can't just do Carolyn Holden or you're screwed; you're
just... you're not gonna get me. Or maybe it's a
gift to get Yasmine Bleeth instead of me. Perhaps it's not better.
Speaker 1 (47:03):
She was beautiful back in the day. I don't... I've
seen her recently. She was just beautiful back in the day.
Speaker 2 (47:07):
She was beautiful back in the day. Man, I wish
I had that body. That'd be so great.
Speaker 1 (47:11):
I wish I had that body too. That's another story.
Speaker 2 (47:14):
You know what I would do if my job was
just slow-motion running on the beach? What I would do...
Speaker 1 (47:20):
I actually know Pamela Anderson. Like, actually, that's my...
I've met her several times over the years.
Speaker 2 (47:25):
Tell her that Carolyn Holden says hi.
Speaker 1 (47:27):
I know, I love her. She's actually one of
the nicest people. I love her. Thank you for being
on the show. Of course, like I told you, she'll be back.
And this is A Lot of Help; that's what
we're trying to do, trying to help. We're trying
to share ideas and put them in a
just-not-so-scary form, like, don't be scared.
(47:48):
We're gonna hold hands, we're gonna
go through this together. It's going to be okay.
Speaker 2 (47:54):
It's just software. It's just software. It's gonna change the world,
but it's just software.
Speaker 1 (48:02):
Like she just said. We'll end it here. A Lot
of Help is on Facebook, of course, and everywhere else.
We'll see you next time.