Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:01):
Welcome everybody to another Tactics Tuesday. Today we're going to talk about AI for Amazon, and specifically the hype versus the reality. I think we bring some very unique perspectives to this, both from the brand perspective and also from the agency perspective: how much is AI really moving the needle on Amazon, what does that look like, and what things
(00:24):
are still very much hype when it comes to AI for Amazon sellers? And so, with that, I'll actually turn it over to you, Matt, because I know you spend quite a bit of time immersed in this world. As far as connecting AI and Amazon, what do you see in the field right now?
Speaker 2 (00:43):
In terms of AI for Amazon, I think we're talking about images specifically. Now, ChatGPT just updated their image generation tool. In my opinion (and I didn't play around with Midjourney anywhere near as much as I know Michael did), the ChatGPT image generation
(01:07):
tool, when they upgraded it about a month ago, is night and day better than it was before, and I did a lot of extensive playing around with it. Now, for some reason, AI doesn't understand certain products, like pool cleaning tools, for example. It just does not understand what a pool brush is used for or how it's used, no matter how many times you ask it to make changes.
(01:28):
And that's still the case even with the upgraded version of ChatGPT's image generation tool. However, there are other types of products. A girl in my mastermind sells flower seeds, for example, and the images that she's been able
(01:48):
to make with the ChatGPT image generation tool, but also Midjourney, are fantastic. I mean, better than what she was getting from her graphic designer before. So there's still a lot of really, really good use for these image generation tools for certain products, but none that we work with directly. Unfortunately, it just doesn't understand what our products are being used for, what the use case is, and none of the
(02:10):
products really fit well within the "just put this product in a different scene" use case, which is what I think the image generation tools are really, really good at. But yeah, for the products that I work with directly, it's just not good enough yet at understanding what the use case of the tools is, so I haven't been able to really use it in that regard, at least for the brands that I'm involved in.
Speaker 1 (02:30):
Well, to kind of summarize that, at least from what I'm seeing: first of all, the tools have gotten a lot better. So if you haven't used them in the last six to 12 months, they've gotten dramatically better, specifically when we talk about AI image generation. And what I would say out of that, kind of like what you were talking about, Matt, is that if you look at doing things
(02:52):
like changing out backgrounds and things in the backdrop, if you have a photo of your product and you want to change out that background, it does work well for that. However, with that said, if you're looking at people or you're looking at complicated products, those types of things,
(03:12):
there are still some significant challenges there. That said, I do think there's a great opportunity, even if you already kind of have your set of images. We were just working with a client the other day where we were looking at a Father's Day promotion, and that's a perfect example where I think there's an opportunity for ChatGPT, where
(03:35):
we've got the main photos and some resources already built, and then we can use something like ChatGPT or another AI image generator to create those kind of promo-specific images, and that can make a lot of sense. But I do think that we're still too early for the days of
(03:56):
(and I know I've already gotten this question from a few clients), hey, do I need a photo shoot anymore? Can't I just whip it up in AI? And it's like, well, we're not quite there for most products yet. You're going to have to do some sort of photo or video shoot.
Now, the nice thing is, I think you can extend those resources, once you have them from a decent photo shoot,
(04:16):
to do a lot more with them than what you could do in the past. So think about it more as getting a lot more bang for your buck, as opposed to completely replacing any sort of photo or video shoot. So that's what I'm seeing, especially on the image side. Mike, I'd love to hear what you're seeing from the brand owner perspective.
Speaker 3 (04:37):
Yeah, I think a few things. One thing I would say is, I do think we are moving into a space where, like you just said, you're not going to be able to skip the photo shoot or the video shoot, right? But I think that for some products, that might not be true. I think there are products where you probably could skip the
(05:01):
photo shoot. You might not be able to as easily skip the video shoot, but, by the same token, you could use UGC for that. You could potentially use influencers for video content, as opposed to necessarily a video shoot. Now, it depends on what kind of marketing you're trying to do. It depends on where the videos are going.
(05:22):
Maybe you still need the video shoot. But when I say that you might be able to skip the photo shoot, it's because we've gotten some really good renders done of our products. Now, again, as Matt said, our products are not really AI friendly. I can't just give it a pool net and say, this is our pool net, put it into this scene.
(05:43):
It doesn't do that; it doesn't know what to do with it. But, that being said, there are a lot of products that AI can do that with, especially things like supplements, a supplement container or a drink container, or kitchen gadgets or things like that, where it kind
(06:04):
of understands the scene and it kind of understands how people use it. It can place it in the kitchen with the mom and the kid; it's getting really good at that kind of thing. And I think, especially for lifestyle images that are more about creating an emotion than about showing a specific feature of a product, you definitely can do a
(06:25):
lot of that with AI right now for some products. And if you do some renderings, which are way faster and way cheaper than a photo shoot, a lot of those renderings can then be turned into AI imagery. So I think you might be able to skip the photo shoot for some things.
We definitely can't. But as you said, what AI is doing for
(06:51):
us is providing a much easier way for us to repurpose the digital content that we already have. So if you're a brand that already has some relatively decent imagery but you're looking to update it, I don't think you necessarily need a new photo or video shoot. You may be able to use AI to revamp those images, to give them better lighting, to change the
(07:14):
positioning or something, or to change out the person. Like Gemini: their newest version for image generation and editing is actually quite good. You can feed Gemini an image. Now, I think you might have to be on a specific paid plan to
(07:36):
utilize this feature. I'm not positive; I'm on it, so it works for me, but it may not work for some people. But you can feed it an image and change out a very specific component of that image, and it doesn't change anything else. It just changes that, and it actually does quite a good job at it.
And it's not just background replacement, although you can do that. I've also taken images and put them in
(07:59):
there of a particular person using our product, and I basically said, hey, make them older, or make it a woman, or whatever. It actually does quite a good job at that. I think for some products you might even be able to change the positioning, or how they're holding the item, or how they're using the item, or change the background and the person.
(08:20):
So I do think we are moving into a space right now where, even today, you can definitely use AI to repurpose a lot of imagery that you already have and make some of those edits without having to be a Photoshop expert to do it and make it look right. Because one of the nice things about AI, and I think this is probably the biggest thing that makes AI
(08:42):
useful for image editing, is that a lot of times, if you try to change out a person or change the positioning or add an item to an image, the biggest thing that's difficult if you're not an expert in Photoshop is getting the lighting and the shadow right. It's really difficult getting the
(09:03):
perspective right on that object and having it fit into the image properly, so that the eye doesn't catch it and go, yeah, something's not right there. AI has actually become really good at doing that, where it can insert something into an image and give it the proper lighting, shadowing and perspective so
(09:25):
that you can't really tell it's been inserted. It looks like it was there all the time, and I think that is really useful.
The other thing that I want to add is just a practical component, and that is: if you look at Amazon, there are a lot of products that are selling massive volume, have really good reviews and are doing exceptionally well on
(09:50):
the platform, and if you look at their images, it is obvious that they are renders or AI generated. You can tell, but people are still buying those products, still giving them good reviews, and they're still getting good sales volume. So don't get super wrapped up in making sure that your image
(10:11):
is perfect. I do think it would be nice if you can create an image where somebody can't tell; I think that's better. I don't, however, think that it's absolutely necessary, especially in certain product categories. If most of your competitors are using images where it's obvious it's AI,
and you could just do a better version that maybe is still
(10:33):
somewhat noticeable but not as obvious, that might be enough in your category.
Speaker 1 (10:39):
Yeah, well, and I think, if we set aside the words AI, we could have said the same thing before AI came along. I think we've all seen, over the years, products that do really well even though their images are ugly, or what we would consider ugly. And we've had that happen, where we've
(11:00):
had tested what is a prettier image versus an uglier one, and sometimes the not-so-perfect image wins when we start looking at clicks and conversion rates and really driving sales and the things that matter.
So I think that's such a great point, Mike. The other thing that I would add to what you said
(11:21):
that I think is really important is how you can adjust those images. Because what I feel like maybe doesn't get enough attention in this conversation: we talk about, hey, do I need a photo shoot, do I not need a photo shoot, those types of things. But I think the big win in the next year is really going to come
(11:42):
from being able to rapidly A/B test main images, secondary image stacks, video, all these different things, to really dial in the click-through rate and conversion rate on your listings with a heck of a lot fewer resources than you needed before. Because, like you said, if you can change out that image, where you're like, hey, I have the same image and I'm able to make this person look older,
(12:03):
or make it a woman instead of a man, all those kinds of changes.
Now you can really A/B test things a lot faster and with a lot less resource, because you don't have to do a whole new photo shoot or have a graphic designer spend a couple of hours in Photoshop cutting out that person and adding
(12:25):
another person in, and all those types of things. You can put it into Gemini or ChatGPT or something like that, have something back in 30 seconds, and then be able to go test it. So I think there's a lot of value that comes from that testing aspect as well.
Speaker 2 (12:43):
Yeah, my Photoshop skills are miserable. My stick figures are unrecognizable. But what the new version of ChatGPT's image generation tool has allowed me to do is iterate, like you said, a whole lot faster, John. For example, we have a client that sells hangers, and there are different types of hangers. One of them has a bar with clips on the bottom of it.
(13:04):
I needed another image, a duplicate image of the hanger that didn't have those clips.
So, instead of going back and forth with my graphic designer and waiting a couple of hours for the revision, I just uploaded the picture inside of ChatGPT and asked it. Now, this was just a test; I didn't know if it was going to work. As a matter of fact, I didn't think it was going to work. I just asked it to remove the roll bar with the clips, and it
(13:26):
did a 90% job, a way better job than I would have been able to do. It cut out the corners of it, and I think that's something I could have resolved if I had been better at prompting for images, but 10 seconds later I had an image that was good enough for me to at least split test where I was using it. And that's probably saved me not four hours of work, but four
(13:48):
hours of communication back and forth with a graphic designer, and I was able to get something good enough for testing in a span of about 15 or 20 seconds.
And this is coming from someone who doesn't know the first thing about Photoshop and actually gets very frustrated because I can't get things to look how I want them to. So for me, it's made me a thousand times faster in just iterating and getting something that I can split test to see if
(14:11):
it converts better than what we currently have. So way, way faster than how I used to do it.
Speaker 3 (14:17):
Well, it's also important, I think, to recognize that, because of that, if you have anybody on your team who is even marginally good with Canva or GIMP or Photoshop or something like that, just basic skills, then that person becomes an individual who can do a lot more
(14:38):
for you in that space, now that they have AI to do most of the heavy lifting and they can just do a little bit of editing after the fact to repair whatever little artifacts there might be. That's way easier than them having to do all the work themselves.
And so, again, like you said, you don't need that
(14:58):
super high-priced editor; you just need somebody who has some basic skills. Or you learn some of those basic skills yourself. If you're a one-person operation, take a few days to learn how to use Canva well. Canva is pretty powerful. It's not Adobe Photoshop, but it's pretty powerful and it's got some really sweet tools in there that I'm not even sure Adobe has.
(15:19):
And so if you just took a few days and got to know how to use Canva, that, combined with some of the other AI tools out there, makes you a super ninja in terms of being able to create graphic content and make adjustments here and there.
Speaker 1 (15:36):
Yeah, and just to kind of wrap up the AI image conversation, because this brought something to my mind based on what you were talking about, Matt: we had a client that, for one of their families of products, accidentally deleted the listing, and then they also deleted the
(15:57):
backup files that they had for those images, and so we basically had to recreate those images by looking at one of their other product families that was similar. And the only reason we were really able to do that was because of AI. And so, from that use case, I think there's also an opportunity to look at your competitors. And,
(16:19):
circling back to what you said, Mike, I just want to double-click on this: look at what your competitors are doing, and if there's something where you're like, wow, I wish we had an image like that, or we could do something like that but one step better, you can do that so quickly with AI. Take that image and say, okay, now add our product into it, add this one additional feature or
(16:40):
benefit to it, and now you have an image that's even better than what your competitors have, with less than two minutes' worth of work. So I think there's a huge opportunity there for adjusting images using a lot of the AI tools that are available now.
Speaker 3 (16:59):
One of the things.
Well, go ahead, matt.
Speaker 2 (17:01):
Looked like you were going to say something. I was just going to say: for me, we do still have a graphic designer, and there are still things that we use that graphic designer for. And John, you mentioned the Father's Day image; as a matter of fact, for that client today, I knew in my mind's eye what I wanted the badge on the packaging to look like, and so I just went into ChatGPT, knowing that it wasn't going to be the final product, just that I was going to give
(17:21):
to be the final product, justthat I was going to give
something to kind of spark theimagination of our graphic
designer.
It took me three seconds Now, itdidn't?
It kind of it changed up a lotof the elements on the box to
make it.
It wasn't usable, but itallowed me to give, instead of
having to type out on a Slackmessage, what I wanted to see.
It allowed me to really, reallyquickly use chat GPT to give
(17:43):
him an idea of what it was that I was looking for. So again, even though I was using the graphic designer to give me the finished product, it allowed me to show him what I was seeing in my mind's eye a whole lot faster than me explaining it to him.
Speaker 3 (17:57):
So the other thing that I wanted to say is: we're talking a lot about using AI with Amazon images, but we're talking a lot about generation of images and editing of images, which is important. I mean, there's a lot of that that goes on and I think it's valuable. And just real briefly, before I move on to what I actually want to say, I do want to double-click on that idea, John, that
(18:20):
you were talking about in terms of sellers. It might not be so much because they don't know; it might just be because in the past it's been so difficult to do and so expensive to do, because you'd have to pay a graphic designer and go through all this hassle, that it just hasn't happened. And that is the difference in click-through on an image that
(18:44):
has a guy versus a lady, or a younger guy versus an older guy, or an image that has a dog in it versus an image that doesn't have a dog in it. There are so many different small changes that you could make to an image that, for the avatar you're after, could instantly improve your click-through rate by 50% or double it,
(19:05):
and they're very simple changes that could be made with AI in a couple of minutes. That's the sort of area where I think there's a lot of power in this, in terms of those iterations. And this ties into the next thing that I want to say, which is: the other value piece with AI at this point is that it now can
(19:28):
understand images. So if you give it an image, and you tell the AI what the purpose of the image is and a little bit of background on who your customer avatar is, it can analyze that image and give you some ideas about how you might improve on
(19:49):
it. And then you can take those ideas, figure out which of those are the easiest ones to implement with AI, make those changes, and then do the split tests based on that information the AI is giving you. We're also moving into a space where, if you know what you're doing, you can create a custom GPT, or maybe some sort of an N8N workflow that does this. We've
(20:12):
got tools like PickFu and Intellivy and things like that out there where you split test images beforehand. You can now start that process with AI. There are ways to do this where you can split test two images with AI and basically tell it who's the avatar you're aiming at and what you're trying to accomplish with
(20:33):
the image, what's the benefit that you're trying to solidify, what's the emotion you're trying to generate, whatever. Give it the two images and have it do its own poll, and you would be amazed, I think people listening would be amazed. There are some models out there that people are building where, almost all the time, the AI is picking the true
(20:55):
winner. Like, if you run the split test on Amazon, the results from that split test match what the AI is telling you is the better image.
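For listeners who want to tinker with that idea themselves, here is a minimal sketch of a do-it-yourself AI image poll using the OpenAI Python SDK. The model name, the prompt wording, and the file names are placeholders, and this single-judge version is only a toy compared with the purpose-built polling models Mike is describing.

```python
# Minimal sketch of an AI "image poll": show a vision model two candidate main
# images plus the customer avatar, and ask which one it expects to win on CTR.
# Model name, prompt wording, and file paths below are illustrative placeholders.
import base64

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def as_data_url(path: str) -> str:
    """Encode a local image file as a data URL the vision endpoint accepts."""
    with open(path, "rb") as f:
        return "data:image/jpeg;base64," + base64.b64encode(f.read()).decode()


def pick_winner(image_a: str, image_b: str, avatar: str, goal: str) -> str:
    """Ask the model which of two images better serves the stated goal and avatar."""
    prompt = (
        f"You are simulating a shopper: {avatar}. "
        f"The goal of this Amazon main image is: {goal}. "
        "Compare image A (shown first) and image B (shown second). "
        "Answer with 'A' or 'B' plus one sentence explaining why."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": as_data_url(image_a)}},
                {"type": "image_url", "image_url": {"url": as_data_url(image_b)}},
            ],
        }],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    # Hypothetical file names; swap in your own candidate images.
    print(pick_winner(
        "main_image_older_man.jpg",
        "main_image_younger_woman.jpg",
        avatar="a 45-year-old suburban pool owner shopping on a phone",
        goal="maximize click-through from the search results page",
    ))
```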
Speaker 1 (21:06):
Yeah, I think we're just kind of touching the surface on that. We've talked a lot about image generation, which I think is great, because I think that's where we've seen, well, one, a lot of value for brands that are looking at increasing clicks and conversions and, ultimately, sales. The other thing I want to touch on as well, especially, Mike, as you brought up some of the custom GPTs and
(21:29):
that type of thing that people have put out there: I say this with a word of caution, which is that those GPTs are only as good as the people who built them, and they are static, meaning they're only as good as the time at which they were built. If we're talking about custom GPTs... I know, Mike, you're working on... say the name of it again?
Speaker 3 (21:50):
N8N. So, N as in Nancy, the number eight, N as in Nancy: N8N.
Speaker 1 (21:55):
Yeah, N8N, which is much closer to generative AI, where it learns as you go, and to having access to what are essentially AI agents; I think that's the terminology that's being used for those right now.
But even take a custom GPT: I think
(22:15):
something that we have found really helpful is looking at even ads and strategies, like, hey, we're looking at how to boost sales for this brand, and asking the AI a bunch of questions. And, frankly, for someone that has been doing this for five years, with probably 80% of the ideas it gives you, you're either like, oh, we're already doing that, or, that's not a good idea for this brand
(22:36):
because of X. That said, there's another 10 to 20% that just jog something in your memory. You're like, oh yeah, we can run that type of campaign, or this type of coupon, or whatever happens to be that tool in the toolbox. It helps just kind of prime the pump and help you
(22:56):
think through ideas on how to boost sales, how to increase conversions, different ad types if you're running your own PPC. I think that's another area where there's a lot of value in having those conversations with ChatGPT or some of these custom GPTs, to help brand owners get some ideas and answers pretty quickly on the
(23:20):
problems that they're facing.
Speaker 2 (23:24):
Don't even get me started on custom GPTs. I have become addicted to building them now; I've probably built two dozen of them in the past week and a half, two weeks, and one of them is actually about Amazon and optimization. First of all, I learned how to optimize a listing for Rufus.
(23:46):
I went through a couple of deep research prompts and learned that, but then I also used that deep research to create a custom GPT to help me rewrite listing copy for Rufus. It probably took me about an hour or so to learn what I wanted to learn and use that learning to create the custom GPT. But now I've already done it for three or four
(24:08):
client listings: just input the existing title, give it a little bit of information about the product and the avatar, and it rewrites titles for me. So, custom GPTs: I have been using them for about six to eight months to make my life a lot easier, and they're so easy to create that, like I said, every day I think of another use case for
(24:29):
another custom GPT. To be honest, most of them revolve around optimization on Amazon, and it's made my workflow so much easier using these custom GPTs. And now, after a couple of conversations with Michael, we're going to add some automation to some of those custom GPTs using N8N, so they're going to be on
(24:51):
steroids here very, very soon. So there are so many custom GPTs already out there, but also they're not that hard to create. So if you have an idea for one or have a use case for one, learning how to create them is super easy and they're so, so, so useful.
Speaker 3 (25:05):
The other thing that I think is important on the AI front, with regards to custom GPTs and a number of other aspects, is that one of the things people tend not to recognize as a value component, let's say, is using
(25:32):
AI to tell you what you don't know and what you need to know to move forward with something. In other words, reverse engineering a problem. Let's take the situation that Matt's talking about. Let's say you're an individual who, number one, doesn't know a
(25:56):
whole lot about how to optimize an Amazon listing, and, number two, also doesn't know a whole lot about how to build a custom GPT. You could step into ChatGPT and say: I'm an Amazon seller and I want to update my listing and improve the click-through and conversion rates on my listing. How do I do it?
(26:16):
What are the things I need to know that I don't know? Let the AI train you on what needs to happen in order for you to be able to fix that. What are the things I need to know? What are the things I would need to change? It will tell you those things, and then you can back up again and say, okay, how do I learn that, or how do I do that, or
(26:40):
how do I whatever. So you just keep stepping back as far as you need to, until you reach the point where your knowledge foundation is enough to move forward; you just keep backing up until you have the knowledge base you need. So, again, same thing with the custom GPT.
Let's say the AI has now told you: these are the things that, generally, you'd want to pay attention to in
(27:02):
order to update a listing for your customer avatar and your product category; these are the things that you would want to do. Okay, how do I do that? All right, this is how I do that. Then: all right, now I want to build. Now you tell ChatGPT: all right, I want to build a custom GPT that will help me do that. What would be the prompt?
(27:24):
What are the instructions that I should put into that custom GPT so that it will do what you just told me are the right things to do in order to update a listing? It'll write a prompt for you. Then you can ask it how we would improve on that prompt, and it'll write a new prompt; it'll improve on itself. What sort of documents should I upload to the knowledge base
(27:44):
for this custom GPT? Where would I find that information? How would I put the documents together?
AI will walk you through the entire process of building that custom GPT if you don't know how to do it. So don't get intimidated by the process. You will learn a lot just by asking AI: how do I do this? What do I need to know in order to do this?
(28:05):
And just let it walk you back until you reach the point where you're like, okay, I know enough to move forward from here, and then just do it.
Speaker 1 (28:13):
So I think we've spent quite a bit of time talking about how, and I feel like all three of us feel this way, AI is definitely a big value add for brand owners that are building on Amazon and beyond. I do want to touch a little bit on what we feel is the hype out there, where AI gets overhyped.
(28:35):
And I'll start with the first area where I feel AI gets most overhyped, which is probably the PPC or ads realm. I still feel like there's this pipe dream out there where people say, well, I'm just going to put in my brand and my product, and AI is going to magically do all the work and adjust all the bids and that type of stuff. Now, there's a lot of good machine learning and there's a
(28:56):
lot of things that AI can do around analyzing data and doing a lot of that work for you. But when it comes down to actually steering campaigns and that type of stuff, I still feel like there's a lot of value in having the human in the loop rather than just kind
(29:17):
of handing it off to AI when it comes to Amazon ads, at least as of right now.
Speaker 2 (29:21):
Yeah, I think AI can do bid optimization. Essentially, bid optimization is a math equation that computers are good at. But I've used a couple of tools for Amazon PPC management, and where I would second-guess using AI is things like keyword harvesting, for example.
(29:41):
It does keyword harvesting just based on rules, and if a keyword converted in a research campaign, it'll automatically create a manual campaign for it. And when I did that, it just went nuts. It went absolutely nuts and added a whole bunch of irrelevant keywords. So I think having a human in the loop in terms of advertising is super important. I don't know if that's going to go away anytime soon, but bid
(30:02):
optimization, I think, is something that you can use some sort of a tool for and be pretty confident. But yeah, human in the loop when it comes to advertising, for sure, at least in the short term.
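To make the "bid optimization is a math equation" point concrete, here is a hedged sketch of the kind of rule a tool might automate: nudge each keyword's bid toward a target ACOS, capped so it cannot swing wildly. The target ACOS, the click threshold, and the 15% cap are illustrative numbers, not settings from any particular tool.

```python
# Sketch of a rule-based bid adjustment toward a target ACOS (spend / sales).
# All thresholds here are illustrative, not recommendations from any tool.

def adjust_bid(current_bid: float, spend: float, sales: float,
               clicks: int, target_acos: float = 0.30) -> float:
    if clicks < 20:
        # Not enough data yet; leave the bid alone and keep collecting.
        return current_bid
    if sales == 0:
        # Spend with no sales: cut the bid sharply.
        return round(current_bid * 0.75, 2)
    acos = spend / sales
    # Scale the bid by target/actual ACOS, capped at +/-15% per adjustment
    # so one noisy week cannot swing bids wildly.
    factor = max(0.85, min(1.15, target_acos / acos))
    return round(current_bid * factor, 2)


# Example: a keyword at a $1.20 bid with $48 spend on $100 of sales (48% ACOS)
# gets pulled down toward the 30% target, limited by the 15% cap.
print(adjust_bid(1.20, spend=48.0, sales=100.0, clicks=120))  # -> 1.02
```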
Speaker 3 (30:15):
I think the other one is, well, in a general sense, I would say any time you see a new hype video come out or things like that, you're seeing the best-case scenario of what that AI can do. And I can tell you from experience, for virtually every
(30:36):
single video that I've seen where somebody was like, look at this really cool thing, I tested it, and if I pulled back on the expectation a bit, maybe 70% of what they said was real, then yeah, it can do that. And if 70% of that is useful to you, then great, that's
(30:58):
perfect.
But recognize that in almost every case, the things you're seeing people do with AI are a very specific use case. It does not generalize well to other use cases, other products, other things. It was able to do that one thing really, really well.
Now, in three to six months, it's highly likely that whatever
(31:22):
it was able to do in that very specific instance, it will be able to generalize and do better in a whole bunch of other instances. So I guess the one caution I would issue is: don't ignore the hype,
(31:45):
just recognize that whatever the hype is today is probably reality in six months. So be ready for it. Make sure you understand what it was that it was able to do. If you're looking at that thing and you say that's something that would really be valuable in our business, take a step back and make sure that's true, that it's not just a shiny object, like, oh wow, that's really cool. Just because it's cool doesn't mean it will be valuable in your business. So make that evaluation first. But if you believe it would be, if you think that it would pull
(32:08):
some lever in your business that right now you're not able to effectively pull, then keep that in the back of your mind, write it on a notepad, put it in Asana to come back and check it in three months or something, because there's a good chance that in three to six months you will be able to apply it to whatever it is you want to. Things are moving that fast, but today it's probably not going to generalize to what you want.
(32:29):
So that's one of the things, and I think video is one of those, right? Like, you can create some pretty impressive eight-second videos, but for very few of us is an eight-second video going to be very useful. Now, we might be able to piece together three or four or five eight-second clips with a couple of little clips
(32:49):
that we already have of our own video and create a longer video that maybe we could do something with. But again, if you want the right person, or you want your product in it, or things of that nature, video is not there, and it won't be for a little while. So there are definitely some areas where the hype is unjustified in the moment, but you should be paying
(33:11):
attention to it. And one of the things I would actually suggest people do is create some sort of AI workflow, agent, custom GPT task, whatever, and there are a bazillion tools out there to do something like this, that keeps tabs on what is happening in AI out there
(33:34):
that you might not know about because you don't have the time to be researching it 24/7. What are the new things that are happening that apply to you? Make sure that you're getting a digest of those things, maybe a summarization of what each one is and a link back to the reference, so that you have the opportunity to keep up on those things and you're ready for them
(33:55):
when they really are prime time.
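As one hedged example of what that kind of "keep tabs on AI" digest could look like as a simple script rather than a no-code workflow: pull a few feeds, have a model filter and summarize them against your interests, and send yourself the result on a schedule. The feed URLs, the interests list, and the model name below are placeholders.

```python
# Rough sketch of an AI-news digest: fetch a few feeds, ask a model to pick out
# and summarize the items relevant to your business, keeping links for reference.
# Feed URLs, the interests list, and the model name are illustrative placeholders.
import feedparser
from openai import OpenAI

FEEDS = [
    "https://example.com/ai-news.rss",         # hypothetical feed URL
    "https://example.com/amazon-sellers.rss",  # hypothetical feed URL
]
INTERESTS = "Amazon listing images, PPC automation, custom GPTs, N8N workflows"

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def build_digest() -> str:
    """Collect recent headlines and have the model turn them into a short digest."""
    items = []
    for url in FEEDS:
        for entry in feedparser.parse(url).entries[:5]:
            items.append(f"- {entry.title} ({entry.link})")
    prompt = (
        f"I'm an Amazon brand owner interested in: {INTERESTS}.\n"
        "From the headlines below, keep only the relevant ones, summarize each in "
        "one sentence, and keep its link next to the summary:\n" + "\n".join(items)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    print(build_digest())  # pipe this into email or Slack on a daily schedule
```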
Speaker 2 (33:58):
John, before you wrap up, just real quick: Arif has been commenting, and he had a question about what Michael said about split testing using AI. So I just want Arif to know, I'll have Michael go in and answer your question in the comments of this, just so we make sure we get that covered, because he's been asking a couple of questions in the chat.
Speaker 1 (34:17):
Perfect, yeah, and that's the advantage. Most people listen to this kind of after the fact, but the benefit of what we've started to do is that at around 12:15 Central time we post live on LinkedIn. So this is a good example of why to show up and listen live on LinkedIn, because we will answer your questions live
(34:41):
as a part of the podcast.
So that's awesome.
Thanks for bringing that in, Matt. And I do think this is a great place to kind of wrap for this episode. I think we've covered quite a few things that, again, I think are really helpful for brand owners. And, Mike, I think you hit the nail on the head as far as: yes, a lot of the things you're going to see out there are best-case scenarios of how these AI tools are being
(35:01):
used, but really they're a good indication of probably where things are going to be in 6 to 12 months, and if you can get at least 70% of that value out of it today, it's probably worth spending some time on. And then what I would wrap up with for my piece is the one piece of advice I'd have for listeners and brand owners who are like, hey, how do I get more involved in this AI and
(35:23):
that type of stuff? Whatever problem you're facing in your business, take a little bit of time to look on YouTube. Somebody's probably built an AI solution or a custom GPT or something like that for it. Use an AI tool in order to solve that problem, and you'll probably find, a good portion of the time, that you can find a
(35:44):
solution, or at least be pretty far down the path to a solution, from AI tools or even a custom GPT that already exists out there, completely free. So I would just really encourage brand owners to do a little bit of that research into how can I leverage an AI tool in order to solve whatever the problem is that
(36:07):
you're facing in your brand right now.
Speaker 2 (36:18):
Any other last pieces of advice from you, Matt or Mike? I'll just reiterate custom GPTs. If you're not familiar with them, if you haven't used them at all, go inside of your ChatGPT right now and click on the little Explore GPTs button, because there are literally thousands and thousands of them that exist. For any question, any specialty that you're looking for, there's already a custom GPT out there. So if you haven't played around with them yet, definitely do
(36:39):
that when you hear this.
Speaker 3 (36:40):
Yeah, and I would say, as you test them, if you don't like one, test a different one. Oftentimes there are five or 10 different custom GPTs for what it is that you're trying to do, and if none of them are right, build your own. I mean, as Matt said, they're not as hard as you think, especially if you allow AI to help you reverse engineer the process. And then I would also just come back to: if you're not
(37:03):
using AI for anything else, use it for deep research (it's really useful for that) and use it at this point for image iteration and split testing. Because, honestly, if you can take an image and, in 30 seconds or a minute, create a new image that just changes out the person, or the color of their shirt, or puts a dog in the image or
(37:23):
whatever, and you can test that, I guarantee you're going to find small incremental improvements in your CTR and CVR. And if you just keep doing that, that alone, if you did nothing else in your business, is going to improve your sales.
Speaker 1 (37:36):
Well, I think that's a great place to wrap for today's episode. Thank you, everybody, for listening, and we look forward to seeing you next Tuesday on another Tactics Tuesday.