Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
What's up everybody, Midjourney Fast Hours podcast, episode 55,
the week after Halloween, we're back.
Rory Flynn, Drew Brucker. Let's get this out of the way up
front. Go ahead and like and subscribe
and. Tell your your.
Mom, your friends, your grandma,your dog, your cat, you know
(00:20):
anything weird? Any any weird?
Any. Anyone or anything?
How about... does that cover the dog walker?
If you have dog walkers — dog walkers need love too.
So someone that you see on the subway or on the train or on
just riding your bike. Just be like, let them know.
Just that's us. Just let them know I.
(00:40):
Mean I should check this thing out?
55 episodes in Rory Flynn. I mean, I was just talking to my
wife about it last time. I was like, I can't believe 55
episodes in. I mean, we've just, we've been
doing this for a year and a half.
It's been a long journey. It's flown by at the same time,
it feels like we've been doing this much longer than that.
(01:04):
And I feel like we've talked about so, so many things.
Obviously, like if you guys havejoined us recently, right?
Like we, we kind of keep you up to date with Midjourney.
We talked through sort of like the changes, the updates, the,
the things that are coming. And then we also couple that
with sort of like the broader picture of things happening in
the AI visual space. But just a reminder, there are
(01:29):
some super, super deep dives in the earlier episodes where we go
into like The Thick of It, wherewe go into levels that not a lot
of people go into in these specific, you know, all the
parameters, how to use each one,micro decisions that we make,
you know, prompt theory and structure, like so many good
(01:50):
things. So just I just want to throw
that out there reminder that those things are still out
there, still very, very relevant.
So go ahead. We should also say that when we
started doing this, there wasn'ta new major release from every
tool every five days. It was like we could go a couple
weeks without something really happening and we'd like touch on
(02:11):
it, but that was when we were that's when we were deep diving.
And it's like now this is this is like real time what it's like
with Drew and I trying to keep up with everything while we do
this professionally and then tryto also like continue to bring
content here because there's so much to talk about.
And it's like, we used to just be like, we might go six weeks
without something major. Now there's no, you don't go six
(02:33):
weeks without anything, you know, six days without something
major. But everything got good.
Like, that was the thing: everything was, everything was
mediocre, and now it's good. A lot more
People talking about it, I mean,like at first it was just a
select few people talking about it. I mean, all those things are
true. As is tradition, we always like to lean into Midjourney
first. So maybe we start there.
(02:55):
There's not a lot of changes or updates since the last time we
talked, Rory, so maybe we just touch on kind of like
what happened with office hours and then what's coming next
week, because they did say therewere some things coming next
week. And then we kind of, you know,
zoom out, we can talk about someof the other things that are
happening in the mix. So.
Let's let's do it. Let's see what's going on in the
(03:16):
Midjourney-verse. Let's do it.
So office hours this week a little bit different.
So in case you guys ever want tocheck out office hours, this
happens every week on Wednesday. I think it's noon Pacific Time, 3
Eastern. I don't know where in the world
you are, so just kind of adjust accordingly, but that's
(03:37):
available and open to anyone. If you want to, you know, get in
there and access that. They're streaming that both on X
through the Midjourney account and they're also doing it in
Discord for the, for the OGs. So you can access it either way.
This week was a little bit different though, because they
invited like their developers on and they were leading it.
And so it was a little bit different of a format, little
(03:59):
bit different questions coming in. A lot of people were asking and
just kind of suggesting bug fixes or do this.
Could you do that? Blah, blah, blah.
One of mine that I was just, I've been thinking about a lot
lately is just like an easy way to like, like photos in the
create screen. So like if I'm just creating
(04:19):
something, you can right click and like, but honestly, this is
going to sound very like complaining of me.
That's too much work. I don't want to right click and
do that. Like I, I just want to be able
to just click like click like boom, boom, boom, boom, boom.
Cause a lot of times you'll findyourself in this sort of screen
(04:39):
here. And I love that they, I love
when they added this, you know, subtle and strong variation
here. That was awesome.
But I think, you know, just adding that would be a huge,
huge fix. But there were a lot of like
things like that, people asking about, hey, could you do this?
Could you do that? The developers were making notes
of those things. There's really not much else to
to say besides that. I would say maybe the one thing
(05:00):
Rory, that, you know, was interesting for, for us in the
audience is this one right here where we're just talking about
active development of web profiles in the new style tool
creator system. No immediate releases expected
this week. So that was on Wednesday,
multiple feature drops and updates anticipated for next
(05:23):
week. OK, So next week sounds like a
week where we might have something and we'll be able to
talk to talk about it and share with you guys next week on the
podcast. So that's kind of where we're at
there. I mean, like very, very short,
the office hours just was very, you know, question voice
oriented, very conversational with the dev team.
So it wasn't your typical sort of office hours that you might
(05:46):
get with with the depth and withDavid and sort of like the more
I would say like interesting, interesting topics.
It was more or less like what's on your mind?
What have you been wanting? What's not working well blah
blah blah blah blah. But that's that's good because
they they've been I feel like the last release was really like
style creator. I mean, or style style explorer
(06:07):
feels like it feels like a whileago.
It's probably. Does feel like a while ago.
Yeah, yeah. But that's, you know, they were,
they were on like a rapid pace after V7 and it was something
every week and they were being talked about all the time.
So I think it's probably time that they kick that back up
because it's got to be they've been working on something,
right? Like have to at least because
(06:29):
it's been, it's been a while. Since when when was V7 released?
I don't even remember it was. Yeah.
When was V7 released? We had to go back in.
June, something like that. God no.
Has it been? Has it only been June?
Well, let's. See.
Feels like it's been longer than that. Quick
look: when was Midjourney V7 released?
(06:51):
I don't know if you're using, I don't know if you're using any
of the AI browsers. I really like Comet.
It says April 3rd, '25. OK, so it was April.
OK. Yeah, I mean.
So it's about time. So I mean, if they don't do
7.1 and they go to V8, by the time V8 comes it can be basically 10
months. Yeah, I mean.
(07:14):
That's a long time for a model update, considering the
way they were moving before. They've added a bunch of stuff,
right? Like it is.
It is. It was those micro updates that kept us happy for a long
time. You know, there's,
there's a few things I feel like I'm, I'm confused about — what
comes next. They, you know, the, the I'm
(07:35):
interested. I'm, I'm probably the most
interested, outside of V8, in the new UX/UI. Like how much is
that going to change? You know, like, am I going to
have to relearn a lot of stuff or is it going to be relatively
simple? They've never had a problem with
how they've done it, but just curious of what that's actually
going to look like because thereis a lot of stuff now.
Because this does feel like a — like, there's other tools where
(07:56):
it's just like you prompt and hit enter and this one is like,
it feels like a whole creative studio in here now, which isn't
a bad thing. I just, I'm just curious how
they're going to organize it. Yeah, yeah, I think I'm super,
super curious about the UI stuff.
I mean, because, you know, a lot's changed in terms of like a
workflow with how people are generating images now, even
(08:20):
videos, right? So how does Mid Journey adapt to
that? And typically when they touch
something, they'll, they'll giveyou like AV1, but then they
won't come back to it for a while, if if ever, you know.
So I, I hope it's, I hope it's good enough, right, to
accommodate for more of like theworkflow efficiency, connecting
(08:40):
the dots without, you know, jumping over to different tabs
or screens and this and that andthe other or even tools and, you
know, just kind of pushes, pushes things forward in that
next phase for us. So here's what's interesting,
right? What has what has Rory been
complaining about for the last couple weeks on this podcast
with mid journey? Is that number one, we just need
(09:03):
something that's like nano banana.
If you look at this leaderboard here, how many of those problems
are solved by a by a tool like that that?
Came up in one of the questions and that was one of my questions
the previous week on office hours too.
So I, I really hope that that isthe case 'cause you're right.
So look at this. Multiple angles of a character
(09:23):
pose of a character coherent or improve as in multiple
consistent characters, more advanced facial expression
controls multiple consistent characters at the same time.
All things that nano seed at it Reeve, ChatGPT even which not as
quality, but like yeah, you're right.
All. Of these things that they can do
already. So that's what everyone wants.
(09:45):
So just build it, if anyone's listening, not just not like I'm
saying, bro, just just go build it in five seconds.
It's just like that's what the market wants.
That's what we all want. It's it's, it's very, it's a
very simple because it's not like, oh, it doesn't have to be
like the anti marketing thing. Like we don't want marketing and
businesses in here. It's just like people want to
(10:05):
tell stories. They want to be able to play
with their characters and they want to play with their
environments and they want to beable to.
Do ultimately you want people touse it the way they want to, you
know? Yeah.
I mean, you can't totally control that.
I, I forget this is in here. Need to go back in and start
rating some of these things again.
It's been a long time this you know, this is in tasks.
Anybody can access this? Oh, wait, go back one second.
(10:28):
Hold on. I was going to say that's a
world building in there. I saw a little thing that said
world building 'cause I was thinking about that.
Remember, remember that release?I forget, even forget when it
was like their little. It was like the.
I forgot what they even called it.
It was like patchwork maybe. Yeah.
But it's like that just disappeared.
(10:49):
That was like a quick, yeah, that was a quick flash in the
pan, but it was on the same track.
It was on the right track. You know, like that's where
everyone's ending up now, right?Is that sort of like one step to
the next to the next in a visualspace or a canvas, right.
Like that's sort of like the idea.
We'll talk about that later. Yeah, I think like some of this
stuff too is I feel like right, Like some of this stuff is
(11:14):
already doable. It's just the problem is it's
not very fun and easy or accessible to do right.
Input a scene images to create images in that scene.
Sounds like. Yeah, or or even advanced
facials. I mean, you can, you can get
great facial expressions if you if you actually do it.
(11:37):
Most people don't do it. You know, I just did a whole
post on that. People just don't do facial
expressions. I, I, I never see it, but they
can be done. I mean, I, it's not perfect.
I will say that there are some that just don't work that you
would hope would work, but thereare plenty that do work, you
(11:59):
know, So I don't know if I can maybe show this really quickly,
but that's always one that I'm focused on, which is, is that in
here or here? I think it's in this one
expression. I mean, you can get these all
(12:20):
day, right? Like it's just people don't do
it. People don't do it.
And if you don't have these ideas, just go, you know, a
couple prompts with an LLM, you can get plenty of ideas to test
and then test which ones work right?
Like keep the prompt maybe simple at first just to see if
it's going to get it. And then you can, that gives you
(12:41):
then the, the freedom or the sort of accessibility to be
that, OK, now I'm going to like work this into a longer prompt
because I do know that it works right?
So that would be my advice when you're testing any of this stuff
is like start with a very simpleprompt coupled with what you're
testing. See if it works first, then you
can. Then you can sort of play with
(13:02):
the fact that it can be includedin a longer, more detail or more
complex prompt. Totally, 100%.
I think that's like I said, likeif you look at all these things
that people want also, it's justlike I want to be able to save
an image that like not save in terms of like download.
Like if I like an image but it didn't have the facial
expression really like I really like it.
(13:23):
Like I just want to be able to edit that exact image.
I don't want to like I don't want to go and try to recreate
it and get it and run gen after gen after gen to get
the same thing. It's like I just want to do what
I do with Nano or, you know, SeedEdit or any of those
other ones — Seedream — just, like, do
something with this exact thing.Don't don't reinvent the wheel.
(13:44):
I don't want to reinvent the wheel.
Just want to change the facial expression of this person.
That's it. Change the color of this car.
That's it. You know, like that's it sounds,
it sounds. It's time dude.
It's it's time though, because remember we were, I mean we were
force feeding things in. Remember we would go like into,
(14:05):
you know, you'd go into like an image and you know, you'd be
like, OK, you know, let me, let me edit this and you know.
Image reference it in a red jacket. I was just about to say, I was
going to say, like, image reference it in, or whatever you've
got to do: text reference plus the image, like, weight it, you
(14:27):
know, and play around with the different weights.
I mean, it's just like it's time, man.
It's time, you know. I like the precision, precision,
that precision control like I like that we have all the
parameters and all the things. I just want like for certain
things where it's just like, youknow, make that guy smile
instead of, instead of... Just quick, on-the-fly
adjustments, I would say that's got to be the number one, that's
(14:51):
got to be the number one thing you would want right now. That's
all I want. Then, then it's like, why do I need to go anywhere
else right Like that? That's that's it.
Just take this, do this for that.
That's all. I I also will say this don't
like if if you need more fast hours, don't sleep on this.
(15:12):
I mean like go into tasks. I would say you could rank these
new style grids or images do like 50 to 100 and you're going
to get a free hour easy. I mean, how simple is that?
You know what I mean? Like don't don't sleep on that
either. It's just like boom, OK, like,
you know, go go through and do that.
So if, if you're, you know, forgetting about these things, it's
(15:35):
just worth knowing that a lot of people have stopped doing this.
So you don't really need to do that many in order to get these
free hours. So maybe just something to keep
in mind there too. Good point.
My, my complaining is done for today.
By the way, that was just me. I had to.
I had to get it out. Hey.
You know what? I'm not.
I'm not complaining too much because after that last week and
(15:57):
figuring out that I can use my personalization codes from V6 on
V7, I haven't looked back. Brother, I haven't looked back.
We're looking for it. I like it.
That was a good one. That was a good unlock.
And I, I have been using my old ones again and I'm like, thank
you. Thank you for this wonderful
(16:17):
gift that you've provided to me.Incredible.
Well, you want to zoom out a little bit too, like I don't
know if there's much else. Is there anything else that you
want to talk about? You know, maybe really, really
specifically into Midjourney, to remind anybody of?
I mean, there's nothing really new here at the moment.
(16:38):
No, I think I let's, let's let'szoom out a little bit.
I like that. I like that phrasing.
You need a sound effect. For that.
Yeah, man, where you want, whereyou want to start 'cause I know
that we were talking about a fewthings before we we jumped on
here. You know, maybe is it maybe we
want to start with the node-based phenomena.
(17:02):
Let's talk about the node revolution as as to what's been
going on, I think it's, I think you.
Give us the Emancipation Proclamation of the node based
phenomena that's going on right now.
It's it's sort of like what I'vealways wanted, but never knew
how to ask for it because it's just I can use everything all at
(17:24):
once in the same space and I just see it like, I don't know,
I like Midjourney — like, in regards to,
like, other platforms, right, like Midjourney versus others,
I like creating in Midjourney the best, just from a pure
usability standpoint, right, if I'm doing one thing, which is
like images and videos. Just because it's, it's easy,
(17:45):
it's fast, and it's... I know it really well. Nodes,
the other thing is, is like, I can see it all, which is what I like
too. I don't have to scroll through
like to find stuff. It's like all there.
It's like if you know, I'm putting all my ingredients on
the table and then making a mealwith it, right?
I can see that's why I personally like it.
(18:05):
But also it's just like the branching realities of things
plus sort of like history tracking, if that makes sense,
like going back to things and like it's not if anyone's
familiar with designing in Photoshop, right?
Like you can design one image, right?
And you do what? Work on one image at a time.
And then you can, when you had to like go, you wanted to go
test something. You had to like, you know, wrap
(18:27):
up all your design layers into a folder, duplicate the folder,
you know, hide it and then, you know, work on the new
thing. And if you didn't like it, you had
to unclick that and go back. It's just like a really clunky
process to like to test different things.
So yeah, I like dragging some nodes and saying like, go this
way, this way, this way, this way, this way.
(18:47):
If they all suck, I can just delete them and go back and
start from that image again, right?
So I think it's just like people, I think it's very
familiar to people, like in terms of like how they actually
create, like visually, it's here's where I started, here's
where I go, here's where I keep going.
Oh, let me try this. Now that didn't work.
Like that's, that's sort of likethe creation process in general.
(19:08):
You never just go like, here's my paragraph.
Enter. Go.
This is, I mean, this is something that we were we were
we've been talking about a little bit now we've seen this
mass like rush to implement thisinto, you know, existing tools,
right, like Runway's in there, you know. Like so, so backing up
(19:33):
right. This was ComfyUI by itself, or
seemingly by itself for a long time, but you know, it was just
so intimidating and so complex, probably overkill for a lot of
people. Very like intimidating in the
sense that it wasn't really early adopted by this larger
base yet. So it just kind of felt like
(19:54):
this, this niche thing and a different way to do it, but
didn't feel necessary at the time, at least from my
perspective. I'm just going to speak about my
perspective. And then you started to get
these... Locally too, that was the other
thing: in the early days you needed to run it locally, so you also had
to have the hardware to do it. So it
wasn't, like, browser-based — you had to have
like a decent, like a... My Mac could not handle it.
(20:18):
Interesting, OK, so I didn't know that part.
So that's interesting. And then you had right, the
Floras, the Weavys, right, that then I felt like made it way
more accessible, less intimidating, lowered the
learning curve a little bit likestill learning curve, but like
enough to get more people in there and experimenting and
(20:39):
trying. Right, 'cause there I think
there is like this barrier that if, if it's too high, people
just you got to be in this rightmindset to knock, you know, to
go over and knock down the barrier.
And if it's too high, people aren't going to come back.
You know what I mean? And so like, I feel like with,
with Flora and Weavy and, like, tools like that, they did a
really good job of making that accessible.
(20:59):
And then they had, you know, people that were on it early,
like you, that were really like just breaking it down, sharing,
you know, how you can sort of like make this a workflow and
what that actually means in terms of like scaling and
efficiency and blah, blah, blah,right?
And not having to have, you know, again, like to maybe like
Krea's credit, you know, the, the idea that you could then
aggregate all these tools in oneplace and, you know, you don't
(21:21):
have to have all these separate subscriptions, right?
And this can be a very costly thing.
So like there's been this whole evolution.
Now you've got, like I mentioned, Runway, you
know, playing this game. Now Adobe, which we talked about last
week, playing the game. Now Freepik is playing the
game, which — love Freepik. Then you've got Krea in beta
(21:45):
right now for this. And, and Rory, this could be fun maybe, because
I haven't been in Krea yet, but I do have access to the beta.
So maybe like, maybe I could share as we walk through this
and then you can sort of like narrate some of what you've seen
so far or maybe even share what you've got.
I have some I have some good stuff.
It's, you know, also just to like, before we get into this,
(22:07):
I, I wanted to — I just wanted to say something, like,
overall, general: I feel like with AI, this was like an
organic sort of like way for everything to come together, if
that makes sense, right? Like where early on there was a
few tools that we were like sortof that were like fighting for
position, which was OK, if you like image generation, it was
(22:28):
like Midjourney or Stable Diffusion or DALL-E, right?
So you had to like buy the subscription and then it was
like everyone went to that model, right?
And everyone's like, OK, we needsubscription, subscription.
And I have to have one to Runway and Midjourney.
Now I have to have one to Runway, Midjourney,
and, you know, Photoshop for Generative Fill, right?
(22:48):
Like going back to those days, then it just started to explode.
And there was so many tools and they were all good at doing
different things that I think probably this is just a, this is
my guess. You tell me how you think about
this from a business standpoint that everyone's subscription
started to shrink, because they were spreading out to a
broader set of tools — like the subscription numbers, daily
active users, kind of things like that.
(23:09):
It's broader. So then these tools, what do
they do? They start releasing APIs so
that you can get like the residuals like, oh, if I'm not
going to get subscription, then like whatever, wherever people
are using the APIs, they'll get the API hit money.
Then everyone had an API. So you see like the rise of
tools like Krea, Freepik, Higgsfield, all these tools, like,
(23:29):
aggregators that people don't have to have a subscription to
every tool. They can have one subscription
and use them all right. Then like it just became like,
what's the easiest way to use every single tool?
And it feels like that birthed like OK nodes, you use them all
in one space. You can use every single one.
You just do it the way you want to do it.
And like it doesn't have to be some pre packaged thing.
(23:51):
So that's why to me it's like this natural progression of how
it gets there and it's all sort of based on our problems, like
too many subscriptions, too manytools, right?
How do you solve that? You put them all in one tool and
then you make them all accessible in the same space
without having to make it complicated.
So that's... I just wanted to — I was thinking about this the other
(24:13):
day, how we got there.
So I think that's right. I think that's right.
I'm going to add I, I think I don't really have much to add,
but I think you you pretty much nailed it.
The, the thing that I'm also thinking about is you don't have
to learn all these different website UIs and UXs, which is also a
problem. I mean, for me, it's like, I use
tech every day, I'm in new tools constantly, but that's
(24:33):
that's also an exhausting process to figure out how things
work on each different platform.And then if I haven't been there
for a while, have things changed and moved around?
like that's a problem. So this solves that problem.
You talked about this. We've already, you know, you and
I have already talked about the subscription cost, the tab
hopping this also then I think allows now, if you don't have
(24:54):
this, you're sacrificing memorability and stickiness with
the product. You, you almost have to have
this now to be to be a sticky product because with the
velocity of the changes that come through here, I mean,
you'll be forgotten instantly ifyou don't have this because this
is going to be the way, again, to just access all the tools
(25:17):
without the UI, you know, like white label tools essentially in
the same place. You don't have to log in.
What's my password here? Do I have it saved?
Oh my God, like I need to up my credits, blah, blah, blah.
Did something move? I can't find it.
Like you don't have to worry about any of that shit, you know
what I mean? So this is this is really
interesting. And then I think just the last
(25:39):
piece of it, right? Just everybody knows this, but
it's like every tool is good at something different.
This solves that problem, right?So.
It makes us tool agnostic. That's sort of where I think it
goes, right?
You don't have to... It isn't like, what should I
choose between this tool A or tool B and like which
subscription, which one's betterfor what I need it for?
(26:00):
It's like, no more of that. You just use them.
Should I run them both at the same time,
set up the separate...? Yeah, exactly.
So, but — and Krea, to their credit, was on to this really
early. They were just like, OK, we have
our real time editor and a few other things.
Then it's like, well, let's juststart bringing everything else
in. And they made it easy.
And it was like to me, they'd sort of like started the
(26:21):
snowball effect here with the aggregator sort of sense.
But I'm I'm really liking, I'm really liking their nodes.
One thing to know about Krea nodes that I will say is that
they can do batch processing, which I think is a
differentiator now in the node space.
If you don't know what I'm talking about, meaning like, and
(26:41):
maybe I'm not calling it the right thing, I just call it
batch processing. Like, when you run nodes in
something like, things like Flora and things like Freepik,
they're, they're super usable, pretty easy.
The one thing they can't do is batch processing.
Meaning I give it one prompt, itcreates 10 more prompts and then
(27:02):
goes and creates 10 separate images.
With things like Flora and Freepik, you can give it one, you
can only give it one prompt and then it can go create 10 images
of that same prompt. It can't do variation.
Because there's no splitting, or like, splitting function to
actually — like, if you had an LLM, for example, create 10
(27:23):
prompts, you can't go into that LLM box and then create 10
different ones. I mean, you could technically,
in theory, go and have something that goes and creates a prompt
and then makes them all different, but there's no
sort of cohesiveness in terms of, like, the LLM.
It's complicated. You'll see.
We'll we'll go through. I'll show you what I mean.
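[A rough sketch of the batch-processing pattern Rory is describing: one seed image plus one system prompt goes to an LLM node, the LLM's output is split on blank lines by a line-splitter node, and each resulting prompt becomes its own image generation. The function names below are made-up stand-ins for whatever LLM and image model a node tool like Krea or Weavy would actually call.]

```python
# Hypothetical sketch of the "batch processing" pattern discussed above.
# The two stub functions stand in for real LLM / image-model nodes.

SYSTEM_PROMPT = (
    "Write 10 prompts that place the subject of this image in different "
    "action situations and compositions. Natural-language paragraphs, "
    "one per block, separated by blank lines."
)

def call_llm(system_prompt: str, image_path: str) -> str:
    """Stub for the 'LLM call' node: a real node would send the image and
    system prompt to a language model and return its text output."""
    return "\n\n".join(f"Prompt {i}: subject in situation {i}..." for i in range(1, 11))

def generate_image(prompt: str) -> str:
    """Stub for the image-model node (Seedream, Nano Banana, etc.)."""
    return f"image generated for: {prompt[:40]}..."

def run_workflow(image_path: str) -> list[str]:
    raw = call_llm(SYSTEM_PROMPT, image_path)
    # The "line splitter" node: one prompt per blank-line-separated block.
    prompts = [p.strip() for p in raw.split("\n\n") if p.strip()]
    # Fan out: one generation per prompt, instead of 10 copies of one prompt.
    return [generate_image(p) for p in prompts]

if __name__ == "__main__":
    for result in run_workflow("car.jpg"):
        print(result)
```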
Yeah, sorry for that that complex rant, but no this.
(27:46):
Is no, it's good man, this is good because I think like you
know we it's time we did a deep dive on nodes.
I think this episode is going tobe really depth focused on this
node piece, right? So backing up to the screen,
there are codes to get into the beta.
So I would say maybe go on X and follow Krea.
(28:10):
I think they were throwing around a bunch of codes.
I was just looking. So I think like, if you're
looking for something, maybe do that and then you know, just
like Weavy did, just like Freepik did, which we'll show too.
You've already got some nice little they're calling them
blueprints or you know, workflows that are already kind
(28:32):
of set up for you. So my experience in learning
this couple two things, of course, I reached out to to my
guy Rory, who spent like an hourwith me, kind of like walking me
through the basics, but one of the other great ways to learn
this stuff. Well, to I'll give you a couple
different ways to learn this stuff, but one of the best ways
(28:52):
to learn this stuff is to click into these and just like reverse
engineer them in your mind, right?
Like like actually breakdown what's going on here, right?
So like, let me zoom in on this.OK, so we've got like an
original product photo. They just added this photo in.
Now let's like see what they're doing inside of here.
OK, So you know, they're basically taking this.
(29:18):
Well, they created this photo, it looks like, in Krea.
Here's the prompt for that, right?
And they did this in, you know, in nodes here.
So right here, this is my first time looking at it looks like
you've got some other settings here, you know, aspect ratio,
style, blah, blah, blah. Here's the option to run the
(29:40):
node so you can run that. Whole thing.
Yeah, and you can also run the whole thing which basically
connects all of those dots. But then they generated this,
right? Then they then they basically
opened up another thing. This is Nano Banana.
You can see all the sort of likemodels here and it's taking this
replace the sneakers with the teal colored ones, right?
(30:00):
So then it took this original image, connected that over here,
connected this here. So it's taking the pose, the
background, everything about it.And it's just swapping out the
shoes, right? Then it's connecting this one
again, add a white title in a thin old school serif font
saying run with style, just connecting the dots, right?
(30:21):
Like it's all it is. Is this dragging here to here
right, so that's all that's happening here and then you have
these other settings available here.
This does look pretty sharp. I'm I'm excited to like jump
into this a little bit further. And then they did this nice
little before and after thing, which is pretty cool.
(30:42):
So. I like this little piece right
here for the for the upscalers Ilike.
I like the slider in the node. That's good.
It's a good little, good little UX/UI sort of piece right there.
Love that, super cool. Not available in any other tool
so. So I mean, that's pretty, that's
pretty sweet. And then like, let's let me see
if I can like, oh, here we go. OK, so then we go here.
(31:09):
What do we do here? We took this: isolate the
sneakers, separate them, make them look like they're floating
in the air. So this is like a text prompt
coupled with these shoes. They took this, they said
elegant product ad for two sneakers floating mid-air,
sneakers should fly into the... OK, so this is a video, so we can
click this, boom, right. And you're just connected, like —
(31:30):
it's just a simple connection, right?
So if you're, if you haven't used this yet, you're just
dragging this here, you're goinginto settings, you're selecting
the duration, blah, blah, blah. And I would say even here, like
I just right clicked, this is where you would add, you know
what you're going to use for this.
So it looks like they've broken this out by image, video,
etcetera, right? So they just did this or maybe
(31:53):
that's Enhance. So that's not actually it,
that's for upscaling. Let me see.
You can also search through the nodes.
You can just type in there too, up at the top, whatever. Ooh,
good one — let's see, do they have Higgsfield in here,
for example, maybe? Yep...
no. Nope, they
don't. No Higgsfield. OK. They do have... hell yeah.
(32:16):
OK, so I mean like you can select the any any of these they
have Sora 2 in here blah, blah blah.
You would select that that's going to give you this box and
you're going to have the same option, right, with the settings
based on the model you choose might have slightly different
settings, but essentially you'rejust connecting these things.
And then Rory, you know, like what makes this so great?
And I know you can speak to this, which which is just the
(32:37):
ability to then change somethingvery quickly and have it affect,
you know, the entire workflow for scalability and efficiency.
And maybe this isn't the exact best example here, right?
But it's essentially like, hey, maybe I just want to swap out
these shoes at the beginning andmaybe it does 80% of the work
(32:58):
for me. I might have to go in and adjust
like a few quick things. Maybe it's I want a different
color, for example, or whatever on the background.
But then it's like, I can, I cando all that.
What's what's really cool though, are the deeper examples,
which you and I have talked about, which is like, you know,
you're getting into something where, you know, you're talking
about all the different camera angles too, right, where you're
(33:19):
getting these new shots and blah, blah, blah, blah, blah,
blah. So yeah.
The one and then one thing to think about with like workflow
building, right? When you look at this one, this
workflow is very specific to this individual use case with
these pair of shoes. So like if you want to go and do
this just to be like project wise, like I think there's,
(33:40):
there's two ways to think about this, like individual projects
and then like building workflows, like this is an
individual project, right? You can't just drop in a
different pair of sneakers and the whole thing runs how it
should. Like, a, a workflow would be
built a little bit more generic, if that makes sense.
So I drop in any pair of shoes and then it does this, these
steps, regardless of what the shoe looks like.
(34:02):
So you know where they're sayinglike teal background and run
with style, like those are all individual to this specific
image. So if you're building a tool,
you'd have to think about it a little bit differently, like a
workflow to be utilized over andover again, if that makes sense,
because it can't be. That's a great.
point. It's, it's specific but generic,
like flexible and versatile enough.
So instead of teal shoes, it's just like you're just saying
(34:24):
shoes, right? And then maybe you've got
another node box where it's justmaybe that's where you would
change the color, right? Or you know, whatever it is.
Or maybe that goes back all the way to maybe this system prompt
that you have at the beginning that has a describe, you know,
everything that's that's there based on that particular output
(34:47):
or outcome that you're looking for.
And then that makes the rest of the workflow a little bit more
automated, right? Like that's what I've seen you
do in the past too. So that's that's where we get
into, and I can show — I built, I went nuts on Krea for like 25
minutes. Oh well, yeah, show it, man.
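[A rough illustration of the one-off project versus reusable workflow point above: the product-specific details get pulled out into parameters so the same set of steps can be rerun for any product. Everything here — names, fields, prompt wording — is an invented sketch, not any tool's actual API.]

```python
from dataclasses import dataclass

@dataclass
class WorkflowInputs:
    """Parameters that change per run; everything else in the graph stays fixed."""
    product_image: str          # path or URL to the uploaded product shot
    subject: str = "sneakers"   # generic subject, not one specific pair
    accent_color: str = "teal"  # swap color without rebuilding the graph
    tagline: str = "Run with style"

def build_prompts(inputs: WorkflowInputs) -> dict[str, str]:
    # One-off project: "replace the sneakers with the teal colored ones".
    # Reusable workflow: the same steps templated over the inputs.
    return {
        "recolor": f"Replace the {inputs.subject} in the reference image "
                   f"with {inputs.accent_color} colored ones.",
        "title":   f"Add a white title in a thin, old-school serif font "
                   f"saying '{inputs.tagline}'.",
        "hero":    f"Elegant product shot of the {inputs.subject} floating "
                   f"mid-air, studio lighting, clean background.",
    }

if __name__ == "__main__":
    # Same workflow, different product: only the inputs change.
    for item in (WorkflowInputs("shoes.png"),
                 WorkflowInputs("bag.png", subject="handbag", accent_color="red",
                                tagline="Carry the day")):
        for step, prompt in build_prompts(item).items():
            print(f"[{step}] {prompt}")
```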
Because these are just, these are just the Krea examples.
Let me stop sharing here. Yeah this was this was me like
(35:08):
the first thing I went to with Krea.
I was like, let me, let me see if I can do batch processing,
which I can. So I was pretty amped about
that. Delete this and this here.
But like this is me like, oh, I can take one image and then just
create whatever from it, right? And I wanted to just I wanted to
(35:29):
auto prompt it. And then I have actually, this
is a very unique feature for thevideo side of things, which I
was liking too. But like essentially because you
have like this, it's called an LLM call, meaning you can just
use an LLM. But as you'll see in like the
LLM call, it's basically going to generate it's going to use
like a custom GPT, right? Like I'm giving it a prompt, I'm
(35:53):
giving it the input, like the image input.
And then I have the system prompt, which is basically
saying: analyze the image, try to recreate, you know, the image as
close as possible, make it as detailed as possible, with
a 200-300 word range, written in a natural-language paragraph,
right? So up here it'll create — once I
run this, it'll create 10 prompts for this car.
(36:15):
Like, I didn't... see how generic it is?
Like I'm just saying 10 prompts to this image in action
situations from different compositions, right?
I could use this for any car nowbecause I'm not saying which
specific car it is. I'm just using it as car
situations compositions. And then it's coming up here
like low angle tracking shot, you know, Twilight.
Yeah, you could even say subjectinstead of car, right?
(36:38):
100% that makes it really scalable, right?
So that's, like — that's the whole thing I'm talking about,
about making it generic. But this is the, this is the key
piece, right? This is what Weavy has: this
line splitter. Essentially, what it's saying is,
like every time there's a line break, like every time, OK, we
have prompt one and then let me see if I can expand this.
(37:01):
Can I? I don't think I can.
So like after prompt 1, you see there's a paragraph and then
there's a line break, right? So that's 2.
So that's how it knows with the line splitter what to split.
So it's like, OK, prompt one, then all right, break.
And then prompt 2 is a different one.
So you know, I have this runningto 10 different prompts and then
(37:23):
I actually ran these. I don't know if you've played
around with it — I, I really like Krea 1.
It's not like it's not going to translate this one to one, but I
do like some of the Krea 1 aesthetic.
So, like, I just ran it into Krea 1, just because I wanted to
check, you know, just wanted to play with Krea 1 for a little
bit — but basically image prompting,
nothing, like, not like... just using the same subject here.
(37:44):
So these nine were Krea, and then I used SeedEdit or Seedream
to actually place the car in different situations, just
running these and getting different examples, right.
So all of these are attached. Just I can I can do this all in
one click now if I wanted to. Just I was going to say, do you,
do you mind like running like, you know, show these people too,
(38:07):
like if they've, if they've never done this, like how do you
then run all these images again,you know, like how do you create
new variations of these really quickly?
Yeah. So if I wanted to run this whole
prompt again, right, I'd just goand select the whole thing, see
and then see how it just says run nodes up here, run nodes.
(38:30):
And hopefully this works when we're on live.
I don't know if it will boom. Boom, boom.
So everything's running like those.
All of those images now that he's got there are generating or
they, they they should be. That's the thought and it's
possible. I I think you're doing that
right would be my guess. But that's that's essentially
what we're doing here. I mean, this is so going back to
(38:51):
that, that really important point that you mentioned at the
beginning, which is like making you're going to have maybe
specific use cases to you. But if you make things a little
bit versatile upfront, now all the sudden you don't, you know,
Rory doesn't have to go back and look at every single box, you
know, an update text and this and that.
Now we know that it's versatile and flexible enough and specific
(39:12):
enough, yet vague enough to thenjust run this and then
hypothetically, right — I don't know how many images are
here. Let's say 6... 24 images, maybe, if
my, if I can still do math, which I was never good at math,
but here we are. And now you're, now you're going
to get 24 new images, right? And you know, you don't lose the
(39:33):
other images that you had. They're they're in there,
they're scrollable, right? So, like, Rory, zoom into one.
So you see five of five. Those are all things he's
already run in that. So, you know, you can try and
test, Hey, can I get a better image than this, you know, and
blah, blah, blah. You don't lose what you've
already created there. So it's there for you.
(39:53):
So this is like me running — so this is like Midjourney
running permutations, right? Almost, but like at scale, with
much more differentiation, right?
So like that's sort of like the idea of here, but this is the
this is the cool part. This is again another I think
you can probably do this in comfy, but you can't do it in
most of the other tools. So here it's like I'm using the
same sort of system prompt, except I'm saying create video
(40:15):
prompts and then I'm attaching the images for, you know, it can
add up to five images here. So I attached four of them.
So it's taking those four imagesindividually, writing 4 prompts
for them, and then creating videos automated, right, based
on the image itself and the system prompt.
So system prompts basically saying look at the image, write
(40:38):
the prompt. That's basically it, right?
So you know, these are, these are automated.
So you don't, you never know what you're going to get.
I didn't look at any of them before I ran it but you can see
like some of them not bad like this.
This one's pretty... is this a van?
Is this a Vans style? Is, is that what I'm picking up
on these? Van... OK. Yeah.
(40:59):
I just picked this. I saw this car on Pinterest.
I was like, that's cool. I like the way.
That was I thought I saw that. Hey, you know, dude, one last
point on that. When you were showing the
videos, you know, you know it said that you could only connect
five there, right? But like you could then just go
back and clone, duplicate it andconnect five more, right, guys?
(41:20):
So like that's how this stuff works.
You're not you don't have to then go and change all those
again. You just copy, clone that whole
little section, which Rory's shown you already where you can
just drag clone it and then justreconnect the new nodes right to
the to the other five that you want.
And all of a sudden you went from 5 videos to 10 videos,
right? Like all of this is sort of
built out like that. So this this looks intimidating,
(41:43):
right? Like this is very intimidating.
This whole node situation. I built it in Weavy and it can
be very you know, it'll be much stripped down, right?
Like this is the same same workflow, same system prompt,
same everything. What's going on here is you have
the image, your prompt system prompt same way it is here as
(42:05):
image prompt system prompt, right?
Once it runs, as you can see like in the LLM here, same
thing. It's just going to create 10
prompts for us. Then this goes to the array,
which is, you know, split each one of them based on, you know,
the individual prompts. They also have this thing called
the text iterator in Weavy, which is like, instead of doing
(42:26):
it by the list format, you don'thave to go and break out like
into each individual. Like if I want to do like list
selector, like if I want to do this, I don't have to go and,
you know, create ten of these tohave 10 different prompts.
You can just run it into one of you can just run it into one
node and they'll all generate right.
So if I just go, you know, hit run, it'll start to generate the
(42:49):
10 prompts that are already here.
So this can be stripped back a lot, right?
Which is yeah. So it.
Doesn't look. Maybe it doesn't look so
overwhelming. Yeah, because then this is like
if I'm building an app, right? Like there's something to share
with someone that they're just going to use it this way, right?
Like this is so much easier now than than doing it that way
(43:10):
because I can let me just lock this up here.
We'll make this app a little bitmore visually aesthetic for
everyone. So like this is what you can do
on Weavy that the other things, you know, other tools, you can't.
You can use it on Comfy, but, like, again, I just hit run and
then you're going to see it here. 20 are just going to
happen at the same time just based on sort of what I asked
(43:30):
for, and we'll get decent stuff that comes out of this, based on the
same exact, same exact prompts from — what's it called — from
Krea. It's just... it's, look at
that now. I feel like this is what it
comes down to, right? It's like who makes things look
the best and who makes things the simplest 'cause if
(43:51):
everyone's got everything and you can use every tool in one
node-based creator, then it's like who makes the system the easiest and
the cleanest, right? The easy, the like, the most
simple to use. So this is what I love about
this thing, you know, it's just,it's just awesome.
But you know, that's, that's also radically different than
(44:12):
Freepik, right? Like Freepik is also
different. So have you been?
Have you tried Freepik's nodes yet?
Haven't tried it yet. I haven't tried it, but I've got
I've got it up. What's like, let's get into it.
Here's me with the cowboy hat. I just, I had to show you that.
You know, hold on, wait. I felt like you needed to see it.
These are fun by the way, dude. I mean like you can create any
(44:35):
image you want in here, like I'll even take I'm even taking
some images that I'm doing for like clients and then I'm like,
hey, like what is the prompt forthis?
You know, it's like more contrast saturation, more analog
film look right, like harsh flash type, you know like so I
mean this is possibly something that you could scale in Spaces,
(44:57):
too, or Weavy for that, or Krea for that matter.
It doesn't matter, 'cause they all have Nano Banana, right?
But like when you like this is avery big need too that I see
from clients. It's like hey, look like this
picture looks too perfect. I wanted to like feel like
somebody took it. You know what I mean?
Like that looks significantly different than maybe what I had
(45:19):
in there. But let's go, let's go to
Freepik's piece here. I don't know the best way to get
there other than going to all tools down here at the bottom
Spaces, and then I opened it up in...
The thumbnail today... the cowboy hat, Drew, might make its way into
the thumbnail. Now that I'm thinking about it...
Get get my cowboy boots on man. I've been I've been rocking the
(45:40):
boots lately. It's cowboy boots season, man.
Very impressive. Going back to Freepik.
Hey, loving the templates. Loving the templates.
I haven't clicked into these, but again, same deal, right?
Like we were just talking about the nodes, the workflows, you
know, the blueprints or whatever Krea was calling them.
(46:00):
I mean, you got the same thing here.
So it's like, hey, is there something here that feels maybe
relevant to what you would work on right now?
And then again, like going in here and taking a look at how it
was built. There's a good template 'cause
this is... This is also the differentiator now for, for
(46:20):
Freepik, right? Like you can do a lot more in
here like a canvas style, like it's not just nodes in creating
like this is more like I can, you know, add text and like
design it a little bit. And this can be like a
collaborative space, which is also good from the Freepik side.
Smart feature on their part. Smart design
for them, yeah. I — this is my first time seeing
(46:41):
it. So this is this is cool.
All right? This is Yeah.
This just. I mean, this is really
interesting. OK, so you've got the nodes here
as you zoom in. It's also got some information
here. Like you could.
Share a reference as like a you can share this.
(47:01):
Yeah, this feels like a creative brief.
This is what, I have a feeling, if you're going to try to
imagine what Weavy in Figma looks like, this is probably
going to be how it looks and feels. In a very...
Similar — look at this: 3D model, and connect, boom, boom, boom.
yeah, it's just man, I just findit so fascinating to see how
(47:22):
each of them kind of put their own little little spin on it,
you know super clean. I'm excited to play around with
this. I I think I made a case for this
last week, but I like, I like what Freepik's doing.
I mean, I, I, I really do. I, I think like, you know, the
(47:45):
image generator thing is it's nice, you know, like the you can
go like I'm on a plan. I think I pay 40 a month and I'm
paying month to month. I think I'm on this one.
So it like, obviously it's cheaper if you, if you were to
just do an annual upfront payment, 40 a month and I get
(48:08):
unlimited image generation on Nano Banana, for example, or
yeah, and so, and, and here's the other part like, so that
speeds up the process also when you're using Nano Banana.
So let's go here. Let me just show something
(48:28):
really quickly. But you can also just like go
ahead and click four on demand. So every time I'm like, hey,
make this — you know, whatever instruction I'm prompting, for some iteration
I generate, it's going to generate all four.
OK, which is also nice, right? Because I get unlimited.
So I could just hit that a couple times and it's like, OK,
if I hit if I got 8 iterations of that and it still didn't nail
(48:51):
it, I probably need to tweak my prompt.
I didn't get something right. I need to do something
different, right? Also, I mean, this is we're kind
of getting a little bit tangent here, but like, let's let's
circle back to this Nano Banana 2.
I've been hearing rumblings it's coming this
month. Waiting for it, so I'm very excited.
(49:12):
I'm — whatever, is it just going to be like Seedream? Like, I've
been really liking Seedream for this stuff.
It's like really I think Nano's better at nailing the product
consistency and like object consistency and like person
consistency. I feel like Seedream is better
at like changing angles and likebeing a little bit more dynamic
in terms of like visual aesthetic.
(49:32):
But it also can produce in 4K, which is great.
So a lot of times it's really good.
But yeah, there's there's definitely some differences.
I'm sure Google's right, you know, got something up their
sleeve, as usual. I mean, Nano Banana 2, probably
Veo 4 maybe, maybe end of the year, before end of the year.
(49:55):
I don't know. I'm just guessing.
But like, that would be crazy velocity what Google's doing
right now. The
shift — the speed of that is massive.
If they can release that that quick, I'm sure they've all got it.
I'm sure they've got Nano 2.5 in the pipeline.
There's, you know, Nano 3, Nano 4 —
same way Midjourney's probably working on theirs.
Here you go too, like 9, just different, you know scenes of
(50:19):
the same shoes with the different prompts, right?
So you could, yeah, it's kind ofinteresting connecting those
things. This is like a social media
flow, create social media content, not super complicated,
but these are just three different prompts, right?
Leveraging this is an image reference essentially.
And what is the, what is this? So they've got copywriting...
(50:42):
Copywriting instructions, yeah. Going into an LLM to then write
copy based on the images that are in there.
So it'll probably write you 3 pieces of copy based on the.
Where is that? Where is that input coming?
Where is that input coming from though?
That right there they're they'rejust writing it.
This is the this is the OK, OK, OK, I see now.
(51:04):
So this is yeah, this is what they're pulling in.
It's interesting, but this isn'trun through an LLM though, is
it? Like this.
They could have copied and pasted this in here and then.
That's the that's, that's the nice part is that you can do
stuff like this where it is, it's like a little bit more
living, breathing sort of designspace than just using the AI
(51:26):
tools, right? So here, like for example, this
can stay the same on every single, you know, piece of, you
know, this, this can be like a workflow piece, right, like the
finishing piece where it stays the same.
You just import your, your copy guidelines, you run it to the
LLM. You also run the three images to
the LLM. It'll be like, OK, I'll read the
(51:46):
three images, or look at the three images, and write three pieces
of copy, right? So this is, this is how, like, we
have to start thinking about. And, and these are the pieces of
it, Rory, that you want to spenda little bit extra time on to
make sure you got something that's a banger, right?
Because if you're going to use it time and time again, nothing
(52:07):
against this. This is structured really,
really well, but it's like, I want to give it more than that
for the hook. You know, I, I want to talk a
little bit more about what I'm looking for with the hook, you
know, and, and provide a little bit more like specific
constraint in, in direction, ButI like visually the way that
(52:28):
this is, this is displayed like 'cause in, in again, we're just
comparing, right? Like we're not picking on it.
We love all these tools, but like if you go in here, you
know, like it's just, it's just harder.
This is just harder. It's.
Just harder probably. You know, like.
Going to Figma, it's going to be something more that looks like
(52:50):
like Freepik, I would assume. Yeah, Freepik.
Like I'm just, I'm just. Saying like this, this display
of it is really, really interesting.
This feels like this. This this whole thing feels like
a creative brief the way that it's.
It's laid out which. I think is interesting.
It's smart. Can you go into that LLM box
(53:13):
that over there on this? I just want to see if you can
write the last the last box overthere.
Can you switch the model? What's the?
I'm just interested. In I gotta let me see.
I I think I have to change that.Let me, let me bring this.
OK. Like, can that be Claude instead of ChatGPT, or is it just
GPT that we're that we're using here?
(53:34):
Oh, I do want to do your tutorial, but not right now.
OK, so let's do let me just kindof poke around here.
Prompt result, copy export result, Text settings.
Yeah, see — you can only use GPT-5. OK?
(53:55):
OK, and you can. Dang, that's weird though.
Why don't isn't that? Is that surprising to you?
You know, that's that's also like the benefit of tools like
like Weavy, and, you know, they have all of them.
You can. Use all any of them.
Which is, you know, for copywriting, like Claude to me
(54:17):
is my, is my go-to for, for... And if I want something that
sounds more human — less, you know, 'step into timeless style
and effortless comfort,' which is so ChatGPT that I can't even —
like, yeah, you see it and you're just like, ChatGPT
wrote that. But...
That's, you know, like it's really pretty, let's put it this
(54:37):
way, like it's really well designed.
It's there's a lot of different things that mood board one that
you were looking at. Yeah.
That's all. That's all really, really legit.
Like I like how you can go in there and make it like a design
piece and like share that. And I'm pretty sure it's
collaborative in terms of workspace.
I don't know if it is or not, just a guess, but like I can
(55:00):
share that with a client I don'teven have to go and like
externally design that. Sounds.
Good you know that's pretty pretty done and all that stuff's
being added after the fact. That's not just like all
generated that text is is added individually.
You know some of those those boxes are added individually,
right? So it's, that's... I, I like this. So, Freepik —
(55:24):
Well done on on this one. I'll show you hold on.
You want to see our T-shirts? That we did did last night?
About two seconds, please. Please, guys, I think we're I
think we're getting I think we're getting close to to merch.
I think we're getting close. I mean, I think like we're on to
some things. You know, we've we've got little
(55:45):
cute stickers, cool, but I thinklike the wearables component is,
is really what I have in mind when I'm thinking merch.
And obviously, like we looked atRory's thing last week with Dart
and Scatabo, you know, and just kind of putting that on a shirt,
bringing it into real life, which was which was dope.
(56:07):
And then Rory was messing aroundyesterday with some with some
fast hours shirts. So, so I did this in, like,
two seconds, to respond to Drew's comment on, on X, and I was
like, let's, let's go back to the, to the...
Oh bro, what are the? What is?
Well, this was I was like, let'sget the base of the shirt,
(56:28):
right. This is the other thing that I
want to mention about free book,which I really like is that this
sounds like a little tiny UX thing, but you can, you can
prompt within the node, right? Which is great for just like
doing one thing at a time, or you can prompt outside the node
and bring the text in, right? So it's giving you both.
That's not something that's available on most because.
(56:49):
it's doing that. It's a little thing, but it makes a difference. So this was... let's see, that one... now this one was it. Let's go back one. There we go. So I was like, you know, Drew said, where's our T-shirt, you know, in terms of our design? And I was like, screw it, I'll make it in five seconds.
(57:10):
So he's using Nano here, taking our two pictures, making it like a rap album cover. I should have put parental advisory on here, now that I'm looking at it. I know, drop the T-shirt example from... as pull up. And then we had this. And then Drew responded, hey, can we use our characters, like our caricatures,
(57:32):
and I was like, alright, we'll just run the whole thing over again the same. Hell yeah, look at that dude. Look at the black and whites. Even I hadn't seen... I know, they're cool. Like, I like this. This was... Aww.
Dude, you throw an explicit content parental advisory on there, it's over. I need to... I should throw it on this T-shirt, right? And then we're like, all right, cool. So we can run
(57:54):
this stuff relatively easy, but then I actually had it put it on us. That's so dumb. But this one again, like, I really like some of the things in here, right? Like, you know, again, if I'm using image tools,
(58:15):
right, like I'm able to sort of branch reality here, which I like. Like, you know, I have my standard image. I can, you know, use SeedEdit to get the close-up. We can remove the person and then we can get sort of keyframes, right? So we can always have, like... this is what's easy about doing stuff like this, which is nice, because oftentimes there are too many things that
(58:39):
are missing here. Like, this is a perfect example, right? Taking the image of the car here, but I want the car pulling up, right? So you remove the car with SeedEdit and then have it go in. It's just... using keyframes is so simple in tools like this, instead of downloading both and uploading to another platform. I can also just swap this
(59:00):
if I don't like it, you know, keyframes. Same thing here, like to get the perfect sort of still shot of whatever. Yeah, that's all.
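To make the keyframe trick Rory is describing a bit more concrete, here's a minimal sketch in Python. The two functions are stand-ins, not a real API: they represent whatever edit model (SeedEdit-style) and keyframe-to-video tool you happen to be using inside the node workspace.

```python
# Sketch of the "remove the object, then keyframe it back in" idea.
# Both functions below are hypothetical stand-ins for whatever edit model
# and keyframe-to-video tool you're actually using; swap in the real calls.

def edit_image(image_path: str, instruction: str) -> str:
    """Stand-in: send an image plus an edit instruction to an editing model
    and return the path of the edited result."""
    # Placeholder so the sketch runs end to end.
    return image_path.replace(".png", "_edited.png")

def video_from_keyframes(first_frame: str, last_frame: str, prompt: str) -> str:
    """Stand-in: request a clip that starts on first_frame and ends on last_frame."""
    return f"clip({first_frame} -> {last_frame}): {prompt}"

original = "street_scene_with_car.png"

# 1. Edit the car out to get an "empty street" start frame.
empty_street = edit_image(original, "remove the parked car, keep everything else identical")

# 2. Use the edited frame as the first keyframe and the original as the last,
#    so the clip shows the car pulling up into the shot.
clip = video_from_keyframes(
    first_frame=empty_street,
    last_frame=original,
    prompt="a car pulls up and parks, 1980s film look, atmospheric haze",
)
print(clip)
```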
So hey, I really love the color grading on that classroom school-day one, man. Oh dude, that's all... like, go into this deck, that's what
(59:21):
we've been... but I'm prompting for that, right?
So here's a fun one for everyone. I've been using sketches as storyboard images, like prompting sketches and getting the shading and coloring sort of how I want it in there, and then taking it into Mystic and using structure reference to
(59:42):
then animate it. But I'm also totally working on the color grading, like, consistently.
Hey, I want to go into that for a sec, Rory, because, like, you were telling me about that before we jumped on. I don't know. I mean, I'm sure there's somebody maybe doing that. I just hadn't heard of that. When you're prompting for sketches, are
(01:00:04):
you doing that in... are you using Midjourney for that? I guess that's my first question. No, I'll go to... I'm actually using... I would love for you to just kind of show that process. I think that's an interesting way, because the way that you described it to me was, you know, you were able to get some things maybe mechanically or physically that felt like
(01:00:26):
maybe it would be much harder to do just by prompting for photorealism. So yeah, just maybe talk through that. I'm curious.
Pulling it up now. So you guys, if you guys have seen my Weavy workflows before, you see the cleaned-up version. This is the non-cleaned-up version. This is the chaos, right? This is where I'm doing my workspace in here.
(01:00:49):
So like, for example... wow, this is just... you guys get to see all the chaos. What I'm doing here for these
prompts, right? I'm doing like a Stranger Things aesthetic. So I wanted to sort of get that to come through by doing: storyboard sketch, cinematic composition, Stranger Things aesthetic, 1980s nostalgia film tone, soft graphite shading with selective color wash, dramatic
(01:01:14):
lighting, film grain, whatever. This is what I'm using on every single prompt to create the sketch tone, right? And then it's just basically adding the prompt, like whatever's going on in there, right? So it's sort of like, that's the one. And then I'm running it the exact same, you know, over and over and over again. But it's giving like this very
(01:01:38):
consistent sort of structure and style to the whole thing, which is what I like, because, yeah, if I go through here, you'll see it's all pretty much exactly the same. And the character stays fairly consistent with the sketch. The car stays fairly consistent with the sketch. But like, there's
(01:01:58):
a few in here that I really like in terms of the car.
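As a rough illustration of that "fixed style block on every prompt" approach, here's a minimal sketch in Python; the style wording is paraphrased from what Rory reads out above, and the shot descriptions are made-up examples.

```python
# One constant style block, reused verbatim on every storyboard prompt so all
# the sketches share the same look (paraphrased from the episode).
SKETCH_STYLE = (
    "storyboard sketch, cinematic composition, Stranger Things aesthetic, "
    "1980s nostalgia film tone, soft graphite shading with selective color wash, "
    "dramatic lighting, film grain"
)

# Only the scene description changes from shot to shot (examples invented here).
shots = [
    "a vintage car pulling up to a small-town high school at dusk",
    "close-up of a kid on a bike glancing back over his shoulder",
    "wide shot of an empty classroom, light cutting through the window blinds",
]

# Final prompts: scene first, then the constant style block appended.
prompts = [f"{shot}, {SKETCH_STYLE}" for shot in shots]

for prompt in prompts:
    print(prompt)
```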
I think, if I get through... you know, like that one. The sketch is good because it provides good structure. Like, the light is coming from the right direction. You have sort of... if I was using black and white, I might not get this, you know, the halation and the refraction, right? It might be a reflection of the light on the smoke. So I was thinking about this.
(01:02:21):
I'm like, oh, this provides... instead of me going to actually recreate this image with the same prompts, let me just go and use structure reference with Magnific or Mystic. So here's a few tests that I ran with the other tools, where I'm like, oh, let's use Reve. What I'm doing here is... you know, where's the prompt that I'm using... this one: keep the
(01:02:44):
image construction, the characters. You know, we're going to use a specific camera. We want to color grade it with the contrast, we want the color scheme, we want the lighting. Yeah, this is a little bit more detailed, but in running those, right, I sent it to Reve, which actually didn't do a bad job, right? Like, it went there, but it wasn't visually memorable. That looks way too desaturated for me.
(01:03:07):
SeedEdit or Seedream sort of brought it to life, and I was like, all right, cool. Not exactly photorealistic. Gemini didn't really do it that well. But then I ran it through Mystic, and I was like, that just totally turned it. And these were sort of the parameters I was working on. So this is if you're
(01:03:29):
using Mystic or Magnific specifically on their website: use the reference image of the sketch as a control image or the structure reference. And then using, you know, control image strength, adherence, HDR resolution. These are sort of the settings, if you guys want to screenshot these, but: model realism, Magnific Sharpy,
(01:03:50):
creative detailing around 33. But it was working for just about everything, which I was amped about. This one's a great example. This is one of my favorites, these two together, that sketch to that right there, picking up on the light and picking up on, again,
(01:04:11):
the atmospheric haze and the smoke. And the structure's awesome.
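For anyone who'd rather copy text than screenshot the stream, here's the same setup jotted down as a plain config sketch in Python. The keys just mirror the on-screen controls Rory names (control/structure image, strength, adherence, HDR, engine, creative detailing); they're not a real API schema, and values he didn't specify on air are left as None.

```python
# Rough notes on the sketch-to-realism structure reference setup described above.
# Key names mirror the UI controls mentioned in the episode, not any real API.
structure_reference_setup = {
    "reference_image": "storyboard_sketch.png",  # the sketch is the control / structure image
    "model": "realism",                          # model choice mentioned in the episode
    "engine": "Sharpy",                          # Magnific engine named in the episode
    "creative_detailing": 33,                    # "around 33" per the episode
    # Also worth tweaking; exact values weren't specified on air:
    "control_image_strength": None,
    "adherence": None,
    "hdr": None,
    "resolution": None,
}

# The text prompt then pins down what should carry over from the sketch:
realism_prompt = (
    "keep the image construction and the characters, use a specific camera, "
    "color grade with contrast, keep the color scheme and the lighting"
)

print(structure_reference_setup)
print(realism_prompt)
```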
Yeah. And it's really clean. Actually, I'll show you this. Where am I... this is a good way to visualize how good this structure is, because I wanted to just see... I was like, all right, let's see how it goes when it... oops. You can see it just transform.
(01:04:35):
I'm like, all right, that's really good. From a structure reference standpoint, it's basically nailing it, changing the lighting, right? So I've been looking for something like this. I guess I've just been sort of missing on Mystic for structure reference and things, but yeah dude, there's so much going on.
(01:04:56):
I mean, have you tried that in Midjourney? I mean, I know you can't do it in the node-based, but... I did. It wasn't the same. It wasn't as clean as that. Yeah, it was overly saturated, or overly green, like everything was green. It wasn't sort of...
(01:05:18):
the retexture would change the face, the retexture would change the clothes. So it was kind of like... but yeah. But we just ripped on a bunch of stuff. We did rip. And I know we're tight on time, like we got to wrap this baby up.
But yeah, man, I enjoyed this episode. Hopefully this was valuable for you guys too. I know a lot more people are starting to dip their toe in
(01:05:42):
the water with the node-based stuff. You know, we even got a comment last week on one of the videos asking for it. You know, if this is helpful too, I think there's a lot more we can go into here. And if this continues to kind of be that thing, right, we probably should go much deeper in on this.
(01:06:02):
So let us know if you agree. In the meantime, Rory, I'm hoping maybe we get a little sprinkle of updates from Midjourney between now and next Friday that we can play and experiment with and share with you guys on the next one. So. Totally. Everyone, you know, stay sane with, you know, the introduction of
(01:06:24):
everything that's going to happen over the next two months, who knows? It's just going to be rapid fire. So we'll try to keep it real here. We'll try to put in the hard work for you guys so you don't have to go crazy. We'll try to figure this stuff out for you.
But if you're looking at any specific things like we talked about today especially, drop them in the comments, right? Node-based stuff, questions. If you want more of that, if you don't want more of that
(01:06:45):
Midjourney stuff, whatever your questions are, always leave them on the... if you're still here, if you have questions, just leave it on there. We see it, we respond to it, and we'll answer it in the show too. So, you know, don't be scared.
Also thinking we need a live edition soon, Rory. Maybe we should do that next week. We should do that sometime in November. We should do a live session.
(01:07:06):
I think an AMA would be good. We've got a lot of different requests, a lot of different questions, so maybe that's something we do too. Let's.
All right. Well, hey, good to see you, man.
Good to see you always. I'm excited for next week and
we'll keep things rolling. Thank you guys for joining us.
We'll see you guys in the next one.
See ya.