
October 6, 2025 66 mins

Midjourney Fast Hours — Episode 51: “Sora 2, Veo 3 & The AI Video Gold Rush”

Drew and Rory crawl out of their creative caves after a month-long “break” (read: total burnout) and immediately get smacked in the face by Sora 2, Veo 3, and Midjourney Style Explorer updates that make everything they said last episode obsolete.

They break down what’s actually working in Sora 2 (full edits, audio, dialogue, even IP?), why Veo 3 can’t seem to go fast anymore, and how keyframes are quietly becoming the holy grail of AI video.

Expect rants about IP chaos, Mr. Rogers cameos, and why OpenAI's guerrilla launch strategy might be the smartest marketing move of 2025.

It’s equal parts therapy session, tech panic, and creative caffeine rush — exactly how they like it.

Keywords: Sora 2 update, Google Veo 3 review, Midjourney V8, AI video generation tools, Luma AI, keyframes, AI video editing, AI marketing trends 2025, generative video, Style Explorer Midjourney

---

⏱️ Midjourney Fast Hour

00:00 – Intro + return from creative hibernation

00:42 – Sora 2 drops: “the floodgates just opened”

01:27 – Sora 2’s new tricks: editing, sound, dialogue, and IP

02:17 – Veo 3 vs Sora 2 vs Ray 3

03:20 – Rory’s South Park AI episode + the “bar-only” friend

05:00 – Why dedicated AI-only spaces actually matter

05:48 – Remix culture, Mr. Rogers AI, and the rise of comedic timing

07:00 – The “Remix” button, mass adoption, and kids using Sora

08:43 – The IP gray area: Minecraft meets Mr. Rogers

09:43 – Cameos, access codes, and mobile vs desktop creation

12:25 – Sequential storytelling: AI understands chronology now

13:29 – Toyota ads, guerrilla launches, and OpenAI’s flood strategy

15:00 – IP risk vs reward — how far can brands push it

18:00 – AI performance comparisons

23:00 – Fight scenes, motion control, and why keyframes matter

27:50 – Workflow troubleshooting and micro-decision fatigue

34:50 – Too many tools? Runway, Aleph, and the Weavy advantage

35:45 – Shout-out to Weavy + tool consolidation predictions

36:00 – Higgsfield pivots, Pika memes, and the marketing gap

37:00 – Visual Electric’s acquisition and the coming consolidation wave

38:20 – Midjourney updates: Style Explorer, Smart Search, and new unlocks

40:30 – Playing with EXP mode + hidden color/style refinements

42:30 – Style Finder, Style Creator, and mood-board personalization

43:57 – Style ranking feature + --r 40 nostalgia meltdown

46:00 – Midjourney V8 speculation & dataset rumors

50:30 – Google’s product chaos: Gemini, Nano Banana, Flow, and Veo 3

53:00 – Why Google can fail (and still win)

55:10 – ChatGPT’s image text features & the next AI video wave

59:30 – The Weavy renaissance and workflow automation discussion

1:02:00 – New creative problems worth solving

1:06:00 – Why “easy” AI creation still stings for creatives

1:08:30 – Closing banter + “hit the button” outro


Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
(00:00):
What's up everybody? Midjourney Fast Hours podcast,
episode 51. We're back after a long hiatus.
I think we took maybe four weeks off, something like four
weeks. Something like that.
Sorry guys. Yeah, we're sorry, guys.
We missed you. We missed this.

(00:20):
This is our therapy each week and we've both just been busy
and had things going that just conflicted.
But we're back. We are back.
We have a lot to talk about. Obviously lots changed in a
month. Surprise, surprise.
We've got several new video models.
We've got several new sort of like ways to think about how you

(00:43):
would, you know, get to end results of things.
We've got Midjourney's Style Explorer going further and
further, you know. So I mean, there's a lot that we can cover.
I think you and I were talking about it first.
I think we got to start with Sora 2.
Right, yes, yes, like bombshell again, just, you know, every
week it's something that is, I don't know, landmark, watershed,

(01:06):
whatever the term is you want to call it.
Because this one, it's just, again, like there's
certain, there's certain elements that always feel like
holes in the tools. Like one is editing.
You know that you can't really, like, edit a bunch of
clips together, you know, with one shot?
I mean, you can, but it's never that good.
Yeah. You know, Sora comes out with

(01:27):
this thing and it's just like full edits like and you can also
do sound and dialogue and also IP allegedly.
And apparently you can do whatever IP you want.
So again, it feels like this is just a the next step and now
again, because they put this out, everyone's going to want to
do it. And it's I don't know if you
have you gotten to play with it at least once or twice to see?

(01:48):
Sort of capabilities? I have been watching closely, but I have not
been in the tool yet. I'd say, like, I've saved a few
things that I've really enjoyed, but I think, like, just
in general, again, I've been wrapped up with other
things. So I haven't gone full, full

(02:09):
into Sora yet. I think it's interesting.
I knew Sora 2 was coming this year.
I didn't know when, you know. So that was kind of interesting, to
kind of just see that fly up out of nowhere.
Last week they did take Sora, I think, out of the navigation on
ChatGPT. So it was like there was maybe
some whispers that, OK, here it comes, like maybe they're going

(02:31):
to swap that out. So that was interesting.
And then, you know, like I think Google is gonna have probably
Veo 4 by the end of the year. I would imagine they got
so. Ready.
They're just waiting. There we go.
Here we go, man. Yeah, they're waiting for Q4.
They're waiting for Sora 2 to drop.
Let Sora get their couple weeks and then they'll get their
couple weeks. I mean, it's like very, that is

(02:53):
the one thing about Sora. It's fun, man.
Like I've been playing around with it.
First off, I've been trolling my friends a bunch.
It's epic. Like you can, you can go in
there. Look, we have a, we have a buddy
who, you know, he, we make fun of him because he, he only likes
to drink at bars. He doesn't like to drink at
people's apartments or houses or like in his own home.

(03:13):
He likes to only go to bars and get drunk and then he screams
like a goat. So I made like a like a South
Park episode preview. How old is this boy?
He's he's getting close to 40, but it's, you know, we still got
to let our inner child out. Dude, I don't, I don't
discriminate against anyone's kinks.
They want to party. They want to
party. What's the deal with not drinking in the home?

(03:35):
It's just a, it's like a running joke that it's always just like,
where are we going out? And it's just like, well, you
know, sometimes we don't want to go out.
We just want to hang out with the boys, but he wants to go
out. So but we made this whole, I
made this like episode preview where it's like he's like, I put
him in the episode and he's now Randy Marsh's new drinking
buddy. And it's just like, who is that
guy? I don't know his name is Dweeb,

(03:56):
but he likes to scream like a goat.
It's like it can come up with this.
You can give it a very minimal prompt.
Could be like, you know, South Park episode preview, this
character is introduced and we're making fun of him and
that's it. And then like the whole script,
the whole cuts, the edits, everything.
So it's very attainable, which I think is, you know, if you want

(04:16):
to talk about like AI adoption, I have a feeling this will be
like the best tool for AI adoption since probably
Midjourney, where you can go in with minimal background, with minimal
sort of knowledge, type in anything and get something good.
And then it's like fun. You can share it.
And obviously it lives in its own little social feed that's

(04:37):
there. Which another point I think is
also interesting is that everyone's been like, why do we
need like a dedicated sort of like AI only space, right?
But I think it's good 'cause it's like it doesn't pollute
everything else. Like it'll get shared on all the
other tools. But like when you go there, you
know it's all fake. You know, it's like, it's not
like anyone's trying to break reality, like.

(04:59):
Everything was made. Yeah, yeah.
So it's like a little bit of a sandbox in a sense, right, where
you, you know, the the context of everything that's posted
ahead of time. So then maybe your, your mindset
going into that is a little bit different, right?
Because I think we've probably seen so much, I would imagine
the same way. But you know, half the time I'm
looking at something just as a bystander and the other half I'm

(05:23):
deconstructing in my mind thinking about, oh, they did
this and they went to do that. And because you know, like they
had to do this in order to get that right.
And it's like, OK, would that, you know, is that something that
I would want to do with anything that I'm creating?
You know, like, I think about it from both sides of the equation.
And I think if you were going into that app, right, like I
would be much more focused on the latter of like almost like

(05:44):
learning through other people's creations at the same time.
Yeah, which is, like, to me, if this is how AI is
going to go and it's going to be like totally just type in
whatever you want, to me that's like ideas win again,
because if everyone can do anything, then, you know, whatever's
going to rise to the top is going to be the best idea.

(06:05):
And then, you know, the one thing that's great about it is
because, like, old Sora, like Sora 1, had the remix feature,
which I always thought was like their best feature, which is
basically like the video-to-video stuff.
But you can do that with anyone's post now.
So like I saw one with Mr. Rogers explaining what rizz is

(06:25):
like, it's just,
like, hit the remix button. I'm like, explain what 6-7 is,
that stupid Gen Z term.
And it's just like, you can go and play off what people do.
It feels very much like Midjourney, like the explore page,
how you can take it and go and use
it. Yeah, yeah, that's cool.
Kind of why I like it, but it's definitely going to be.
I've heard a bunch of people say their kids are using it like

(06:47):
they're having fun with it. It gets comedy, which is great.
Like it has like an actual comedic timing and like
understands that stuff, which I don't think most tools do.
So it's a, it's a whole new ballgame.
If that's what the real model is going to be like, if it's going
to be that good, then game on, like again.

(07:07):
Yeah, I mean, game on every week.
Every week. So for for people that haven't
used it yet or are trying to get in, you're saying that in
terms of like just creating things, this is the easiest it's
ever been from a video standpoint.

(07:28):
Easiest. Easiest ever been.
And then with the tech itself and kind of like where we've
come, right? Like this is probably going to
be a real eye-opening moment for a lot of people, especially
maybe people that haven't gone super deep in video yet.
Yeah, and I mean like, I'll, I'll pop it open like this is
the desktop view. But if we go to like the home

(07:50):
page, it feels very much like the explore page on ChatGPT
where you can like go through here, you can look, you know,
everyone's got their profile, right?
They have, you have your likes, but I have to figure out
how to, I don't know if you can hear this, right, the
sound. I think I can, but the

(08:11):
audience won't be able to. All right, we'll leave it.
It's really annoying because anytime you hover over a video
it just starts playing, like. The audio, yeah.
Yeah, so if you see down at the bottom here, I'm not going to
hover over it so it doesn't screw up the audio, but you'll
see there's the likes, the comments.
Then there's the little remix button.
You can you can hit that and remix it, which is just like you
take it and do whatever you want.
Like if I wanted to change this character from Bender to someone

(08:33):
else, like you can just do that. Or you can be like, let's take
the next, the next iteration of this scene and go with that.
But yeah, it's like, to me, and like the IP stuff, I don't
know how they're getting around this.
Right, you can use anything.
Meanwhile, Midjourney's getting sued left and right.

(08:55):
Yeah. So that opens up a whole nother
conversation, right. It's like if they don't care
about it, do we care about it? Like this is at a mass scale
like Mr. Rogers, dude, I love Mr. Rogers too because you know,
dude, like you've got a kid now. But whenever you start

(09:17):
letting your, your child watch, you know, some shows or you're
trying to find TV shows for your child, almost all of it's
garbage. Went down this rabbit hole of Mr. Rogers
with the kids and they loved it. They loved it.
And I was like, gosh, this is the GOAT right here.
This is the GOAT. I mean he was on the air for

(09:38):
decades. I mean, give this man his
flowers, dude. It's awesome.
Like it's so, so fun, because you can also do, like, when you're
in here creating, you can do, you can add an image and start
from an image. You can also do like cameos.
So with cameos like you can record yourself.

(10:00):
It's actually like a really pretty decent one-to-one transfer of
like yourself and your voice into the video.
So you can just do that. But you can also allow
yourself, like, I don't know if they have the settings here.
No, I don't know if they have the settings, but like on the
mobile app you can say like allow my friends to use my Cameo

(10:22):
or allow, like, everyone to use my
Cameo. Which?
like Sam Altman's is public for everyone, so you can just do
whatever you want with Sam Altman.
God, could you imagine? It's just like, yeah, let's do
it. Yeah, so I mean like this stuff
has been pretty fun, like. Super cool.
I created this one with. I'm I'm excited to get into it.

(10:45):
Oh dude. It's like, what about, OK, so, and
then remixing it too. Are they still doing codes to
get access? Is that still happening for
people that haven't been in? I believe that's the case.
So where do where do these people find codes at?
They're just floating around right now.
I'm pretty sure I gave away all of my codes.

(11:08):
Those were where. Let's see where they are.
Are they here? It's so much easier on the
actual mobile app than the desktop app.
There's like way more stuff to do.
I like it more on the mobile app than here.
It's, it's like fun to, what's the best way to
put this, just, like, be out on a walk and have like a random idea

(11:28):
and then hit it. And then you have like a full
video, you know. So, like, I have, you know,
Mario playing And1, you know, street hoops, And1 Mixtape
style. Love it.
Dude. You can do anything, like I'm
making, you can make South Park
episodes. I have myself in a cameo for
an ESPN broadcast, like.
(11:53):
What's the, is the aspect ratio either 16:9 or 9:16?
Those are the two choices, yeah.
know how this IP works. Like there's one down here where
I just said for Lord of the Rings, I'm like, give me Lord of
the Rings, give me the perspective of Frodo for Lord of
the Rings, for, for like all the major events or something like

(12:14):
that. And it was like, how is it?
How is it able to go through like here and generate that?
It's so random. It makes no sense like this one
saying, you know, I was like, give me a scene for Love Island
where they see. What about the?
Wendy. What, what about the sequential,
like the linear flow of this? So this is we're at a level now

(12:36):
with this where, using that Frodo example, like, it's able to
put things in a chronological order on its own.
Yeah. I said, give me Frodo's journey
from his perspective, basically like first person perspective of
Frodo's journey through the entirety of the story.
And it's like actually sequentially, right, Correct.
Which is. I mean, it's obvious, but it's

(12:58):
also like a big deal still because you know, like you think
about it, it's like, OK, well, these are things that you would
probably normally have to manually task yourself out in an
LLM to build out some sequential order or script or whatever,
right? So you knew it was happening at
some point, but to get that just from maybe a simple standpoint,

(13:18):
right where it's baked into the back end to give you something
chronological, linear, cumulative storytelling arc, you
know, that's pretty, that's pretty cool.
It's you can type in something as simple as like, you know,
Toyota ad, it'll create a full ad for you.
Like, I mean, it's, it's, it's scripted, it's edited.

(13:42):
It's sort of like fits their brand vibe.
What's? The deal.
What's the deal with generationsright now?
Do they have a limit on generations?
Do you have to be a certain plan member like they did with Sora
with Pro and Plus and all this bullshit or no?
I'm pretty sure it's 50 generations a day, but right now

(14:02):
I haven't been, there's like no credit system.
So I don't, I'm not exactly sure what it is.
Maybe right now they're just, it's a free-for-all.
They want people to create, they want to flood.
Then they'll maybe put some posts up here shortly.
They they definitely want to flood.
This is definitely an interesting marketing strategy

(14:25):
where, you know, I would say probably one of the better
marketing strategies I've seen in a while, where again, it's
just guerrilla, where it's like, all right, we'll just probably
blast all of this stuff, get people to just use it because
they're seeing the most ridiculous things.
And then like we'll get a bunch of new users who aren't just the
diehards, so then when Sora 2 finally releases the full model,

(14:48):
Oh, I'm interested in that now. I got a question for you.
Yeah, we, we, we were talking about this a little bit before
we hit record too. But like so obviously like from
a social aspect, this is fun. Yes.
From a, from a brand business standpoint, what do you think
the appetite's going to be to use something like this?

(15:08):
Will it, will it be, hey, we want to start doing this right
away? Or is this like another one of
those moments where it's like, well, I, I don't, you know, like
to your point with the IP, it's like, what should we be doing
here? Do we need to like sit back and
wait to, to get some more clarity of the situation and
before we move forward? Because I think like a lot of

(15:31):
these brands are just, they don't really, you know, like this.
We're still so new in this space.
Unless they've got, you know, somebody that is an expert in
this area that is in, you know, their legal department or really
consulting them with how they should act, that is
knowledgeable and well-versed in this space.

(15:52):
It's just kind of like no one really knows what to do.
There, there is no answer. So it's just to me it's all risk
tolerance. Do you want to do it or do you
not want to do it? And the, the thing with open AI
is like I, I'm pretty sure everysingle business that is viable
is using open AI in some way, shape or form.

(16:14):
So it's like, you're already using it, but like, why would
you not use it at this point?
It's like for me, for the use cases for this, obviously, like
if you use it terribly and wrong, you're
gonna get shamed by people online if you're using this
stuff for social media and it sucks.
Like people are gonna be like, this is AI and it sucks.

(16:35):
But if it's AI and it's good and, like, you know it's AI, which
is where I think a lot of stuff is gonna go. Personally,
if I was the head of a big giant billion-dollar brand,
if I was going to use AI, I would do it in the most ridiculous
fashion. So everyone knows that it's AI,
right? We've talked about this.
Dude, this is this is something that we talked about even from

(16:57):
the early days of the video stuff.
It's right. It's like we all want like I
think a lot of a lot of us default to this like realism
standpoint with the video stuff,which is cool.
But like for the longest time that's also been so
unattainable. So it can be seen as a little
bit like misleading or you're trying to maybe trick somebody,

(17:19):
right? Whereas like the opposite side
of that is just like, pick a non-realism style and just exploit
the technology for what it is and embrace flaws, you know, in
sort of like a lighthearted way.
And I think like you can't really lose in that standpoint.
There's less variables, there's less polarization, less

(17:40):
controversy. And honestly, you know, like, to
your point, right, like I I justthink it makes for a much more
enjoyable experience. You don't have somebody in in
their mind is not weighing subconsciously.
Do I like this or not? You know, like, in terms of like
ethically, morally. That's why the talking Yeti was
so big. Because it was obviously AI

(18:02):
'cause it can't, it can't be real, but it was like, it
doesn't matter because it's funny.
Do you know what I mean? It's like, it's that whole sense
of like knowing that it's fake and not trying to like deceive
someone by using AI. It's just like this is AI, but
this is what you can do with AI, which is the funny factor,
which we're finally starting to get some tools that come around to

(18:24):
it. It can be a little bit more
goofy and not so serious, because not every tool has to be a
Hollywood killer. Like just be fun.
Right. That's that's exactly right.
And brands who embrace that, I think who are just doing it and
just like having fun and testing it out and letting their
audience feel through it and not being so serious about it are

(18:45):
going to be way better off than you know.
It's like, it's like the brands that have fun on social media
anyway. Like you look at Liquid Death,
right? Like they don't care.
They just do stuff. They make fun things.
And like people just remember them for being fun.
It's not like we have to be so tightly constricted.
We can only put product photos in there and talk about like
sales and things of that nature. Like no opinions, no anything.

(19:07):
No like, yeah. And then like, no one cares,
because it's stale and
it sucks. So. Yeah, I think, I'm
excited to try it. I'm excited to jump into
Sora 2. I think, man.
Play with it on the run. We're what, the first
week of October? I mean, when do we get Sora

(19:31):
three? You know, I think Sora 3 will be
here early next year. You know, the way that things
are going, Veo 4, we just mentioned, is going to be here this
year. I don't have any inside
knowledge, but I'm just saying that I would be very surprised
if I don't see that before Thanksgiving.
Like what? You know what I mean like.
Yeah. Even that seems like a long

(19:52):
period between now and Thanksgiving, the end of
November. You know, we talked, they
were talking about 7 weeks between now and Sora 2.
I don't think Google's going to let that happen for a
minute. And then we also had.
You know, let's maybe transition to, we also got Ray 3 from
Luma last week, two weeks ago. Speaking to your filmmaker,

(20:14):
stuff, like the examples that I saw were much more sort of
geared toward that side. It's quieted down a little bit.
Luma's sort of an interesting character anyway in in all of
this. I know you've been out, but did
you, did you kind of hear about this whole thing too with like
all their creators? Yeah.

(20:35):
Creators getting kicked out of the creator program, and then
they like brought just like a few back in, and everybody's just
like, you know, what are we doing?
What are we doing? Like, with no notice.
This is how you piss people off right here.
Yeah, they just like kicked people out because they were just like,
you don't have a big enough following, which, look, I get it
from the brand perspective, but they went about it the wrong way.

(20:57):
But like I understand from their side where it's like, OK, if
you're just using the tool and getting free
credits and, like, not doing anything in return, that's,
they're just burning money. It's like, I get it.
I mean that to me signals that the company might be a little
bit, you know, CFO might be getting in there being like
we're burning a lot of money on this stuff and we're not getting

(21:18):
anything from it. So like I see that, but also the
way they went about it, the communication sucks. You
know, if that's the way you were going to do it from the
beginning, you know, you obviously didn't have the
foresight of, what if we run into this problem.
Yeah, yeah, that's interesting. The Ray 3 stuff.
I mean, how much have you played with that?

(21:39):
I played with it a little bit. I don't know that I played with
it at a super deep level to, like, really be like, oh yeah, but I
did get some interesting things. Like, what'd you get?
It wasn't a milestone thing for
me, 'cause I found it to be just.

(22:02):
I've always liked Luma keyframes.
I still think it has like so much morphing and artifacts, but
sometimes you can just hit gold in there, whereas you know some
of the other tools can't do the keyframes as well as Luma can.
But I do like, I do overall like Luma's interface, like in the way
that you're creating and it's like, you know, modify or more
like this or you're just sort oflike having a conversation.

(22:23):
You can flip the tokens, like, it's a very, like, I like the UX/
UI. I don't know if many people do.
I just think it's like very user-friendly when you're in there in
terms of like being creative, but it
is lightweight. I don't
really particularly, like, I think it's not as
straightforward as it should be. Yeah, it's only like if
you've been in there and you like have seen the evolution and

(22:45):
like, if I just opened that up today, I'd have no idea
what is going on in here. Like, what is going on?
This makes no sense. But the one thing I've found, I'll
share this screen here.
This is the one thing I've found, is that it can do
some fight scenes and things like that, which are a little bit
harder to produce. Like having this stuff, you

(23:06):
know, even these towards the end here we get a little bit more
extreme, with like flips and like Crouching Tiger, Hidden Dragon
stuff, where it's like, you know, I haven't been able to do fast
action sort of fight scenes that aren't just jumbled messes.
Like these 3 here are pretty impressive.
This shot, that shot and then this one specifically at the
end, like a little twist, you know, so that's that's fun.

(23:30):
The other thing that's been good is the keyframes, right?
Like this one is a good example where it's like I can you can
build in like the speed ramping and almost the editing into it
where I was like, all right, that's pretty cool like that.
I can actually do that. But it's also, again, about
getting the right keyframes, right.
So I wasn't getting when I was doing this, this image here,

(23:51):
like this monster truck flying over the lip,
into, like, oh, I want it to land, like,
basically skid into my face and then drive away.
It like wasn't doing it. So I had to go here and then use
Nano to get the edits. There's a whole process behind
that edit, which is, you know, a lot

(24:12):
more. Are you, are you prompting,
do you find yourself prompting differently for Ray 3 versus
Veo 3? Yeah, well, Veo 3, a lot of times
there's more to control, like with the sound, if there's
dialogue or whatever. I found that the fast action is
better in Ray 3, in Luma. Now I don't know what's going on
with Veo 3. I'm having trouble getting

(24:34):
anything to move fast anymore. It's all moving slow again.
And like what I mean by that is like fast, like fast moving
things, like cars moving fast, people moving fast.
It looks like everything's back in slow motion again.
I don't know why it's it's very odd to me because it wasn't like
that before, but for this project specifically, like this
entire thing, I was trying to get way more fast shit like I

(24:57):
wanted the car moving fast. I wanted it ripping at, like, high
speeds, and it just wasn't doing it.
So this one was interesting here, this little pop up to a
wheelie, like, that was Ray 3 just like kind of on its own, and I
took some frames out and then redid everything, you know,
sort of with Weavy and things of that nature.
But overall, I thought it was a nice update.

(25:19):
I, I don't know how long it'll be relevant for until someone
else comes out with it, but I still find it weird that
people don't, like, focus on keyframes.
People like keyframes. You know what I mean?
It's like very, very easy for a lot of people to, like,
digest: start here, end here, right, easy to control, because

(25:43):
so one AI video tip, again, if you've been here this
whole time, you know this; if you're a new listener, right?
Like anytime you come off like you have a video, right?
Like for that example that I just showed with the monster
truck flying over the lip, there's no context of what
the side of that car looks like. So if you're just generating as

(26:04):
a start frame, that car lands and you want it to turn, it's
going to be a totally different side of the car.
The wheels are going to look different, the paint job's going
to be different because it's just making it up.
It has no context. So key frames allow you to like
control that last frame where it's like.
Here's how the whole thing should look so you can keep
style consistency across it. So I think that's the next real

(26:25):
big unlock. Maybe maybe in combination, it's
right up there with with to me like the audio and inflection
and tone being correct, but the consistency with minimal
oversight from scene to scene, you know, or the progression of

(26:47):
a scene, right? Because to your point, like
there are so many of these micro-moments that come up where
you're like, oh, did I give enough detail?
Did I get enough? You know, did I provide enough
detail, either in the text prompt or the keyframes, or even just
going from video to video, to eliminate the possibility of

(27:07):
things morphing, fingers changing, or losing the consistency in
certain spots. Because even that one
little thing, right, Especially if you're talking about using it
from any professional or external standpoint outside of
just social, it's just like, well, you need to have that on point.
And that still feels a little bit painful of a process.
And that'll be one of those, I think, signature moments when

(27:28):
that starts to take place. Because, you know, it's like,
dude, we go down these, I'll go down and I'm like, I've got a
couple clips, this, that and the other, I go back,
I'm like, oh fuck, I just noticed something I didn't
notice the first time around. Do I really want to go back and
create? Fuck no, sure don't, because,
you know, it's just for fun.
I really don't want to spend another half hour doing that.

(27:50):
And I got to go pick up my kids from school.
You know what I mean? I would love for when that
happens. That to me is going to be
huge. I think it's, I think it's
probably gonna be relatively soon.
I can see, like, a world in which, you know, you upload
a character or car or object and it, like, automatically turns

(28:11):
it into a 3D object and then that gets placed.
Yeah. And then you like approve it or
something like that where it's like, OK, well, that's now
locked, right? Because what Sora
can do on this super light version, like, it's
been pretty damn good from that standpoint.

(28:33):
But I think it's like it feels like really these are the things
like that you don't, you don't get, people.
There's so many micro, again, micro-decisions on the video
side, because everything is so much more complex than images.
Because, you know, with images, it's like, if I want to show the
character, I transfer the character. Now it's easy,
now it's done in two seconds. But the video generators still need

(28:56):
to, like, create a bunch of stuff.
There's no, like, locking a character or a product or an environment;
it'll change, because it's making stuff up.
So you need those little micro decisions in there to
get it. And if I showed you how it
actually went, for anyone who's, like, a designer or has

(29:17):
like, done this stuff before: for that specific
shot, I was troubleshooting a lot on it.
Actually, maybe I'll just show you the workflow in
Weavy, because I have it just sort of where the mind was
going. I think it's useful just to
see this stuff, because this is the stuff you
encounter all the time with AI video. At least I do, trying to

(29:41):
get this last portion over here. Where is it?
So where are we? All right, so I had this shot,
right, and I'm like... I wanted this thing to... first off, the dirt
jump is over here to the left. Don't want that. That's
not where it was when we went over the whip, right? So I

(30:02):
wanted it to be in the middle, so I had to center it
in the middle, right? Then it's like, oh well, the car
is different now. That's not the car that we've
been using in every single shot. So it's like, how do we
go back and sort of edit that, right?
So then we're using, you know, the painter here to sort of edit
this, take the look of the car and then apply it.
And it's like, OK, it's not one to one, but close enough, right?

(30:25):
Then from here it's like, OK, well, this isn't going to give
me exactly what I want. I need a bigger field of view, because I
ran it in Luma and it didn't get me what I wanted.
So it's like, all right, let's zoom out, right? Zoom out, give
it a little bit wider depth of field.
And then I was like, you know what, this isn't working.
Like, I can't get the car to land, skid into the camera and
then drive away. So I was like, let's remove the

(30:46):
car. Let's get rid of it totally, and
then let's add it back in as skidding in front of the camera.
So, adding this car back in, saying... where's
the prompt here? Place the car in the second
image drifting towards the camera, wheels spinning, dirt
flying, rear quarter panel visible, right?
So it's like, OK, that's good.
That one didn't work. So I was like, damn it, let's

(31:10):
let's zoom in, right? Let's zoom in on the tire.
So that's the keyframe, right? And then that ends up being
how I get the shot that goes from this, to here, to this.
So it's like keyframe 1, keyframe 2.
And I was like, all right, this is where it ended.
Then it was extend, to add the next keyframe.
So it's not as straightforward as it seems all

(31:32):
the time. And people...
I think that's sort of the thing with AI video, where there's
got to be processes and things that you do in prompts.
It's like... it's tinkering a lot of the
time to get. There, yeah.
And then you've got the extra steps.
This reminds me of that moment,
early on in the Midjourney days, where, you know, you could do
stuff, but we had limited time and limited generations,

(31:55):
where it's like, you've got to go through five extra
steps that probably you aren't going to have to go through in
six months to a year, but you have to go through them right
now, you know, and I think part of it's good, like to experience
the pain of it 'cause you're just like, you have to think
critically. You know, it is a little bit of
a strain on the brain to like, think about some of these things
just like, oh shit, like, how would I... you know. So I think

(32:16):
that part of it's good. But just from an
efficiency standpoint, right? Like, scaling things out, doing
things faster, worrying less that every, you know, little thing
you create is now a problem area that you have to
double-check. And, you know, I think, man, that
will be huge. But the other thing with that shot,

(32:37):
I just feel like it was a great exercise in troubleshooting: when
the car would turn, it would never be
consistent in terms of the paint job and the colors.
It would always change, like, the rims; it would always change,
like, everything. So I changed the prompt to be
like, focus in on the axle hitting the ground, like, in
slow motion, so that it didn't look at the car.
You know what I mean? It's just like little

(32:58):
filmmaking sort of stuff, and it's there so you don't notice
it, so that when it jumps to the keyframe of the tire, it's like,
oh, it's the same car. Obviously it's little
things, you know. There's no way to make that a process;
that's just standard sort of troubleshooting, I guess, in
that sense. But I don't know if

(33:19):
we're going to have to troubleshoot anymore.
Like, I don't know. I feel like it's all going to change in
the next three months, and we've got a bunch of other tools that are
probably going to come out in succession.
I'm excited to see what ChatGPT does with their next image model,
because the last one was really damn good.
It's just slow. It's just.

(33:41):
Slow. I mean, we're not...
no one's going to wait two-plus minutes for a single
image, and certainly not for the soft, warm,
green and yellow thing that you just, you know, can't avoid.
So yeah, me too.
I think that's coming too. I think that's coming soon.

(34:03):
I would think that's coming before the end of the year.
That's coming probably in the next month or two.
Because they weren't that far behind when Gemini 2.0
released. That was back in March... it
was March, beginning of April, because I was in Belgium.
I remember that. And now I'm going back this
week. But it was that, when I
was there, and I was like, what is going on here?

(34:24):
And then they were, like, right after.
So it's like, you see the timelines start to line
up, right? Like, it's funny how this works.
Like, Runway Gen-5 will probably be out pretty soon, just
guessing, because of the way that Luma's always first
in that pack race, right?
It's like Luma goes first on the dev cycle, and then Runway's

(34:47):
typically, like, two weeks after. Then it's like, you know,
Runway's been quiet for a... While... Runway has been quiet for
a while. Which... I think I haven't even played with Aleph that
much. Is it Aleph?
Aleph? It's useful.
I mean, I think... that's

(35:09):
where there's just too many tools.
There are too many tools right now.
It's hard to be, it's hard to find the time to be really great
at all of them, you know, And I don't think you need to be.
I think we're getting to an areawhere those things are going to
start to collide very soon. They still do kind of all have

(35:29):
their own areas where they're good.
But I think... there's still a lot more of
these sorts of tools out there than there needs to be.
And they're not all going to survive so.
Well, you start to see it. You start to see the companies
that it feels like are falling behind.
Where's Pika? Where's Pika?
I don't know. It's just, whatever. I mean,

(35:51):
you see some of the tools, like... like, I thought Higgsfield was
falling behind with their video model, and then they made a pivot.
They were just like, screw it, we're going to aggregate.
And then Higgsfield has started to... because they can market, right?
Like, Higgsfield can market, and like, they.
Are kind of the bad boy. They're kind of the bad boy of
the group. It's smart. They pay.
I saw they did a paid sponsored post on Worldstar the other day,

(36:13):
and it was like, Higgsfield acquires model and turns off the
censors, like, oops. And I saw it on
Worldstar, that's sponsored. I was like, all right, this is
some guerrilla stuff. Now you're sponsoring posts on
Worldstar. No other AI tool is doing that.
Yeah, they're definitely... they've got their
own brand identity, for sure. Yeah, and it's actually like, I

(36:36):
don't know if you've played around with Soul at all.
It's pretty good. Like, I like it.
I don't like their video models, but I like Higgsfield's Soul
image model. It's relatively creative.
It has good color, it's nice, and, you know, it's got really
good texture in there. It's like Midjourney.
And then there's, like, Reve in Higgsfield, like, or Rev or

(36:57):
Reve, whatever you want to call it.
I used to really like... well, Visual Electric.
Visual Electric just got bought. By who?
Perplexity. Oh, interesting play, Perplexity.
Yeah, yeah, that just happened a couple days ago.
This should be happening, right? This acquisition stuff feels

(37:18):
like it should have been happening.
That's what I was... I think it's
just like, this is gonna keep happening. And Visual
Electric was always good. I always loved Visual Electric.
I thought they made it really easy.
It was a very lightweight tool. I didn't use it a ton, but
it was also a great place to go for inspiration.
Like, you saw really high quality images be created there.

(37:40):
The prompt coherence was a little bit different than
something like a Midjourney, but the quality, I thought, was
always right up there with Midjourney.
Not as creative as Midjourney, but yeah, quality-wise. Well,
that's what we're missing with a lot of this stuff.
That's why this sort of thing feels so much like Midjourney
to me, where it's like: put something in, get something good
out. You don't have to be a technical

(38:01):
specialist to get something, which, that was what always made
Midjourney good: the ease of use and the quality of output
for whatever skill level you are. And it made it fun.
Wait, did Midjourney say they were doing a social app
a while ago, or thinking about doing social profiles? Am I...
They weren't talking about an

(38:23):
app, but they were talking about profiles.
You know, maybe this is a good time to kind of switch over to
them. We could kind of focus on them
really quick. So they did...
I mean, there's been nothing major since Style
Explorer, which was, coincidentally enough, the last
time that we talked. That was episode 50.
We had Alar on. We went deep.

(38:44):
We talked about some things that were kind of what they led
with last Office Hours. I think I joined a few minutes
late, but I caught most of this Style Explorer stuff.
I mean, there, there are some things that have changed since
they launched it. So they're adding more and more
styles. You know, we obviously had that

(39:05):
one nice little unlock with the hidden search that, you know,
the smart search thing, which people love.
But now, sort of, I think the unlock here, in case you're not
aware of it, is now really that you can refine this here.
So, like, I could just type in the color red, and now I'm going to
get a bunch of red style refs. I could take it further.

(39:28):
I could do red illustration, right?
And now I'm going to get red illustration.
So now this becomes much more interactive, right?
You're not so reliant on, oh, what do I see here in,
you know, the Explore page? Do I like anything? Which is
cool. Like, I don't think there would
have been anything wrong with that.
But now you can really be more proactive with just finding

(39:49):
exactly what you're looking for. So I mean, the smart search
thing is cool. I think, again, from, like, a
different exploration style: you like a certain look, feel,
generalization, maybe you don't even know how to
describe it, kind of way. Whereas this gives you sort of
a little bit more of that juice to just, like, go in and
say, OK, I just want something. Let's start with, like, just
orange, right? And then, you know, blah, blah,

(40:12):
blah. So I...
Like that one right at the top, right?
That was like a nice... soft, weird, you
know, bumpy texture? Let's say... do I have...
I don't really want to create that with that, but let's do...
give me something. Period.
Have you ever tried that? Yeah, but it's been a long time.

(40:34):
It just goes. It's nice.
Let's run...
Let's run a bunch of Exp on this too.
Just curious to see what pops up if we do that.
Also, they did mention somebody asked if Exp is going to be on
V8, and David was unsure. He wasn't...
I'm not... I don't know, maybe.

(40:56):
Probably not a lot of people using it. Like, Exp is the juice.
That's literally... I use Exp all the time.
We talked about this on the episode with Marco.
Like, I think that was episode 49.
It was just like, just a little bit and you get completely
different results. Like, almost every time, a little bit of Exp

(41:17):
beats none, you know? Yeah, yeah, these are
really cool. Like the textures?
Textures, awesome. Like, this would be a stacking
piece for me. Like, I would use this one
as a stack with other things too, because it
obviously looks like it has a very close-up macro texture,
bumpy, like, smooth at the same time, like porcelain almost.

(41:39):
That's cool. Yeah.
And then, you know, same, right?
So again, if you don't know how to
explain that, and maybe you don't want just orange right now, you
kind of, again, get tossed into a category of other things that
you might like here, like super textural, focused, tactile.

(42:01):
Yeah. This is like the best...
This is like the best thing that's happened to the
Explore page in a long time, yeah.
I mean, this is awesome, man. I don't know that
there's anything else you could ask for in terms of the
Explore page, you know.
I don't know about the profile thing, though,
going back to that, that you mentioned. I don't see that...
I don't think I saw that. OK, there, it does say it, but no

(42:25):
update on timeline. I
wonder how that...
Possibility of an update in V7.
That's personalization. They did talk about
personalization for more nuanced control.
OK, that'd be great. I'm tired of my personalization.
We all know that. Yeah, I mean,
there's not a lot outside of the styles.

(42:47):
They did mention Style Finder in progress.
So, you know, this part... I think Style Creator is maybe what I'm
thinking about. So you can create your own new
styles. I like that.
And this is not out yet, but this is something that
they were talking about. I could see that one being a

(43:10):
real fun addition. Like, playing... I don't know, David
hates dropdowns, so I don't know how they'd do it. But maybe
it's just like, if you create an image, just saying, like,
create a style code out of it. That basically feels
like how they'd do something like that, maybe. Yeah.
Yeah, I feel like maybe in the back end there's
some connection to mood boards too, in a sense.

(43:32):
Yeah. Something... I still love mood boards.
I use them all the time. It's like, that's my
style. See, I haven't been using mood
boards lately, but... I actually went on to rank
images on my personalization code the other day, which I
haven't done... look at you... in a long time.
I just, like, wasn't vibing with it as of recent.

(43:53):
I was like, you know what? This needs to change.
Oh, you know what they were doing, though? They did have a
style ranking thing going on too.
Let me dive into this. So, like a ranking party kind of
thing. Yes, here it is.

(44:16):
We're doing a ranking party for styles.
Tell us which one you like better.
I haven't opened this up. OK, so then yeah, it's just a
ranking system. Which one do I like better?
Don't really care for that one.
Interesting. If you guys like it... I didn't
save it. If you want to just
like a bunch of styles.

(44:36):
Also... I'm glad that you noticed that they
did mention that too. And I think that's extremely
helpful. You know, like, that's smart.
But this is interesting, because also, I don't
need 5 billion style codes when 3 billion of them suck, you know
what I mean? So I think, if this helps

(44:59):
curate, you know, and we just get to better and better styles
too. That's amazing.
I don't have to run --repeat 40. God, man, the good old
days of --sref random with --repeat 40. Oh boy.
Please. I mean, that was just like... and then, there's
nothing worse than when I do it and I pushed, like, up and put

(45:21):
--repeat 40, and then it would just give me, like, the same style code for
40 runs, because with random it's like...
Oh no. Yeah, did that so many.
times. Oh my God, that's the worst.
Trying to frantically cancel all of
them.
Cancel. Yeah... I like when you...
I like when you cancel some and then they still generate anyway.
(45:43):
Yeah. Thanks.
It's just screwed, dude. Yeah.
It's like, no, you didn't cancel it before 50%,
so we're going to go ahead and follow through.
This is... I'm curious, did they
say anything in the Office Hours about V8? Like, anything about
what they were focusing on? I know they've just been kind of
teasing it, but... They haven't been saying. That's

(46:05):
a great question. I don't think I caught
anything other than just the point that they were talking
about whether this is something we'd see in 2025, which...
basically, David was saying, yeah, we're aiming for 2025.
Obviously, if I had to predict, you know, this is going
to bump up against the holidays. We're going to run into a,
you know, similar situation where we're like, you know, do
we push this before the end of the year?

(46:26):
Do we go into next year? So they did say something like:
expected in '25, possibly next year.
Beyond that, though, they just said this, right?
Like, there's nothing really new here.
Better quality, better coherence, better personalization.
I mean, that's kind of the same stuff they've been saying.
I'll go back to, you know, some of the things that they've said

(46:47):
in the past, and you and I have talked about, which is: they were
really hyped, really hyped about V8.
You know, in terms of their initial things that they were
scoping out for it, data sets or whatever they do over there
to get a grasp on the potential of it.
They're really excited about it. I would imagine they've got to get

(47:10):
this consistent character/object reference thing down.
Not even just better. It needs to be down by the time
that they launch this, because somebody else is already
going to do it. I mean, we're already really,
really close. Like, I would say
there's still a little bit of room there.
It's not, it's not perfect. It's not perfect anywhere yet,

(47:34):
but there's your opportunity.
It needs to at least be at that level.
Otherwise, why do I even... Well, I'm not going to use it
there. I mean, you have Nano Banana
that can do it, you have SeedEdit that can do it,
you have Runway References, you have Reve edit now that can do it,
you have ChatGPT image that can do it,
you have Flux Kontext that can do it.

(47:54):
It's like... I know. It's table stakes, table stakes,
table stakes. It's like, if any of those
companies can just do it like this, they can
obviously do it. But I don't know why they're just not
prioritizing it. It doesn't seem like they are.
But the data set thing is interesting, because I'm pretty
sure, if we go back in time, I think David said that the V7

(48:17):
was still trained on the V6 dataset.
So if they're overhauling that, that's probably pretty
important. I don't know if there was
anything with Meta there. That could be...
I don't know if there's anything that was in that Meta
partnership sort of announcement that said there is something
about, you know, data sharing or things like that. 'Cause if...
They got... yeah. If they had access to Meta

(48:38):
stuff. That's some of that...
I don't think any of that's been, like, publicly
released. Like, we don't even know.
No, we don't know. It's just a guess. If I
was trying to get a data set, that's probably one I'd
want. Do you remember when
Meta came out with that first image model?
Yeah. Do you remember that?
It's like a fart in the wind. It was here and gone, you know.

(48:59):
It was here today, gone tomorrow.
You know, it's
crazy. Meta lets you use their
stuff for free and, like, people don't use it.
Yeah, I mean, it's like, generate images in Instagram for free.
You can generate on the Meta AI app for free.
And it just doesn't get any hype, which is
so weird, when we're all clamoring over to pay for
things. That's why I think that was

(49:19):
a really interesting thing to see come together. You
know, Facebook and Meta... God, I want to call it
Facebook, but Meta, from that side. You know, I
feel like there's just not a lot of great
brand equity, you know, when it comes to these new things. Like,

(49:43):
even just with Threads, right? Like, that was a real sort of
thing where you're just like, OK, this feels like the
first big social app in a long time, right?
And then it was just gone, you know. It was like a
week went by and it was just over, you know.
And so, Midjourney does have this really, I would say,

(50:04):
loyal, for the most part, base and good brand equity, for the
most part, with the people that are in there every day.
So, you know, I'm interested to see what they do
with that. Yeah.
No, a number of ways. I mean, I think, if Midjourney
wants to keep up with anything, at a certain point,

(50:24):
I think it's valid to start having the conversation:
it's going to be really hard for private models to
keep up. You can't match the data sets of
anyone else. You just can't. So I don't know.
You know what, that did come up, though, in the Office Hours. I
did hear something about David saying the data set, you know,

(50:45):
from his perspective, wasn't good.
Yeah. And so he's really excited, you
know. And I hadn't really heard him say that before, so I
don't know, it was just kind of one of those things that
made my ears perk up. It's just like, how are you
gonna compete?
Like, how does anyone compete with Google?
You just think about it.
It's like, they have access to so much, and they're

(51:10):
just... they can just build in a very methodical way.
And Google can fail. Like, a product sucks?
Whatever, shelve it. It doesn't matter.
Drop in the bucket, like, on to the next one.
You know, tools like, I feel like, Midjourney and Runway
can't fail, because, you know, it's like their whole
thing. Google needs a place, though.

(51:31):
They need a. They need a home.
They need a product. Renaming they.
Have so many. Strategy like they're it just
like I can use, you know, nano banana in a million places, but
why am I not using it on Google directly?
You know, and I mean, maybe they don't care about that

(51:52):
so much. But I would really kind of
care about that. I would also want to know why a
lot of people don't even know Nano Banana is Gemini, you know,
Google Gemini 2.5, right? It's like, why are those two
different things? You know what I mean?
Like, why do we have two names for the same thing?
And then, you know, you got Flow. And then what is Flow, and Whisk?

(52:13):
And you know, we've got Veo 3, you know, like all these things.
Imagen, you know... OK, we got an image generator, but we got
Nano Banana. Like, what are we?
What are we? Doing.
What are we doing? You know what I mean?
It's just that part of it, to me, is a problem for them.
But you're right. Like I, you know, we were here a
year ago and we were just like, yo, what is Google doing?
Yeah, that was like a year ago. And they like, there was like

(52:35):
really nothing. I think they were
still doing Bard at that time, you know what I mean?
Like, there was nothing. They were like an
alligator, like, sitting underneath the water, you know,
just waiting. Like, what is everyone...
We'll develop, we'll, you know, we'll wait.
And then... I don't even know if there was a Veo 1.
It just was Veo 2. Right.
Yeah, yeah. I don't remember if there was or

(52:57):
not, but you know, there was... They waited, they waited, they
waited, they waited. Then they released, and it was a
banger, you know. Then they just sit there and wait and
survey. And then it was, you know, Image
FX, or whatever the hell the platform was, and they switched
it. Like, now it's Flow.
It's like, OK, what happened to ImageFX?
I have a question. I have a question for you here.

(53:17):
So, kind of going back to this Office Hours... I'm
reading this last point here: V2 video model planned post-V8, which
they've talked about. Let's visit this for a second,
knowing what we know right now. I mean, things can change.
You know, if the earliest we would see this is three

(53:40):
months from now, the earliest... But that said, what would you
hope this brings to the table for a V2 video model inside of
Midjourney? Is there a certain other tool
that you hope it leans more toward being like, in certain ways?

(54:01):
Are there certain features you would hope to have?
Like, what would make you run the hell
out of Midjourney's V2 video model?
What would they need to have in order for you to use that as
your primary video... you know, your daily driver with
video? Unpopular opinion, but I kind of

(54:26):
want it to be like... I'm asking your opinion, so
it doesn't matter. I would want it to be like Sora,
because for Midjourney, a lot of things, I just want
to do it for fun. Like, I just want it to be fun,
and I want it to be, like, entertaining.
And, you know, there's so many professional uses for this
stuff. But for this one
specifically, I just sort of want to. And I think their

(54:49):
idea of what this is going to be is going to change. Like, whatever
they think it is right now, wait until these next couple tools
come out, and they're going to be like, why even waste our time
with doing some of this stuff? Yeah, I wonder what they're...
What do you think? How...
How do you think they ideate what this video model's going to
be? I think they're just trying

(55:10):
to... Like, are they... How would you... If you and
me were in charge of Midjourney and we're like, OK, we need to
do a V2 video model. Yeah.
What would we do to feel like we're headed in the right
direction there? Like, are we just purely going
off of things that we're seeing and hearing?

(55:31):
I feel like Midjourney hasn't done anything
survey-related with video, you know. Like, where are you getting
this? And I would assume you're
probably aggregating data and research somewhere. But, I
don't know, are there breadcrumbs and signals to see where

(55:52):
things are going, besides just gut and intuition and things
that we're seeing? Like, do they have access to
know what Luma's planning on doing next?
Or, you know, what Google's planning on doing? You know, like, I
just don't know what happens behind the scenes,
and I'm just curious about that, like how they go about that.
Like, to your point, that thing could be very fluid.

(56:14):
So it's like, you wouldn't necessarily want to spend all
day in a boardroom, hypothetically, on a whiteboard,
and then come to a conclusion on something, and then a month later
that's going to change. That's why it's like, when
ChatGPT had text, it was like, we didn't know we wanted text until
we had it. Do you know what I mean?

(56:34):
To me, I don't need, like, a ton of stuff from Midjourney
currently. But I also think, what is their
long-term vision? And then we can start to sort of
reverse engineer that. I think they want to be, like,
immersive 3D or 4D worlds, right?
So you're going to have to have all this stuff.
You're going to have character consistency, dialogue, sound, like

(56:57):
ambient sound at least, at a minimum. You're going to have to
have, you know, interactions, changing camera controls around
within it. So image, you know, video
editing, things like that. Like, I feel like every step is
building towards having this immersive world.
So it's like, what do you need? Yeah, that's a good point,
reverse engineering it. I also... that's maybe where
this comes into play with these things that are wild cards,

(57:20):
which would be the secret projects, because this feels very
far away. But I think the opportunity is
to engage, you know, obviously audio, but to engage
more senses: touch, smell. Those things aren't possible

(57:41):
anywhere right now, but they will be at some point. When
you're talking about, like, an immersive world, something you
can step into and feel like you're a part of, you would need
tactile touch. You would need to be able to
smell something for things to be immersive.
And I think that would be interesting. That's

(58:01):
a whole nother thing, right? Like, when we're talking about
brands doing something different... like, could you
imagine being able to smell something that you've never
tasted before, and decide whether or not you want to buy it?
You know what I mean? It's just super
interesting, man. I wonder if there's a part
of that, because they were talking about wearables.
They're talking about, you know... but we don't know anything

(58:22):
else, you know. Is it something like that?
Like, imagine a hotel doing, like, a good marketing...
you know, a good marketing play. It's like, oh,
I'm going to put on my VR headset and go sit on the front,
you know, the front patio of some hotel in, you know,
Bora Bora or whatever, and just immersively live
in it for a minute. I'd be like, yeah, I want to go

(58:45):
there. It's like, yeah, feeling
everything, smelling it. Oh God, could you imagine how
that's going to change the travel industry?
Yeah. It's a marketing play.
See if you could put yourself at the poolside of this
hotel that you're thinking about, versus the poolside at somewhere
else you're thinking about staying.
Yeah. What might make things in real life

(59:06):
better might be like, oh, we can't hide this stuff anymore.
It's real now. People can see it.
It's not like a surprise.
Will that take away from it, though?
Like, I'm always... I don't
know. Yeah, I like spontaneity
to a degree. Maybe we're opening
something big here, but does spontaneity become far more

(59:27):
rare? And I think that is something
that I've not heard anybody talk about, but I think is absolutely
up for grabs, in terms of stealing that away from us.
If everything can be predictable to some degree, or you've got
now this ability to immerse yourself in situations you never
would have been able to before. Yeah, significantly.

(59:49):
And you're right. Like, what if you like
spontaneity? What if you like impulsiveness?
What if you like unpredictability, not knowing
it's coming for you? That would suck, actually, in
some ways. Like, when I travel, I like to
just, like, put me out on the street and figure it out.
I don't know. I don't want to, like, find... to
some degree, right? Like, there's this element of
just being in the moment, right? Like, going back to that sort of

(01:00:11):
like being in the present. How are you in the present if
you know everything, or you have a good enough idea how things
are going to take shape? Yeah, I mean, you always feel
better, like, especially when you're doing something like
that, right? When it's like, if you had a
schedule and I was like, we have to be here, and then we're going
to go here for lunch, then after lunch we're going to
do this and then that... Then it's like, if you don't

(01:00:32):
adhere to the schedule, you feel like you're.
Going to be pissed. You're going to be pissed.
Yeah, but if you go and find, like, I'm walking on the street and I find a little bakery that's got some fire-looking croissants, and I go in there, I'll be like, I'll remember that place. That was awesome, I felt really good about that. There's a difference
between you bookmarking 20 places that you might want to
see versus having six places that you have to see on Tuesday

(01:00:53):
before 3:00 PM. And if you don't do it, you're
going to be crushed. You know, like.
And then the rush takes over, and then I'm just at work, now I'm just on a clock, like, why?
But I hope it doesn't get like this. I hope it's not... so that's the one thing, it's like, someone said this in the group chat the

(01:01:14):
other day that it's like I miss when AI video was hard and it
was like, you know what? I do miss when, like, Midjourney was a little bit harder to crack. It felt like a little bit... Especially realism, dude,
because it was like you were, you know, like remember,
remember when we were talking about camera angles and, yeah,
photography types, It was like, no one's, no one's doing this,

(01:01:35):
you know, but this is how you, this is how you get the realism
results right now. You know, like you had to, you
had to do it. So it was like this little, you
know, sort of like secret nugget.
I still, yeah, it feels very nostalgic now.
It does, and that's what I'm hoping doesn't happen, to where, like, the craft is so minimized that it's not

(01:01:56):
fun anymore. Yeah, I do, because, you know, I haven't really seen it yet at the level it will be. But it's like, you go on, I don't know, something like LinkedIn, you'll see somebody that is not an AI image creator. And now they have the ability to create something in ChatGPT and

(01:02:18):
they're posting AI image style stuff.
And it's just like, but you didn't go through crawling
through the mud and climbing the walls and, like, you know,
rounding the blocks five times, turning back around and like
having to cut through an alley. Like you didn't do all that.
And now you can, you know, and you don't have to do it.
And you know, the ChatGPT stuff, I think, like, obviously, again,

(01:02:40):
like current example, most of the people that do it well are
not those people. But I think by this next real
flip of the switch, right, whether that's the next ChatGPT
image model, it will be. And yeah, that part of it kind of stings a little bit, you know?
Yeah, it's like, if you're playing basketball and you make every shot, is it fun anymore if you don't

(01:03:02):
miss? You know, like it's like, I
don't, I don't know, I don't want to go fishing and catch a
fish on every cast. That's boring.
It's just no outcome. You don't want
to play Madden on rookie mode when you're a good player.
Yeah, I could run for 400 yards and pad my stats.
I I think by the time I got to the third or fourth game in my

(01:03:23):
season, I'd be pretty like, what am I doing?
You know what I mean? You need some adversity.
You need some challenge. Yeah, I'm not
hoping for things to just stay the way they are.
They're not going to stay the way they are.
I just hope there's something like new that's still like the,
yeah, the horizon to unlock where it's like, oh, what else

(01:03:44):
do I have to go figure out? Like the Weavy stuff was, like,
perfect. You know, it's like, oh, oh, I
like this now because this is like a whole new set of problems
that I can try to figure out. Like how do we do batch
processing? I think maybe we should do an episode. We were talking about that, an episode focused more on that workflow process style for

(01:04:05):
folks. People love it.
Weavy's been a great tool for you.
You sort of like really kind of introduced me to it and then
I've had fun with it. I do think like the scalability
and repeatability of that kind of stuff is going to be really,
really important here in the short term until there's a
better way, you know, but at least for the foreseeable

(01:04:27):
future, that's the better way. It's great to just see
everything. That's that's really why I like
it's all in one. I just like looking at it like
where instead of like going between tabs and like going to
my downloads folder and like it's like it's all there.
You know, you guys should really see what one of these episode thumbnail workspaces looks like.

(01:04:48):
Yeah, it's chaos. Yeah, baby. There's no organization.
It's just pure like, wow, what is this brain on right now?
But. I hope you come up with some
Jabba the Hutt Princess Leia stuff for this next cover.
Might have to. We've been going. All right, man. Well, what do you think?
Wrap it up. Wrap it up.

(01:05:09):
Feel pretty good. I think, you know, it's been a few weeks, so we wanted to kind of touch on a few different areas.
I think a lot's changed. A lot hasn't changed.
I think even next week we'll see something. Something else is going to be unlocked
here in the next week. I think too, when people are
going further into Sora, for example, we're also going to see

(01:05:31):
other things, right? Like there's that moment where
things get released, and then there's also that moment where
you start to see all these different use cases or tweaks or
things that maybe you didn't think of the first time
around, Right. That, I think is like the second
wave of excitement that I always look forward to, too.
So we'll touch on that. And man, it's good to see you.
It's been too long, dude. So we'll get back.

(01:05:53):
Yeah, we'll get back to it. We'll get back.
I I'm done traveling after next week for a little bit, which is
good. I'm tired of being out of my
apartment for all this time. I need to be home.
But the other thing is because you guys haven't heard this in a
month, damn it, like and subscribe.
That's it. It's not that hard. Just hit a button. Wait until the

(01:06:14):
end, all right. Hit a button. We forgot, we practiced. Hit the button, show us some love, guys. Thanks for being
patient with us. Thanks for coming and we'll see
you guys in the next one. Have a great weekend.