Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Happy Wednesday, September third, twenty twenty five. You're listening to
Blue Lightning AI Daily. I'm Zan, and today's episode was
made with Microsoft VibeVoice 7B. We've got a
fun one: Descript just added Nano Banana as an in-app
image model. Yes, banana. No, you don't have to
peel it.
Speaker 2 (00:18):
I'm Pippa and I am absolutely here for Google naming
an AI feature like a smoothie flavor.
Speaker 3 (00:23):
But it's a big deal for creators.
Speaker 2 (00:25):
This is about keeping your characters and products looking the
same across edits, thumbnails, ads. Your series mascot finally
stays on model.
Speaker 1 (00:33):
Right. So what changed? Descript's late August updates surfaced Nano
Banana as an option inside the app for image generation.
That means you can stick with the same face and
proportions while you tweak lighting, wardrobe, pose, or background.
Speaker 3 (00:46):
Basically same person, new prompt. That's the vibe.
Speaker 2 (00:49):
If you've ever had a thumbnail character morph into their
evil twin three versions later, this is the fix, and
it's hooked into that broader Gemini 2.5 Flash
Image wave.
Speaker 3 (00:58):
Google's been pushing it. Who's it for?
Speaker 1 (00:59):
Creators doing image work: YouTube thumbnails, episodic art, brand mascots,
product shots, ad variants. Solo founders, small marketing teams, anyone
A/B testing images and tired of retouching every little inconsistency.
Speaker 2 (01:12):
Hmm. So you can go from studio portrait, soft key
light, blue blazer to sunset backlight, leather jacket, and
it still looks like the same person.
Speaker 3 (01:19):
That's it.
Speaker 1 (01:20):
That's the promise: identity preservation plus prompt-level edits. Descript
added it in the late August drop; check the Descript
changelog at descript.canny.io/changelog
for the details.
Speaker 3 (01:31):
Quick reality check for folks.
Speaker 2 (01:32):
This is not a tutorial segment, but in Descript you
can pick the model in-app and then prompt away:
tell it the outfit, the mood, the set, and the face
and proportions stay locked.
Speaker 3 (01:42):
Boom. Consistent brand and provenance.
Speaker 1 (01:45):
Google's been pairing this with SynthID-style watermarking, so outputs
carry an invisible "this is AI" signal. That's important with
platforms tightening policies around labeled content.
Speaker 2 (01:54):
Yep, advertisers and partners get less nervous when there's traceability.
PC Gamer even flagged the deepfake risk angle around this
consistency tech. Like, yay for on-model characters, but also
don't be sketchy: label your stuff, source check.
Speaker 1 (02:08):
We saw Descript confirm the update on their changelog,
and Eachlabs announce support for Nano Banana, too, in
its image tooling. That shows it's not just a demo,
it's moving into production workflows.
Speaker 2 (02:18):
Shout-out to Eachlabs: availability now on their platform and
API if you're building pipelines. And Axios framed Google's push
here as trying to be top banana in image editing.
Speaker 3 (02:28):
Love the pun, but also.
Speaker 1 (02:30):
Facts. Why this matters for workflow: three big wins. Faster iteration:
you can spin variants fast without losing identity, great for
thumbnail testing. Fewer manual fixes: no more cloning eyebrows or
patching jawlines every time you change a shirt. Brand
safety: watermarking by default helps you ship with confidence.
Speaker 2 (02:47):
And if you're doing episodic storytelling, webcomics, podcast art, tutorial series,
this saves your sanity. The character you introduce in episode
one doesn't suddenly get a new nose in episode eight.
Speaker 1 (02:58):
Should place it in context. Other tools have hit consistency
from different angles. Mid Journey's character reference, Adobe's Generative Match
for brand styles and scenario has been big on character workflows,
but Nanobanana is leaning into same subject identity plus provenance signals,
and now it's inside a mainstream creator app like dscript.
Speaker 2 (03:16):
Also, speed: the Gemini 2.5 Flash Image thing
is about fast, tight edits. If it's quick and it
stays on model, that's catnip for anyone running channels or
ad accounts.
Speaker 3 (03:26):
Time is clout.
Speaker 1 (03:27):
How it changes output: creators can unify an entire season's look, characters, lighting grammar, backgrounds,
and still flex scene by scene. Your hero product can
rotate through holidays, collabs, and colorways without mismatched proportions or
off-brand textures.
Speaker 3 (03:41):
Oh I get it.
Speaker 2 (03:42):
So your sneaker startup can ship twenty ad shots with
the exact same shoe silhouette and stitch pattern, just remixing
backgrounds and vibes. Cozy fall in one, neon cyberpunk in another.
That's the move.
Speaker 1 (03:53):
Availability notes: Descript's Nano Banana option is in the late August
update, rolling out in-app now. For ongoing updates, see
the Descript changelog. Eachlabs says their Nano Banana integration
is available via platform and API.
Speaker 2 (04:06):
And because we're adults who think about policy now: watermarking.
If you're doing brand deals, provenance is not optional this
year. Google baking that in by default puts less risk
on you and your clients.
Speaker 1 (04:17):
Let's talk risks and guardrails. Identity preservation is amazing for
your own characters, but consent is key with real people.
The upside is real: on-model hosts, presenters, mascots. The
downside: deepfake concerns. So stick to characters and assets
you own or have explicit rights to use, and label
the outputs. Facts.
Speaker 2 (04:36):
Also, if you're a YouTuber, imagine thumbnail refreshes with the
same host identity across seasons: new hair, new lighting, new
camera angle, but still recognizably you. That kind of visual
continuity plays nice with binge behavior.
Speaker 1 (04:49):
Creators always ask about the "how do I get the
best results" angle. Without turning this into a tutorial, a
few principles: start with a clean base identity; keep prompts
concise about the character's defining traits; make edits iteratively, lighting first,
then wardrobe, then background, to avoid overloading the model; and save
your house style as prompt snippets so every asset uses
the same grammar.
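To make that last tip concrete, here is a minimal sketch of house-style prompt snippets in plain Python. The snippet names, mascot, and wording are illustrative assumptions, not anything shipped by Descript or Google.

```python
# Sketch of "house style as prompt snippets": reusable fragments composed into
# one prompt per asset so every image shares the same lighting grammar.
# All names and wording below are illustrative placeholders.
HOUSE_STYLE = {
    "identity": "Milo the mascot: round face, short copper hair, freckles",
    "lighting": "soft key light from camera left, gentle falloff",
    "palette": "brand palette: deep teal, warm cream, muted coral",
}

def build_prompt(*, wardrobe: str, background: str) -> str:
    """Compose the locked identity and house style with per-shot edits."""
    return ", ".join([
        HOUSE_STYLE["identity"],   # stays fixed so the character stays on model
        HOUSE_STYLE["lighting"],
        HOUSE_STYLE["palette"],
        wardrobe,                  # only these parts change per variant
        background,
    ])

print(build_prompt(wardrobe="blue blazer", background="studio backdrop"))
print(build_prompt(wardrobe="leather jacket", background="sunset rooftop"))
```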
Speaker 2 (05:09):
And keep your brand palette consistent even with identity locked.
If your teal drifts to seafoam, your grid starts
looking messy. Pair Nano Banana with a color and style template
in your project.
Speaker 1 (05:20):
Bigger picture: Google's ecosystem alignment here makes sense. Search trust signals, YouTube thumbnails,
potential on-device edits on Android, brand-safe generation in Workspace,
catalog compliance for ads. Consistency plus provenance hits a lot
of those checkboxes.
Speaker 2 (05:33):
Wait, so if I'm a small team, what does day
one look like? Give me the thirty-second "do this now."
Speaker 1 (05:39):
Day one: open Descript, choose Nano Banana as your image model,
set your base character or product prompt, and generate a handful
of hero shots. Then iterate variants, lighting, wardrobe, background, without
changing core identity, export with watermarking on by default, and test
variants in your thumbnails or ads.
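As a rough picture of that variant step, this short sketch expands one locked identity prompt into a lighting, wardrobe, and background grid for thumbnail or ad testing. The host description and prompt wording are made-up placeholders, not output from any tool mentioned above.

```python
# Day-one variant grid: combine lighting x wardrobe x background around one
# locked identity prompt, ready to run one at a time in the in-app model.
from itertools import product

BASE = "Zan the host, same face and proportions as the reference shot"  # placeholder
LIGHTING = ["soft key light", "sunset backlight"]
WARDROBE = ["blue blazer", "leather jacket"]
BACKGROUND = ["studio backdrop", "neon city street"]

variants = [
    f"{BASE}, {light}, {outfit}, {scene}"
    for light, outfit, scene in product(LIGHTING, WARDROBE, BACKGROUND)
]

for i, prompt in enumerate(variants, start=1):
    print(f"thumbnail_variant_{i:02d}: {prompt}")  # 8 prompts, one identity
```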
Speaker 3 (05:55):
That's tight.
Speaker 2 (05:55):
And if you're a dev or a studio doing volume,
hit each labs for the API and shove into your
pipeline so the catalog stays on model across markets.
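For the pipeline-minded, here is a shape sketch of that batch step. The endpoint URL, payload fields, environment variable, and response handling are hypothetical placeholders rather than the documented Eachlabs API, so check their platform docs before wiring this into a real workflow.

```python
# Hypothetical batch pipeline sketch: one on-model render per market.
# The URL, payload shape, and "raw image bytes" response are assumptions.
import os
import requests

API_URL = "https://api.example.com/v1/images/generate"   # placeholder endpoint
API_KEY = os.environ.get("IMAGE_API_KEY", "set-me")       # keep keys out of code

BASE_IDENTITY = "Nova the fox mascot, teal jacket, round glasses"  # placeholder
MARKETS = {
    "us_fall": "cozy autumn porch, warm side light",
    "jp_neon": "neon Tokyo alley at night, cool rim light",
}

def render_variant(scene: str) -> bytes:
    """Request one on-model variant; assumes the API returns image bytes."""
    resp = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"prompt": f"{BASE_IDENTITY}, {scene}"},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.content

for market, scene in MARKETS.items():
    with open(f"{market}.png", "wb") as f:
        f.write(render_variant(scene))  # same subject, different market look
```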
Speaker 1 (06:03):
No more Franken-feed. Competitive angle: this raises the bar.
If consistency becomes default, audiences will notice when it's missing.
Expect other platforms to highlight identity controls, and creators will
choose tools based on output reliability, not just raw wow factor.
Speaker 2 (06:17):
We love wow, but we love "ships on Tuesday" even
more. Consistent faces and products mean fewer reshoots, fewer Photoshop marathons,
more sleep.
Speaker 1 (06:25):
Put that on a mug. Quick source roll call so
you can dig deeper: the Descript changelog has the Nano Banana
in-app update and other late August features (descript.canny.io/changelog).
The Eachlabs post introducing Nano Banana outlines
capabilities and availability on their platform and API. Axios covered
Google's consistency push in late August as a strategic bid
in image editing, and PC Gamer flagged the deepfake concerns
around identity-preserving edits.
Speaker 3 (06:49):
And for vibes.
Speaker 2 (06:51):
The name Nano Banana is elite branding: peak meme energy with
serious utility. Top banana indeed.
Speaker 1 (06:57):
Final take: on-model images are becoming the default. Faster variants,
fewer fixes, provenance by default. Descript bringing Nano Banana in-app
is a clean step toward reliable, brand-safe visuals
for everyday creators.
Speaker 2 (07:10):
That's the show. Thanks for hanging with us on Blue
Lightning AI Daily. If you want more breakdowns and video
tutorials on your favorite AI tools, hit bluelightningtv.com
for news, updates, and how-tos.
Speaker 1 (07:21):
We'll be back tomorrow. I'm Zan, I'm Pippa.
Speaker 3 (07:24):
Go make something consistent and chaotic but mostly consistent. Bye.