Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
A little less than two years ago, the thing that was on my wish list was whether we can, in a photorealistic way, render a nighttime image without using Photoshop, without using any 3D program, and that wishful thinking is a reality. Now we are looking forward to this journey where AI is our
(00:21):
co-pilot, where it works with us hand in hand and assists us. Basically, we are creative people, and if we manage to make AI understand the thought process that we have and what we want, it is of great benefit to us. It is of great importance, I would say, now that I'm seeing
(00:42):
the full potential of it.
Speaker 2 (00:57):
My name is Martin Klaassen and welcome to Light Talk. I'm excited to announce that I'm embarking on a new series of interviews, and this time it will be into a sort of unknown: the world of artificial intelligence. It might be alien to some, it might be exciting to others, but what's sure is that it's no longer a buzzword. It's really here already and it's already creeping into our
(01:17):
daily lives.
So to understand more about the world of AI and how it will impact our professional lighting design, I'll be interviewing trailblazers and early adopters to sort of get into their minds and find out how and why they use AI, how it improves
(01:38):
their workflows, which kind of AI tools they're using, and how they approach the integration of AI into their teams or their company. But I will also be asking them questions about any concerns they may have about intellectual property and privacy, and potentially also the need for
(02:00):
standardization in the lighting industry when it comes to the use and integration of AI. So I'm really excited about this and I hope these chats will also give you some insights and some meaningful understanding of what AI has to offer for us as lighting designers in the
(02:21):
future.
Enjoy this next interview with lighting designer Faraz Azhar.
Faraz, welcome to Light Talk. Thank you. And for the benefit of our audience, I think it's probably good if we can start with a little bit of context. Where are you from?
(02:42):
What are you doing in your daily life, so people understand who I'm talking with?
Speaker 1 (02:47):
Sure. Hi everyone. My name is Faraz Azhar and I'm an architectural lighting designer based in Dubai. I've been in the industry now for almost 20 years, I think. Yes, next year will be my 20-year anniversary, and mostly I've
(03:09):
been in the region.
In fact, I've been in the UAE ever since I started my career.
I am a trained architect, but my interest was always lighting design, so I decided to take that leap of faith and, yes, that proved to be quite a fantastic decision for me. I'm part of a multidisciplinary company called AE7 in Dubai and we do all
(03:29):
sorts of disciplines, from architecture to landscape, to ID and master planning. We have it all under one roof, and two years back I started the specialist lighting division here, which I'm leading at the moment.
So, yeah, that's in a nutshell.
Speaker 2 (03:46):
Okay.
Was there a specific trigger when you were doing architectural design that said, oh, lighting is actually much more interesting than architecture? Was there something specific that just triggered you to move to lighting?
Speaker 1 (04:00):
Yes, Martin. When I landed up here in Dubai back in 2002, I would say I was fascinated by the city, of course, and a large part of that fascination was how the buildings appeared at night. At that time it was all starting. I mean, it was already there, people were conscious, but there
(04:21):
were a few buildings which stood out, Burj Al Arab for that matter. During that time it stood out very uniquely. It still stands uniquely now, but even at that time it had those color-changing scenarios, and the architecture of it is, you know, so beautiful. And then when you play with light on such a kind of
(04:42):
structure, such a kind of monument, it kind of changes the whole perception. So that's what triggered me to foray a little bit more into this domain and discipline, and after two years of my career as a pure design architect, I got a chance to switch over in 2005.
(05:04):
And, yeah, it's been a great journey. Absolutely, absolutely.
Speaker 2 (05:10):
Now we're here to talk about AI specifically and the impact of that on the future of lighting and lighting design. Just as a starter, how do you feel about AI, and what do you think, just off the bat right now, it will do to our profession?
Speaker 1 (05:33):
I would say, Martin, that we know there is a lot of noise right now when it comes to AI. A lot of people are saying a lot of things, but, in fact, you said it: it's already here. It's not that it's going to come or that we have to wait for something. It's already here and there is no choice but to adapt.
(05:55):
And, having said that, I started my experiments with AI back in 2022. It's not that long ago anyway, and it's developing so fast. So when I started, I realized that this has great potential, and very soon at that point in time I thought that, okay, this has the potential of getting into our workflows.
(06:16):
At that time it was still in the initial stages and we couldn't use it, of course, for the profession as such. But then, because of the rapid advancement and rapid development, I never realized when the time came that I was incorporating everything into my workflows. I started first,
(06:38):
and then I trained my team to start on it. And now I'm happy to say that we have happily adapted to those workflows and we are looking forward to this journey where AI is our co-pilot, where it works with us hand in hand, assists us, and it kind of helps us build our narratives, the thing that we
(07:04):
have in our mind. Basically, we are creative people, and if we manage to make AI understand the thought process that we have and what we want, it is of great benefit to us. It is of great importance, I would say, now that I'm seeing
(07:24):
the full potential of it, and it's there. There's no denying the fact that it is going to get deeper and deeper into our workflows. Right now we have a limitation of using it mostly during the early concepts and the early narratives, but the day is not far when it finds its way into the production workflows.
(07:48):
It's starting to do that actually, and I think it's just a matter of maybe, if not six months, then maximum a year. Then I think we would be doing BIM production drawings with the help of AI. We would be doing all kinds of working drawings. We'd be doing, I would say, all the project documentation with
(08:09):
the help of AI.
So it's here, and I am actually on the bandwagon at the moment, and I love it.
Speaker 2 (08:18):
We'll unpack that a bit further down. I want to go one step back, because there must be a moment. Just like you were certainly caught by lighting and lighting design, was there something specific that caught your attention in terms of AI? Because, I mean, for myself, it's just the noise in the social media and what you read about it that triggered it for
(08:42):
me. Not that I got really confronted with AI specifically, but you see people using it, people like yourself who are online and promoting it, also at events. What was it for you? Because you are sort of an early trailblazer in this. There must be something, maybe different, that triggered you to
(09:04):
dive into AI?
Speaker 1 (09:09):
Yes, Martin, I would say that it was. See, the first thing that caught my attention was, of course, the quality of the visuals. At that time it was not doing very well with respect to lighting design, but as for generic visuals, I used to think that if it is doing such things with architecture, such a kind of artistic interpretation of what
(09:30):
I'm saying, what I'm writing as a prompt, if it is doing such good things with a generic language, with just cognitive language recognition of sorts, imagine what it would do with lighting.
And the second thing was that I started slow, Martin, and though I have been using it for a very long time, at that time also
(09:52):
there was a lot of noise, but I decided to filter myself out and just concentrate on a couple of tools which I thought would be great in the future, in the days to come.
And the first tool that piqued my interest was Midjourney, of course, and it is still my favorite tool, by far the most powerful of them all, I would say, at the moment.
(10:13):
And once I started with it, I realized that, you know, this thing is going to work great with lighting design once it understands. And back, I think, most probably in 2023, I did an interview where Ray Maloney asked me, Faraz, do
(10:34):
you think that AI might be able to do the architectural lighting renderings for us? For example, if you've got a daytime perspective, can AI turn it into a nighttime one? During that time I said that yes, it is very much possible. At that point in time it was just wishful thinking. Yes, it was about 2023 or the end of 2022, I suppose. At that point
(10:59):
in time it was just wishful thinking, but I told him that yes, it is very much possible and it is something that I'm looking forward to, to see how it reacts.
And I'm surprised to see this, Martin, that with the very latest update that ChatGPT got, you are doing those things.
(11:20):
It's amazing.
I mean, I've just started experimenting with it about two, three days ago, because it's the very latest update. I think it's just about a week or eight days old. It's doing that.
Basically, what you're doing is you are feeding it the JPEG image or high-resolution image that you got from the architects
(11:44):
or interior designers. You prompt it in a very straightforward language. It's not even limited to complex prompting anymore, the way it started early on. Right now, just like we are speaking, you converse with your AI, and you have the option to use the headset as well. You just speak your thoughts, and then it's churning
(12:06):
out, it's turning those images into, you know, nighttime.
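For listeners who would rather script this than use the chat interface, here is a minimal sketch of the same day-to-night idea, assuming the OpenAI Python SDK's images.edit endpoint and its gpt-image-1 model. The file names, prompt wording and model choice are illustrative assumptions, not Faraz's actual workflow.

```python
# Minimal sketch: turning a daytime render into a nighttime lighting study via
# the OpenAI image-edit endpoint. File names and prompt are illustrative only.
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

with open("daytime_facade.png", "rb") as source_image:
    result = client.images.edit(
        model="gpt-image-1",
        image=source_image,
        prompt=(
            "Convert this daytime architectural render into a photorealistic "
            "nighttime view: warm 2700K grazing on the facade, cool accents "
            "on the landscape, dark sky, no added geometry."
        ),
    )

# gpt-image-1 returns base64-encoded image data
with open("nighttime_facade.png", "wb") as output_image:
    output_image.write(base64.b64decode(result.data[0].b64_json))
```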
Speaker 2 (12:10):
So you don't type, you can actually use the voice control to achieve this.
Speaker 1 (12:16):
Absolutely, absolutely. And I don't know whether I can share the screen in this particular situation, but I would have loved to show this; it's sitting here on my system as well. It's a great tool with this addition. I mean, the thing that was just wishful thinking just about, less than two
(12:38):
years ago.
Speaker 2 (12:39):
Maybe it's possible now, out of so many other things, obviously. Are you the only one doing that in your company, or do you have a bigger team? Because you seem to be the driver, but you also need people with like-minded ideas.
Speaker 1 (13:00):
Absolutely. No, no, no, right now, Martin, the whole team is doing it. Because the good thing, Martin, is that at AE7, their own vision is to promote these tools. So the use of AI tools sits very well with the company vision, whether it's a production tool,
(13:23):
whether it's a visual tool.
So the company kind of promotes it in a way that we get future-ready and we move ahead with time, we evolve with time. So right now, and I'm talking about it basically right now, but this has happened just in the last two, three days,
(13:44):
I gave my whole team a quick demonstration of what can be done and what can be achieved, and they are amazed. And now, for the last three days, everybody is experimenting with Midjourney, ChatGPT image generation and even the
BIM workflows.
If my BIM guys get stuck somewhere, with
(14:05):
the help of visual aids, with the help of screenshots, they are asking AI to resolve their queries, to resolve their problems, and they are getting the solutions hands-on. You don't have to go to IT support, you don't have to contact the Revit guys or BIM support or Autodesk. You're getting the live solutions right in front of your
(14:27):
screen, and so it's a very inspiring thing, right?
Speaker 2 (14:32):
I don't think there's an age barrier. I mean, look at myself: I'm 70, and I'm diving headfirst into it, even though I started my career with no computer. So you can imagine how that has changed my life. But I always say never a dull moment in lighting and lighting design, with the evolution of technology and all that. But I can imagine that there may be some people that have
(14:55):
some apprehension and they say, oh... But I guess if you, as the leader, show the way and you can even demonstrate what kind of inspiring things can be done, you probably get a very enthusiastic team following you.
Speaker 1 (15:10):
Absolutely.
And I'm lucky in that respect, Martin, because when I started explaining things to them, I was apprehensive because, like you said, not everybody takes it, you know, in a good way. Some people perceive it to be a threat, some people are still uncomfortable, some people think that it will replace us in a
(15:31):
way.
It can never happen. The humans will always drive it. It is an assistant now, it will be an assistant tomorrow and it will be an assistant forever. It will always be the co-pilot mode which will drive things forward. AI cannot do anything by itself, so it's the humans that are driving it.
(15:52):
So I'm lucky that the team understood. Even right now, as I speak, I'm getting WhatsApps from my team members: look, this is what I produced; this is a render that I did right now.
Speaker 2 (16:03):
So do you feel the younger generation is more adept at it than the older generation, or does that not matter?
Speaker 1 (16:10):
It doesn't matter, Martin, it doesn't matter. It's basically personal, I would say; you know, person to person the perception varies. I don't see anything with respect to generation. There is nothing related to generation in it.
Speaker 2 (16:27):
You mentioned BIM and digital twins and things like that. What is the connection between what you currently feel can be done as lighting design, your lighting design process, and the actual, you know, whole digital world that is being created in terms of, I think, the whole building
(16:48):
process and maybe even commissioning?
Speaker 1 (16:51):
Yeah, you see, it's still very early to say, but then we can always speculate, given the way that things are moving at the moment. A little less than two years ago, the thing that was on my wish list was whether we can, in a photorealistic way, render a
(17:14):
nighttime image without using Photoshop, without using any 3D program, and that wishful thinking is a reality now.
So, similar to that, I think there are a couple of things which will be happening in the future with respect to the production workflows as well. Right now, most of the
(17:37):
things that I talk about are related to the early stages of the projects, thus the early narratives, the conceptual storyboarding. But, having said that, the way things are progressing at the moment, I foresee that very soon there will be some kind of integration with the BIM and the CAD workflows.
(17:59):
Maybe there will be AI-run scripts which will automate the processes very fast, and Adobe is already doing it to some extent. It's already bringing in all the machine learning tools within the Photoshop interface and its
(18:21):
other software as well. And there might be an integration with the lighting simulation software as well, where, I mean, okay, you would be doing the fine-tuning later. But for a quick start, if you can tell DIALux or Relux that this is the space, these are the properties of this
(18:43):
space, these would be the finishes, and based on your thinking, do a basic lighting layout; it should be this kind of narrative, it should evoke this feeling, it should evoke this emotion. And DIALux does it for you, it calculates and it generates the output. So I think that it's all possible now, Martin. It's going to happen, I'm sure it is.
Speaker 2 (19:04):
I'm sure it is.
It sounds overwhelming to me, to be honest, and I even think that maybe it needs some retraining for some people in terms of how to prompt and how to do things. But overall, I think it's very exciting what the outcomes can
(19:25):
be and how we can facilitate our workflow, as you mentioned. Do you
Speaker 1 (19:37):
think there are some applications more suited to this than others, or does it really not matter? In terms of lighting and light design, Martin, there are so many tools right now, and I still kind of try to stay away from the noise, so I have got some of my own favorites. I know that there are hundreds of tools out there right now, but, having said that, there are some favorites which I use for,
(20:00):
I would say, the visuals, which I use for videos and which, of course, some of the people in my company use as well. Because I have got a very young person who joined very recently. He used to do stage and theater lighting back in Amsterdam and
(20:20):
I have taken him on board now and he's a very bright, very brilliant chap. In just a few weeks of his days at AE7, Martin, he built AI agents to generate the luminaire schedules, to generate the preliminary, what you call, cost
(20:43):
optimization agents, to generate dark sky compliances, and all was done via ChatGPT, the Pro version... not the Pro version, sorry, the Plus version, and it works so very well.
It works in such a good way.
The first thing is that of course you have to perform your
(21:04):
own checks, definitely, but I would say that it has drastically reduced our time spent filling in the data ourselves; I would say 60 to 70% of that time is, you know, totally cancelled
(21:24):
now.
And he's still experimenting, still, you know, learning how he can further automate these processes to further reduce these times, to produce more tools which are not visual but documentation, more project documentation and compliance checks.
So these two tools that I'm talking about in particular work
(21:46):
very well.
One is the Dark Sky Compliance Checker and the other is the Luminaire Schedule Extractor. And the next target is that he is starting to train an agent to generate adaptive specification suggestions.
(22:06):
So we just feed it some kind of information, that this is the project, and it starts telling us where to zero in, which manufacturers to go to.
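As an illustration of the kind of rule-based check such an agent might wrap, here is a minimal sketch of a dark-sky screening function. The thresholds (fully shielded optics, a 3000 K maximum CCT, zero upward light) reflect common dark-sky guidance but are used purely as illustrative assumptions; the data fields and function are hypothetical, not the actual AE7 tool.

```python
# Hypothetical sketch of a dark-sky screening check, not the AE7 agent itself.
# Thresholds follow common dark-sky guidance and are illustrative only.
from dataclasses import dataclass

@dataclass
class Luminaire:
    reference: str
    cct_kelvin: int            # correlated colour temperature
    upward_light_ratio: float  # fraction of flux emitted above horizontal
    fully_shielded: bool

def dark_sky_issues(fixture: Luminaire, max_cct: int = 3000) -> list[str]:
    """Return a list of human-readable concerns (empty if none)."""
    issues = []
    if fixture.cct_kelvin > max_cct:
        issues.append(f"CCT {fixture.cct_kelvin} K exceeds {max_cct} K limit")
    if fixture.upward_light_ratio > 0.0:
        issues.append(f"upward light ratio {fixture.upward_light_ratio:.0%} is not zero")
    if not fixture.fully_shielded:
        issues.append("optic is not fully shielded")
    return issues

schedule = [
    Luminaire("L01 facade grazer", 2700, 0.0, True),
    Luminaire("L02 landscape uplight", 4000, 0.35, False),
]
for fixture in schedule:
    problems = dark_sky_issues(fixture)
    print(f"{fixture.reference}: {'OK' if not problems else '; '.join(problems)}")
```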
Speaker 2 (22:17):
Okay, that's interesting, because let's dive a little bit into this workflow thing, because obviously what we want AI to do, I guess for most of us, is to take away all the boring stuff and the time-consuming stuff. Yeah, absolutely. In the end we work with billable hours, right? So, even though our knowledge is what it's all about, in the end it's all about time.
(22:38):
So is it possible right now to ask ChatGPT or whatever other AI tool to actually select the light fixture? What sort of prompts would you have to use? Would you have to really specify all the performance specifications and then say, well, find me the light fitting
(23:01):
that does this?
Speaker 1 (23:02):
Yeah, I mean it can do that technically, and that's what he's trying to beta test at the moment. So what he's doing, at this point in time, is just feeding it information to see. ChatGPT has an option where you can train your GPT models to perform some specific tasks, so that's where he's working. He started on this new model where he's trying to feed as
(23:25):
much information as he can from various data sheets, from the manufacturers' websites, writing a couple of things from his side, writing a couple of things which are specific to the region, and all of that basically serves as the database for the customized AI agent. From within that database, it
(23:50):
also has the possibility to expand and search further, to expand into whatever information is available on the web. So it starts from within that database, and once it zeroes in on a particular thing, then it starts looking into more details. So right now we are doing a little beta test of sorts, but I
(24:12):
think it will be possible very soon, Martin. It should be.
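To make the "datasheet database plus search" idea concrete, here is a minimal sketch of filtering a small luminaire record set by performance criteria. The record fields, values and thresholds are hypothetical illustrations, not the custom GPT that Faraz's colleague is building.

```python
# Hypothetical luminaire datasheet records and a simple performance filter;
# fields, values and thresholds are illustrative only.
luminaire_database = [
    {"model": "Facade Grazer 12", "ip_rating": 66, "lumen_output": 1800,
     "cct_kelvin": 2700, "beam_angle_deg": 10, "region_stock": "UAE"},
    {"model": "Path Bollard 04", "ip_rating": 65, "lumen_output": 600,
     "cct_kelvin": 3000, "beam_angle_deg": 120, "region_stock": "EU"},
]

def find_candidates(db, min_ip=65, max_cct=3000, min_lumens=1000, region=None):
    """Return models meeting simple exterior-project criteria."""
    hits = []
    for record in db:
        if record["ip_rating"] < min_ip:
            continue
        if record["cct_kelvin"] > max_cct:
            continue
        if record["lumen_output"] < min_lumens:
            continue
        if region and record["region_stock"] != region:
            continue
        hits.append(record["model"])
    return hits

print(find_candidates(luminaire_database, region="UAE"))  # ['Facade Grazer 12']
```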
Speaker 2 (24:17):
Right. So yeah, because obviously that's how we always search catalogs, I mean online nowadays, or you know, and sometimes we have our preferred suppliers as well. But I can imagine that with all these AI tools there's a whole new world opening, because it's impossible for us to know all the light fixtures that are available in the world, let alone what they can do, and
(24:38):
certainly also new innovations and things like that. So I can imagine the enormous potential there, plus that you can then also feed in the locality, the geographical position of your project, and which suppliers are nearby, potentially, maybe options in terms of, you know, we always struggle with value engineering, so I would imagine that can
(25:01):
come in as well.
And then, yeah, we just push the button, and our co-pilot, our design assistant, is going to do that.
Speaker 1 (25:12):
Definitely, Martin, and I don't think that day is far, Martin. I don't think that is too far. It's just a matter of either somebody else coming up with some kind of preset tool, or these agents which exist right now becoming so advanced that we give them the
(25:32):
information, the basic information, and then they basically search everything for us. They kind of find out everything that we need and they just give it to us on a platter without us researching. Definitely, we have to cross-check everything, but here we are talking about reduction of time.
Speaker 2 (25:57):
So now this is obviously production work. Let's step back to the creative part, because, as I mentioned before, we want to remain the creators-in-chief. We don't want to be a sort of sub-consultant to the AI tool. We have to remain in control of the creative outcome and
(26:18):
manage what the AI tool does for us. So let's just step back to that. There's obviously a need to develop some sort of narrative first, some sort of conceptual story from which we can develop our conceptual lighting design.
(26:39):
Are there specific tools that you use for that? Or is that also Midjourney, which would be more typical in terms of image creation? But before you arrive there, you probably need to have some good brief, some good narrative to sort of prompt the image
(27:00):
creation.
Speaker 1 (27:02):
I use Midjourney for that narrative as well, Martin.
Speaker 1 (27:06):
I'll tell you very typically how I start my conceptual process. I'll give you an example here.
So there is this project for which we recently finished submitting the schematic design. It's a hotel somewhere in Saudi, in a desert surrounding. The architecture was inspired by the sand dunes, of course,
(27:29):
because of the surroundings and the place, and it's all about hospitality. So what I typically do is start with some kind of metaphor that comes into my mind, and that's where I start generating the imagery, again in Midjourney, and it's not related to the project
(27:50):
, it's related to the metaphor. So for this hotel, what came to my mind was a lantern in the desert, a desert lantern inviting the travelers to come
in.
Yeah, exactly. I created totally, absolutely artistic
(28:12):
imagery first, just to elaborate more on the narrative, with the help of the images and with total artistic intent. There was nothing architectural, and lighting was not talked about until here. And then the next step is that from that metaphor I
(28:35):
connect it, of course, because that's my inspiration, and to take it from the metaphor to the project, again I start generating a series of images which sit somewhere between, I would say, the mood imagery and the artistic impressions. And then the next step is, from the artistic
(28:57):
impressions, you go to the exact mood imagery, which is close to, similar to, your own Pinterest, basically your customized Pinterest. So you generate images of the vision that you are trying to achieve on the hotel, on the building itself, and again, that
(29:17):
is happening in Midjourney.
And ultimately, once you generate your actual lighting render for the project, of course that's not a Midjourney generation. That is done in 3D software, and in our company it's mostly on 3ds Max, with our inputs for lighting.
(29:39):
But then the imagery that I had generated, from the metaphor till the mood images from Midjourney, also helps our 3D team to build further on the architectural lighting part when they are generating their 3D renders of the actual project.
(30:01):
So it's a whole process, and so far, so good. It's working well. The clients understand it better. The clients appreciate the fact that you have that artistic component right at the beginning, a lantern.
And then there was a project, an upcoming boutique mall in Dubai, where what came to
(30:26):
my mind was a shimmering jewel, and from that shimmering jewel I developed it further to the facade, doing transitions, and there are facets on the facade. It's quite a dynamic facade. So that shimmer I translated into the dynamic architectural lighting.
So yeah, it's working well.
So far, so good.
Speaker 2 (30:47):
It feels a bit like the creation of music, where sometimes you come up with a melody first and then you add in the text, and sometimes maybe you have a text and then you create the melody to it. It seems like you create the melody first and then you add the text to it. You can say that. Quite interesting, yeah.
(31:07):
Now, obviously I wanted to ask you also about the copyrights of all this, because, technically speaking, AI tools are public domain, right, if I'm not mistaken. So whatever you design is somewhere out there. People can maybe access it or use it.
(31:29):
Is there any way to protect that, to keep that within your community, your own design team, without it being out there? And if I check out all the various tools and I can find your designs, I don't know. I'm just wondering how we can protect our intellectual copyright. I mean, is it our copyright anyhow, you know, if
(31:53):
it has been created by AI?
This is a very interesting question. I think yes, absolutely.
Speaker 1 (31:58):
And, Martin, basically what's happening is that most of these tech companies, like Midjourney, even ChatGPT and some others, what they say is that as long as you're on the pro account, as long as you're not on a free account, it is your own property, you can use it.
(32:18):
And it's not just Midjourney; it's the video creator tools who are saying it, it's even the music creation tools who have a similar kind of, I mean, it's written in their terms and conditions and that kind of documentation. So, yeah, that's from their side. From our side, what we can do is, yes, some people will
(32:41):
find it debatable that AI is getting the information from somewhere which belongs to someone. But then, Martin, everybody, even the human artists, they are inspired by some art form which already exists, so they are also getting inspiration from somewhere. The only difference is that AI is kind of, you know, just
(33:04):
computing it to another level and at a very advanced pace, at a very speedy pace, but basically the process is the same. It's not stealing, I would say. It's basically getting inspiration from the data that
exists.
Speaker 2 (33:22):
Well, there are people that still feel it's a compliment if you take their idea and use it. Yeah, absolutely. So sometimes we are dealing with NDAs, right, non-disclosure agreements, in projects, which make this potentially a bit more
(33:42):
dicey if we are using AI to develop potentially confidential information. Have you come across that, and how do you deal with that, if that's relevant to you?
Speaker 1 (33:55):
Yes, it is relevant, but basically, when we are using these platforms, the first thing is that you are not specifying the project name. On these platforms you are generating imagery, maybe for a project that is confidential, but more generic. Basically, you're not feeding AI any information about the
(34:17):
project itself, who's the client and what kind of project it is. Okay, you might feed what kind of a project it is, but then again it's generic information.
Speaker 2 (34:27):
Yeah, but you might feed an image that is potentially confidential or, I don't know, an architectural creation that has been given to you as a reference. I mean, there might be a number of things that are confidential with you. I'm just wondering how to deal with that in this sort of public
(34:48):
debate.
Speaker 1 (34:50):
My thing is, Martin, that most of the tools that I use, I do not use in a kind of public mode. Everything that I generate is basically in stealth mode. I use the Midjourney version where my images don't go public. They stay as my images.
(35:12):
They don't even get published. It's one of the top tiers, Martin, Midjourney's Pro version, so it has stealth; it's called stealth mode, by the way.
Speaker 2 (35:28):
Okay, that explains it. Okay, that's clear. Yeah, yeah, okay. So that option is there, because I think for any company, certainly if you're a highly professional practice, you know, with insurances and all kinds of things, we are being relied upon by a client to act professionally. These kinds of things are important to take into
(35:51):
consideration.
consideration. And that brings me also to the transparency issue, because I personally feel that if you do use AI in the process of conceptualizing, we probably need to tell the client, or somewhere let them know, that we have used AI tools to achieve
(36:12):
that.
Is that something that you alsosupport?
Speaker 1 (36:15):
Always. I always do that, Martin. I always tell them. Because, see, when I'm creating those narratives, that's where I start, that these narratives have been created in AI to guide us through the whole journey of our design thinking, of our design narrative, and till the last
(36:38):
mood images that are produced, mostly from Midjourney, we always specify that this is done with AI and that this is just to aid the project design thinking, the design thought process, and that this is where we are getting the inspiration from. It's not an actual project.
(36:58):
We are not getting any buildable sort of information from here. It's basically just the conceptual thing, and then once the concepts are approved, we go to the actual, you know, those things. But yeah, in my projects, I always say it, in fact in bold underline.
Speaker 2 (37:22):
There are two ways to do that. You can make an actual bold statement, or you can do it like some companies, in the small letters: somewhere it's there, but you need to look for it.
Speaker 1 (37:31):
Mine is always bold, always bold, Martin, and the clients appreciate it, Martin, because they also understand that, okay, you are using this particular tool to come up with these narratives, which align with their thinking as well. So they appreciate it, and mostly, I mean, touch wood, so far, so good. I have done about four or five concepts, very successful
(37:54):
concepts at that, with this kind of approach, and the clients were very happy. And the good thing, Martin, was that we were able to take that thought process further into the project stages, the development stages,
(38:16):
the design development, and the whole narrative which we started with remained when the project advanced into the further stages. We did not deviate from it. So it's so far, so good, Martin. Is that
Speaker 2 (38:33):
something that, obviously, once you have created your concept and your design, do you use AI also to put that into a presentation? Is that a helpful tool to create, I don't know, very compelling presentations for the client, you know?
(38:53):
Obviously we can do 3Ds, we can do video walkthroughs, a lot of things. I would imagine that it goes much further than just the creation. It is also how you woo the client, how you present it.
Yes, absolutely.
Speaker 1 (39:08):
Yeah, for me it's always a presentation. It's a document which basically documents the whole thing in the early stages of the projects, from the early metaphor narrative to the actual project renders, and that whole course of it is documented either in an
(39:31):
InDesign format or maybe a PowerPoint format. I mean, we are not using any other AI tools to make the presentations themselves. That's what we are doing ourselves, of course, but the contents within that presentation are, of course, from these AI tools.
Speaker 2 (39:51):
I would think that ChatGPT is quite good also in structuring presentations. It does. So can I say that you're mostly using ChatGPT, the Pro version, and Midjourney? Are these the two main tools that you're using at the moment? These are the two ones, absolutely, that's true, but there's many more out there.
Speaker 1 (40:13):
There are many more out there. Yeah, there are many more out there. See, I also sometimes produce these small animations without the use of, you know, any external 3ds Max or any proper animation tool. So for that I also use a couple of tools like Runway and Kling, but
(40:42):
mainly I can say that, yes, it's mainly Midjourney and ChatGPT.
Speaker 2 (40:49):
So for someone who's starting out here in lighting and lighting design and is interested in AI, I can imagine it's quite overwhelming to see all those AI tools. You know, what's the expression, seeing the wood for the trees in AI? How do you understand, or how do you know, which one would be suitable? I mean, we kind of hear about Midjourney, but we also hear
(41:10):
about other tools, and I can imagine it's not that easy. Is it just a matter of trial and error, just getting onto it and trying and seeing, or what's your opinion about it?
Speaker 1 (41:22):
I would say that first the person has to be clear in their mind, in the sense of relevance and purpose, about what it is the person actually requires from these AI tools, whether it's just the visuals or whether it's structuring of reports or whether it's some documentation.
So I would say the first step is to be solid on the relevance and
(41:46):
purpose part of it. And then I would advise whoever is listening to this webinar or podcast, I would advise, don't go with the trends. Do your own research, talk to people and see what the
(42:06):
trailblazers have been doing.
Just don't get blown away with all this noise that is being created all around. So I would say choose the tools, not the trend.
That's what I would say.
So once you know what best matches your workflows, once you
(42:27):
know what best matches your process, then that's it, and you don't have to dive right in with all the tools. You can always start small and then scale wisely. So for me as well, I started with Midjourney. I did not know what Runway was, I did not know what ChatGPT was.
(42:48):
So for me it was first Midjourney, then came ChatGPT, and then these video tools, and then a lot of other tools which I use occasionally. Sometimes I use music tools as well.
Speaker 2 (43:01):
Not sometimes, I use them quite a lot, but that's mainly for my personal use, you know. Yeah, but I guess the experience that you build up with Midjourney or ChatGPT is transferable to other tools, I would imagine. Yeah, once you know how you work with one, you would basically more or less know how to work with all.
(43:25):
And yeah, one thing that obviously is part of the whole learning curve is the prompting, and I know that there's quite a number of people out there that create those so-called cheat sheets with all kinds of prompts: if you want this, this is what you need to ask. I would imagine you also have a whole library of prompt sheets.
Speaker 1 (43:48):
I do have, Martin. For me, I mean, I haven't created a library as such, I just store everything. All my prompts are within the Midjourney interface itself and I've just bookmarked them, so I know where to look when I need to do something similar. But having said that, Martin, I would say that it will all become
(44:10):
very irrelevant. The prompting in the days to come will be as easy as me and you talking right here, and it's already as easy as me and you talking; ChatGPT has that with this latest update.
It's crazy.
Speaker 2 (44:27):
It appears so. If you want, I spoke to someone yesterday who shared with me a cheat sheet for photographic images, and you can even tell what kind of camera, what kind of exposure you want, what kind of angle, what kind of light to create images.
I mean, that's amazing.
Speaker 1 (44:48):
Absolutely, and I've been doing that. That was a very standard cheat-code kind of thing, and it has existed in Midjourney since, I think, its V4 days. Right now it's at version 7. But these cameras and all these exposure settings, lens
(45:10):
settings, everything matters. Yeah, yeah, exactly. Even, at times, the photographer's style: you can actually refer to any famous photographer, and that's what I don't want to do. Technically, for me, cameras and lenses are fine, but people do refer to, you know, the good photographers' names, people who
(45:33):
have been known to create good portraits, to create good National Geographic-style photos, and it matters.
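As a small illustration of the photographic prompting discussed here, the snippet below composes a Midjourney-style prompt string that borrows camera, lens and exposure language. The subject, settings and parameter values are assumptions for illustration, not a prompt Faraz actually uses.

```python
# Illustrative only: composing a Midjourney-style prompt with photographic
# language; subject and settings are assumptions, not an actual project prompt.
subject = "desert resort facade at dusk, warm 2700K grazing light on sand-dune geometry"
photo_style = "shot on a full-frame camera, 24mm lens, f/8, long exposure, low ISO"
parameters = "--ar 16:9 --style raw"

prompt = f"{subject}, {photo_style} {parameters}"
print(prompt)
```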
Speaker 2 (45:43):
Even with the type of camera and the type of lens. Absolutely. Amazing, yeah. I want to get back to something else now. Looking into the future, do you think we will need to get some sort of regulations in place? Because this is, of course, now a free-for-all. Whatever you do, there's no regulation, but sooner or later
(46:05):
I'm sure somebody will come up with ideas about standards and regulations. Do you have any thoughts on that?
Speaker 1 (46:12):
Yeah, and it's important to do that, Martin. Definitely there should be some kind of regulations, because, from what I can see, this is still just a start and it will develop very fast, very soon, and it's necessary to have some kind of controls in place. And when I say controls, I'm particularly talking about the
(46:36):
design houses, the corporations, the architectural practices, the lighting design practices. So just like they have their own BIM standards and CAD standards, I think it's high time to draft AI standards as well.
That's what a lot of people are thinking, right? Yeah, absolutely. So that you stay ethical, you don't deviate too much into the zones
(47:02):
which are sort of controversial. And, apart from that, you develop your own standards of prompts, you develop your own project references which you can, you know, always put on each and every project, so that it gives you a certain kind of flair as a company, a
(47:25):
certain kind of trademark sort of thing. So I think it's high time to develop those kinds of things, and those regulations should be there.
Speaker 2 (47:35):
That will go hand in hand with our professional indemnity insurances and all that, because if we start using AI, insurance companies may also think, well, hold on, you know, who is responsible? You're just too good. Well, of course, as a company you would still be responsible, but I think internally we might need to think as a practice how
(47:59):
that impacts our insurance cover and things like that. There are many angles to all this. Do you think there will be... well, I guess if you would be hiring, you could nowadays hire somebody with AI experience. Is that not going to be an official job description?
Speaker 1 (48:17):
It will be, and already people are doing that actually, Martin. They have started saying that knowledge of AI prompting is an advantage, mostly in the creative fields. They have started including that in their job descriptions, and right now they say that it's an advantage. I think the day is not far when they say that it is needed.
(48:40):
Yeah, yeah, for sure.
Speaker 2 (48:42):
Like programmers in the age when computers came, and all that. Now, before we leave, I wanted to ask you also something about the Lux Futurum. You know that, as the Virtual Lighting Design Community, we have been collaborating with the Durong Awards in China to create this Lux Futurum, and this year you have been invited as a very
(49:03):
prominent judge, also with your futuristic ideas and how to do this in lighting design. I just wanted to ask you, also in light of AI, how do you see AI influencing the future, specifically, I think, in
(49:24):
lighting design, and how also to relate that with, what I would say, the innovative manufacturing that takes place in China? Because obviously, with Lux Futurum, we try to bridge East and West. 90% of our lighting manufacturing takes place in China.
(49:44):
There's a lot of innovation taking place in China, which many people may not even realize, and so AI development in the lighting design practice, the Western philosophies and the Eastern philosophies, is something that we're trying to bring together. Now that you are officially a judge for the 2025 edition, I
(50:07):
would like to have your views on that and how you think AI and innovation in lighting design will be important for the future.
Speaker 1 (50:16):
It goes hand in hand, Martin. I think with innovation, going forward from this point onwards, wherever the word innovation comes, AI comes in by default. And when we are talking about AI in lighting, it will be
(50:36):
not just about the visuals, not just about the production; it will also be about the smart controls, and it's already starting. You have AI-empowered lighting controls, you have AI-enabled sensors, and these are the end users who are benefiting from
(50:58):
the AI tools which exist in devices. Devices are talking to each other based on those AI protocols. So whatever innovation is going to happen, it will have some sort of artificially intelligent component built into it by default going forward.
(51:22):
And yeah, I think there is no way out of it, in my opinion. And yeah, for now we are just talking about the imagery and all these kinds of things. But then also in the manufacturing process, Martin. Right now, of course, China is leading in all these automation processes.
(51:44):
I would say I am not sure how it goes in manufacturing
(52:06):
terms, but I think there could be more intelligent manufacturing as well, where factories and all these machines know how much to produce based on the project and the product outcomes, how and when to produce. They can start making things, ordering things by themselves, something like that.
So, I'm not very much familiar with how it works in
(52:31):
manufacturing, but I think it's already there with all this integration of automation.
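As a purely illustrative aside on the AI-enabled sensors and controls mentioned in this answer, here is a minimal sketch of the kind of rule a smart lighting controller might apply, blending occupancy and daylight readings into a dim level. The thresholds, target illuminance and scaling are assumptions for illustration, not any specific product's logic.

```python
# Illustrative daylight-and-occupancy dimming rule; thresholds and scaling
# are assumptions, not any specific controller's algorithm.
def dim_level(occupied: bool, daylight_lux: float,
              target_lux: float = 300.0, standby: float = 0.1) -> float:
    """Return a dimming level between 0.0 (off) and 1.0 (full output)."""
    if not occupied:
        return standby  # hold a low standby level when the space is empty
    shortfall = max(target_lux - daylight_lux, 0.0)
    return min(shortfall / target_lux, 1.0)  # top up only what daylight lacks

# Example: daylight already provides 180 lux of the 300 lux target
print(dim_level(occupied=True, daylight_lux=180.0))   # 0.4
print(dim_level(occupied=False, daylight_lux=500.0))  # 0.1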
Speaker 2 (52:50):
It is, for sure. And you can imagine that in the same way you create a lighting design, you conceptualize a lighting design with AI, there's of course the same way and same process to develop lighting products, right? Absolutely. It's basically exactly the same process, only it's not a project but a product.
Speaker 1 (53:05):
Ah, yes, of course. People have started using that. Oh, I thought that you were particularly asking about the manufacturing, but yeah, the luminaire design.
Speaker 2 (53:15):
Luminaire design, yes. This also, specifically, I think, will be part of it.
Speaker 1 (53:17):
Yeah, people are already starting to do that, Martin. They are using these AI programs to generate smart luminaires, smart designs. I know a couple of people have started that, but for them also it's quite early to say whether they are succeeding or not. But yeah, I think eventually everyone will be doing it, even
(53:41):
the big firms.
Speaker 2 (53:48):
There's no doubt about it. How does Faraz see the future in terms of you as a lighting designer, or us as lighting designers, remaining the chief creator, the creator-in-chief? Because obviously there is a danger that AI is going to overpower what we are doing, and I think one of the key things
(54:13):
that I have always felt is that we always need to check what comes out of the AI tools. As we know, AI hallucinates. Sometimes it comes up with some weird things that we need to uncover and make sure they don't see the daylight, in a way. So how do we stay in control of all this? How do we remain the creators-in-chief for the future,
(54:35):
and how does this maybe ideal AI world look for you?
Speaker 1 (54:42):
Martin, I would say that AI may advance as much as it wants to, which is good for us, of course, but in the end, intuition and sensitivity are something which will remain core human values. Intuition is what will guide us and what will basically enable us to be at the forefront. And I would say again that AI, no
(55:10):
matter how much it advances, is going to be an assistant; it is going to work in a co-pilot mode. But basically it's us who are designing the experiences. Basically, it's us who are coming up with the narratives as creatives, and those narratives, those experiences, will always be in front.
(55:31):
AI will come, of course. AI will come from the side, it will assist us, it will work with us to develop those narratives. But I have started saying this thing a lot, because I keep getting asked this question quite a lot, and so I have this common line that I have developed: it is our own
(55:51):
light that will always guide the way; it won't be AI in that matter. It is our own creativity, it will be our own sensitivities, it is our own intuitions that will enable us to be at the forefront, to not let AI overpower us in any way. And it can never do that, because it does not have the capacity to do
(56:13):
so, no matter what people say. It does not have the capacity to do so for us.
Speaker 2 (56:21):
Thank you so much for your inspirational explanations on AI and how you see AI working in our world of lighting design. Thank you so much, and I hope it will motivate a lot of people and also inspire a lot of people who have been listening to this. Thank you so much.
Speaker 1 (56:37):
Thank you so much, Martin. It was a pleasure to be here and lovely talking to you.