August 3, 2023 · 56 mins

[rerelease] What you’re reading right now was written by an artificially intelligent — though not sentient — neural network designed to write descriptions for podcasts. Or was it? You don’t know, and that’s what makes AI so fascinating, cool, and scary. Since Josh loves all of those feelings, he sat down with the founder of controversial AI-generated art startup Midjourney, David Holz, to unpack the future of creativity. Discussed: Porn, therapy, families, Hitler. This one’s a doozy!

See omnystudio.com/listener for privacy information.

Episode Transcript

Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Today's episode of What Future with Joshua Topolsky is a re-release. It's a really interesting listen. We will be back next week with an all-new episode.

Speaker 2 (00:11):
Thanks for listening. Hey, and welcome to What Future. I'm

(00:34):
your host, Joshua Topolsky. Now, I don't know if you have been following any of this stuff with these new artificial intelligence, machine-learning-driven bots. There's several bots that do this. One has recently become widely available to the public. It's a bot called Midjourney.

(00:57):
From a user's perspective, what you can do is you can sit down and type something, and it will generate, based on all of the images it's ever looked at, which are billions of images on the Internet and wherever else, databases they feed it and stuff like that. It will generate what it thinks you want from a prompt, from a sentence. You actually use

(01:20):
it through Discord, which is, like, kind of a chat network, you know, which was popularized by gamers. But on Discord you can basically go and talk to the Discord bot from Midjourney, and you can say, for instance, "Dracula explaining his vampirism to a crowd of onlookers," and it will generate four different images that it thinks capture your idea, and they're insanely accurate. They look like somebody painted a

(01:43):
picture of something that you wanted. It is essentially like
the closest thing to being able to visualize a dream.
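
Midjourney's own pipeline is proprietary, but the prompt-to-four-images loop being described here can be sketched with an open diffusion model. A minimal sketch, assuming the Hugging Face diffusers library and a stand-in open model; none of these names are Midjourney's actual stack:

```python
# Hedged sketch of the prompt-to-images workflow described above, using an
# open diffusion model via Hugging Face diffusers. Illustrative only; this
# is not Midjourney's (proprietary) pipeline.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2-1",  # assumed stand-in model
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

prompt = "Dracula explaining his vampirism to a crowd of onlookers"
# Midjourney shows users four candidates per prompt; asking the pipeline
# for four images mimics that behavior.
result = pipe(prompt, num_images_per_prompt=4)
for i, image in enumerate(result.images):
    image.save(f"dracula_{i}.png")
```
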
I don't know that I've ever had my mind more
blown by anything a computer has ever done than this
piece of software. I mean, it is hard to articulate

(02:05):
what it feels like when you write a sentence of
something that seems completely impossible and then see a pretty
good representation of it in a matter of like thirty seconds,
forty five seconds, maybe a minute, to see your first
four versions of something. Let's just say you're not an artist,
You're not a designer. You're not going to make your
living off of doing paintings for magazines or whatever. You're

(02:27):
just a person. From a pure, like, thrill level, as a person, I think this is fucking amazing. It is maybe the most fascinating and most amazing thing I've ever done on a computer. And I've done a lot of stuff on the computer, you know what I mean. But this is, like, trippy. I mean, it's fucking insane. It's, like I said, it's sort of

(02:47):
like as close as you might get to, you know, you had a dream, and you can then see the dream. I would also say, what's interesting is that it is like a computer dreaming. I mean, what it is, is you giving what I consider to be fairly abstract input to a computer, and the computer making all of these really creative decisions about what that thing should look like. Anyhow, I'm not an artist. I'm

(03:10):
not a painter. I have been working with Midjourney to create art for this podcast, and you can see some of the prompts for these. Some of the ones that I did: "science fiction paperback book cover about society in the future." One of them is a phrase from Blade Runner, "attack ships on fire off the shoulder of Orion," which is, Rutger Hauer has this monologue at the end

(03:31):
of Blade Runner, and that's one of the things he says, and it created imagery based on that sentence. "Dracula explaining his vampirism to a crowd of onlookers," you can see several variations. I mean, they are fucking beautiful pieces of art.
In my opinion, like legitimately beautiful pieces of art. There
is some art in figuring out how to get this
thing to do what you want, or to at least

(03:53):
create a result that is pleasing. Now, anyhow, what's interesting about this? There's many interesting things about it, and I'll just go down the list of some of the ones that I'm thinking about. First off, there's obviously this question about art, like, what is art, and is this art, and what kind of art is it? Meaning, as a guy who's run a lot of newsrooms and a lot of publications, I could see this as

(04:15):
very functionally important in like an organization that needs original
art for things, but maybe it doesn't have the budget
or the time to generate original art for everything they'd
like to generate original art for right, So there's an
implication there, like for me that I'm like, oh, that's
really interesting, Right, that's really exciting. I follow a bunch
of designers and artists on Instagram and they've been talking

(04:36):
about this for a while. I mean, this opens up an enormous number of serious questions. Like, for instance, the bots are obviously taking content and material and analyzing it and learning from it, and in some cases replicating it in some way, from real artists, right, from historic, you know, pieces of art up to modern pieces of art. As far as I know, in essence, these AIs can

(04:57):
go and look at anything and then learn from it. But there's this little bit of controversy, or not a little bit, maybe a lot, from some artists who say, this is, you know, theft of our work. They're using things we've created without any license to do so and creating new works based on it. That argument, to me, is a little bit like, every artist uses somebody

(05:19):
else's work to create what they do. I mean, as we know, remixing in music has become one of the baseline ways you make music now, no pun intended on bassline.
So the idea of like sampling somebody else's art to
create something new is not new. I think what's sort
of insane and threatening on a bunch of different levels
is that this is creating real art, really interesting pieces

(05:43):
of art and imagery that have real applications, whether it's hanging in a gallery or being used for an illustration in a magazine or whatever, and it is just removing a person completely. Basically, this image that I created for the podcast is a perfect example. I could sit with a designer and tell them about what I wanted and show them examples, "can you make something like this?" And we could work through it over and over again until we

(06:04):
got to something that felt right. The idea that I could just say it, and it could be like, this could be it. This could be the art for the show. And that's one job that an artist is not going to get now, like, for sure, right? There's an implication for people who work in these fields that is way different than what we were talking about. Like I said, at a pure human level, this is thrilling, but on

(06:25):
the flip side of that, there's entire industries that potentially are wiped out by this. What does this open up? I think there's a question that I don't know the answer to, which is, this is going to be so much more capable of creating things like this, capable to a point where I think it's likely in the next five to ten years you can simply tell it to do something, whatever it is,

(06:48):
and it will create a perfectly photorealistic version of it.
I mean, and there are versions of this where you
can say, you know, show me this thing and show
it to me in these different styles, and it'll show
you an image in the style of this painter, or like it was a photo taken from this era, or like it was, you know, shot on a certain kind of film. What that means going forward is almost kind of frightening. Like, people have talked a

(07:11):
lot about deep fakes, you know, and they're like, oh yeah,
like they're going to fake a voice, or they're going
to fake a person's face or whatever. Like this is
essentially like we're getting to the point where you can
just fake any situation. You can just create visually any
situation you can think of. And I think the logical
thing is that eventually, pretty soon, I would imagine it'll
be able to do this with video, right, and I

(07:32):
think with moving images, sound is not too far behind it.
You start to think of how this could be applied
to all sorts of other things. I mean, presumably, if
it can do this with art, with visual art, I
think it can do it with other forms of art. Right.
Will we discover, when it's like you could have any art available to you, any type of content available to you, perhaps that what you want is somebody else's brain

(07:55):
in mine, right? Like, I want to understand or see something or hear something from somebody else's brain. But I don't know what that's like if the other brain can just create any of those things that I would be intrigued by. Like, this art is a great example. Clearly, this non-brain entity can create things that surprise and
delight me, that feel as authentic and original as any

(08:17):
art that I've looked at. It's obvious that the systems
that are creating this are very advanced, and they are
only going to get better. They're not going to get worse.
There is no going back to a state where this
is not possible. And so when you think about what
that looks like down the road, like maybe not everybody

(08:38):
feels this way, but I can kind of, like, in the middle of my brain, I get this very upsetting feeling when I think about what space actually is, which is this endless nothing, actually nothing, and what is that like? It's very upsetting for me to think about. When I think about the future of this stuff, it's sort of a similar kind of weight in the middle of my brain, which is, like, where does this go? It feels like all of reality

(09:00):
is almost called into question by the technology, and maybe I'm overstating it, maybe I sound crazy. I'm not saying the computer is sentient, or it's alive, or it's got a soul now or anything. But there's something in between the lines of all this where it's just sort of like, it leaps beyond even my understanding of what is happening. Like it leaps to a place that's almost, like, I

(09:21):
don't want to say spiritual, but it leaps to a
kind of almost religious place where it's like, how can
this be? You kind of feel like when you do it,
how can this be?

Speaker 3 (09:30):
Like?

Speaker 2 (09:30):
How is it possible? My guest today is David Holz, the founder and CEO of Midjourney. David, thank you

(09:53):
for being here.

Speaker 4 (09:54):
Thank you.

Speaker 2 (09:55):
Just before this, I said, can I say CEO? And
you didn't want me to, but I've done it anyway.
We're all going to have to live with the repercussions.

Speaker 4 (10:01):
Oh no, I've been exposed.

Speaker 2 (10:03):
Okay, Let's say you and I met at a party.
Let's pretend we're at a cool party. You don't know
where I'm coming from, and I'm like, what do you do?
And you say, I'm the founder and CEO of Midjourney, and I go, what's that? How would you describe it to somebody just randomly at a party?

Speaker 4 (10:17):
I try not to. I'm pretty low key, but.

Speaker 2 (10:21):
If they ask, they're like, what does Midjourney do?

Speaker 3 (10:23):
Yeah. I never really wanted a company. I just kind of wanted a home, and so Midjourney is sort of meant to be, like, my new home for the next ten years, to work on a lot of cool projects that I care about, with cool people, and that hopefully are good for everybody else too. I know we have sort of themes that I want to work on, but if I had to put it in three words,

(10:43):
it's, like, reflection, imagination, and coordination. In order to flourish as a civilization, we're gonna have to make a lot of new things, and making new things involves those three words. Wow. And we need a lot more around them, like infrastructure, new fundamental forms of infrastructure, really, around each of them. We were actually originally working more on the reflection and coordination tools. We were doing some imagination stuff, but then there were certain breakthroughs on the

(11:04):
AI side that were happening. It was about, like, a year and a half ago. Now it looks like everything's blowing up, but a year and a half ago in San Francisco, we all went to the same Christmas parties and stuff. All the AI people are kind of out here, and we were kind of all together, and I'm like, these diffusion models, they seem different than the other stuff.

Speaker 4 (11:20):
And they're like, yeah, no, this is different, and well,
what are you gonna do?

Speaker 2 (11:23):
What are you gonna do?

Speaker 3 (11:23):
We're all kind of just talking, and I'm eventually like,
I think there's going to be a human side of this,
that it's not just about making pictures, but that there's
a sort of a back-and-forth. There's a lot more to this that's gonna be hard to figure out from just optimizing a single number in a computer program, right?

Speaker 4 (11:38):
There may be some taste involved, and no one knows
what that is. And I'm like, I think there's something I have to contribute, right? Yeah.

Speaker 2 (11:45):
Can you imagine, though? I'm a guy you just met at a party. I've got no context whatsoever about Midjourney and you just told me that, which, by the way, is all very interesting. I have many questions related to what you just said. Yeah, I'll dumb it down a little bit, only because maybe not every single person will know. But, ah, Midjourney is known right now. The company has risen to kind of a place in

(12:06):
the spotlight because it is what I think we're all
sort of talking about now, is like an AI art
tool or a tool to create art based on artificial
intelligence and machine learning and all of these sort of
other very complex technologies that are kind of fusing together
to make something that is relatively new. So I think
most people would say, you've built a tool that can

(12:27):
take human language text, like basic English prompts or whatever, maybe you do it in different languages, I don't know, and convert a prompt, like a description of something, into a piece of art that is created basically wholly by a machine. Is that correct?

Speaker 3 (12:42):
Yeah. I try to avoid the word art, almost, to be honest, okay, because I think that it's not really about art. It's about imagination, and sometimes people use their imaginations for art, but usually not, right? And so I usually think of it as, we're trying to create these machine-augmented imaginative powers. Sometimes I almost call

(13:03):
it, like, a vehicle, right? You know, to really ask, what are we doing? Is it like the invention of photography and how it changed painting, right? And I tend to say no, it's much more like the invention of the combustion engine and the car. And when we invented cars, they're faster than us, but we didn't chop our legs off.

Speaker 4 (13:18):
We don't have to really move somewhere.

Speaker 3 (13:19):
You move through vehicles. So it's kind of like a vehicle for imagination. If you really have to go somewhere, you're going to use these vehicles, like jets and boats and cars.

Speaker 4 (13:27):
We never had a little robot as, like, our icon.

Speaker 3 (13:29):
It's like a sailboat, right? You know, very much trying to kind of help people explore and imagine these, like, seas of aesthetic possibilities.

Speaker 2 (13:38):
I mean, it's interesting that there's a little bit of
like a defensive stance you have to take now because
the art aspect of it gets under the skin of
a certain part of the audience that's like, wait a second,
you know, what is this thing doing? What does it mean?
What does it mean for all these different industries? Yeah, I think a lot of people feel, and maybe you guys have had to play a new round of defense because of it, that this is engineered to kind of upend industries, right? But you're saying you don't really view it that

Speaker 3 (14:05):
Way. No. And that to me is actually very uninteresting. Like, the idea of making fake art is really uninteresting, like, who cares? Or making fake photos, that to me is really not... I think what's interesting is making stuff that never could have existed before. I don't like it when somebody makes a deepfake photo of a dog; we made it really hard to do that.

Speaker 4 (14:22):
Other ones do that well.

Speaker 3 (14:23):
To me, the most interesting images are the ones that
don't look like anything we've ever seen before. They don't
look human, they don't look like something an AI made. They
look like something new, and all we know is that
it's this new thing, it's this new frontier.

Speaker 2 (14:34):
Right. I should tell you that the art for this podcast, as of right now, and I don't think it's going to change, is generated by Midjourney. Cool. And it ended up producing results that I think are at once very familiar to me, like, stylistically there's something very familiar to me about it, but there's also something about it that is totally original, I think, to your point. Now, I'll tell you this, like,

(15:03):
I'll give you my stance a little bit, because one
of the reasons I wanted to talk to you, one
of the reasons I want to talk about this at
all is, as you know, I'm a huge nerd and
I've spent my entire life like being, you know, sort
of mesmerized and interested in emerging technology in all sorts
of different forms. And when I started using Midjourney,

(15:25):
Midjourney's producing something that, to me, feels, and I'll try to avoid using the term art, it feels like it's creating something very original. I could say, like, okay, I
know where some of this stuff is coming from, Like
I can kind of understand, like there's certain styles that
are present, or if you give it a prompt to
get a certain style, you can get that. But to me,

(15:45):
it was, and I still feel this having processed it now for, you know, weeks and months, maybe the most amazing thing that I've ever seen a machine do. I totally understand the idea that you're not trying to build a tool that is like a new Photoshop, although I think there are applications that are obvious that are in that realm. When I first asked about what it was,

(16:05):
you used three words. What were the three words that you used?

Speaker 4 (16:08):
It was reflection, imagination, and coordination.

Speaker 2 (16:11):
Okay, so coordination and reflection. I want to talk about
like what that means because I understand the imagination part,
and I think I understand how you are thinking about
like what Midjourney does now in that department. But
tell me about those roots of reflection and coordination. Like, what was this before it was what it is?

Speaker 3 (16:29):
We were working on a lot of things trying to understand human minds, individually, to help people reflect, and then also to kind of help people come together and work on things better. And so we were doing a lot of quantitative psychology and structured thinking to kind of, like, bootstrap a hive mind as fast as

Speaker 4 (16:47):
you can. A futurist is going to say lots of weird things.

Speaker 2 (16:50):
No, that's good. Are you saying that, like, the roots of this are kind of, can we get this thing to think on a collective level for us, to solve problems? Yeah?

Speaker 4 (16:58):
I think there's two areas.

Speaker 3 (16:59):
There's both, how do you help somebody think about who they are, what they want, and just kind of deal with their things. And then there's also, how do you help them find the right people? Anything big needs other people. So how do you kind of find the people?

Speaker 4 (17:13):
And I don't know.

Speaker 3 (17:13):
When I was like twenty, I would say, you have
to have your goals and then you align people who
share the goals.

Speaker 4 (17:19):
And then I've done that.

Speaker 3 (17:20):
And it turns out that the second the goals change, the groups blow apart, because it's about values or something. And then if you align people on values, then over, like, five or ten years it blows apart again, because it turns out that our values change as our lives and our experiences change, right? And so then maybe the idea is, we need something higher than values, and maybe it's aesthetics. It's not about what's right or wrong, or what's important or unimportant. Really deep down, it's about what we

(17:41):
feel is beautiful and what we feel is ugly. That really leads to the things that we value, the things

Speaker 4 (17:45):
That we actually tried to build.

Speaker 2 (17:47):
It's interesting.

Speaker 3 (17:47):
And so there's this idea of, maybe aesthetics themselves are some of the highest things, and maybe aesthetics can be a foundational layer of a social world, in a way that is beyond where it is now. Because right now, the Internet, what is it?

Speaker 4 (18:00):
It's about?

Speaker 3 (18:00):
Like Facebook, it's, like, who's your mom and who'd you go to school with? And then on Twitter, it's almost like, you say one thing a day that pisses people off and then half of them will follow you, right? And those are both shitty foundations for a better social world. I would never want to build a team that way. So there's something really interesting on Midjourney where people come together and they're like, man, you love Egyptian space pyramids too? That's like me.

(18:22):
And then you have nothing else in common, but you both love Egyptian space pyramids, and it actually means something really deep, right? I think that aesthetics have the potential to be the foundation of a better social and coordination layer, in a way that's really hard to understand, but that is actually really interesting.

Speaker 2 (18:40):
I mean, that's fascinating, and frankly I have so many questions around just the basic concept there. But I would agree with you that aesthetics do tend to bring people together. But aesthetics conceptually, the idea of, you know, having a taste or a preference for something, there's a limit, I would imagine, to people who identify

(19:02):
around an aesthetic position. Meaning, my mother, who's a wonderful, wonderful, and extremely insane person, she could talk about things she visually finds beautiful or whatever, but I would not say it's a central part of her personality or something that she has an enormous amount of interest in. Right?
The thing about Facebook is that a raw opinion or
sharing something like oh, I found this article interesting or

(19:24):
whatever is very straightforward in the sense of we all
know what an idea is or an interesting article or
an opinion. But I don't know that everybody thinks on
an aesthetic level. Maybe I'm not giving everybody enough credit.
It's possible. I think you're right.

Speaker 3 (19:39):
People don't think about it, but it's there, right? Like, I tried this. I'd ask, what are your aesthetics that lead to your values, that lead to your goals? You can ask the question and almost nobody can answer it, right? It's a really hard question. But all of a sudden you give them something like Midjourney, and it's like, you can make a picture of anything, what do you want? And everything just spills out, and then they go through this whole, like, hero's journey in Midjourney, and in the process of looking through that journey,

(20:01):
it's all there and it's very clear, like a lot
of stuff comes out.

Speaker 2 (20:06):
Actually... But, and I'm gonna give you a really extreme example, so forgive me if this feels like a gotcha or whatever. But if I'm, like, a neo-Nazi, for instance, yeah, I might love Star Wars. Let's say, although I always find it fascinating when people who are really into fascism are like, I like Star Wars, I'm into the Rebels or whatever. I'm like, you know... But okay, let's say I like Star Wars, you like Star Wars, but one of us is a white supremacist and

(20:27):
one of us isn't. We may share some aesthetic interests, right,
or we may both love a certain artist, right, you know,
we're Lichtenstein fans or whatever. But like, at the end
of the day, deep down, I don't know that that
aesthetic preference has any deeper resonance with who we are. There's a limit, right? Well, so...

Speaker 3 (20:45):
Like for example, like you know, you're a rebel obviously,
and then like a Nazi nowadays is also rebel in
their own way, so you do have something in common. Yeah,
but like, but there's probably also other things. I mean,
that's that is a leap, I would say. I mean,
I get what you're saying. They're definitely going against the grain, right,
I got it. Yeah, I get the grain. Yeah, I know,
I mean they were Yeah, definitely a lot of us

(21:06):
are rebels, and there are types of rebels, but we are rebels.
But now, I think there are other things too, right? So you don't want to just lock on rebels. You want to have something that's a little bit broader and more interesting. The question is, after you make a bunch of pictures of rebels, what's the next thing you do?

Speaker 4 (21:21):
You know, and then how does that all come together?

Speaker 2 (21:24):
You know? I mean, now we're very far afield from, like, I've got a Midjourney bot that I can talk to and it can make images for me. I mean, how would you describe it? Do you describe it as AI?

Speaker 4 (21:33):
Yeah, I mean, it is AI. I don't like...

Speaker 3 (21:35):
I kind of avoid the words AI and art, actually, both together, weirdly. The problem with words like AI is that people give it a lot of agency, and, like, will and purpose and meaning, right? Whereas it's like, this thing doesn't have a story or a narrative or any will.

Speaker 2 (21:48):
Right, it's not sentient, doesn't have a soul.

Speaker 4 (21:51):
It does learn, actually, from lots of people, and it changes, and there's a coevolution. It's almost like Midjourney is a flower and the users are bees, and the flower is trying to be beautiful for the bees, but the bees pick which flowers are the ones that get to survive. And so there's this coevolution between the flowers and the bees.

Speaker 3 (22:08):
There's not a lot of will. There is some will, there's a will to be beautiful. And there's something weird about flowers being beautiful, because we find them beautiful too. It's like, what does that mean? Because they're not really for us specifically. Why do both us and the bees find something beautiful? It's sort of speaking to some weird objective thing.

Speaker 2 (22:23):
No, and I can understand that from a philosophical level. I mean, like, what is it doing?

Speaker 3 (22:27):
It's a program. Yeah, it's a program. It's a program with a lot of models in it. There's a model that models language, and there's something that models the connection between language and images. There's another thing that tries to model what images look like.

Speaker 2 (22:39):
Right.

Speaker 3 (22:40):
There's actually also models that try to understand beauty, like, what is beautiful, actually? And then there's other models that try to understand trade-offs between diversity versus creativity, versus how literal should you be?

Speaker 4 (22:52):
How metaphorical should you be? How do you read things? And so it's kind of, it's

Speaker 3 (22:56):
Like a structure and there's a lot of like ducta
and you know, it's it's weird because like people will
be like is it alive? Like, well, how is it
understand thing? If I say something like sadness or happiness?
How is it able to make an image of an
emotion that it's never had? Like they go ask these
questions like what is this like that's it doesn't not
like a piece of software, you know, but it's not
an AI because it's never had those experiences, right, Like

(23:18):
what does it mean?

Speaker 4 (23:18):
There's a lot of really interesting questions.
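
The "program with a lot of models in it" that David describes (a language model, a language-image connection, an image model, plus scorers for beauty and literalness) can be pictured as a composition of parts. A purely hypothetical skeleton of that structure; every class, name, and number here is an illustrative placeholder, not Midjourney's code:

```python
# Hypothetical skeleton of a "program with a lot of models in it," mirroring
# publicly known text-to-image designs. Placeholder logic stands in for the
# real (undisclosed) models.
from dataclasses import dataclass

@dataclass
class GenerationConfig:
    literalness: float = 0.5   # trade-off: literal prompt-following vs. metaphor
    diversity: float = 0.5     # trade-off: varied candidates vs. focused ones

class TextEncoder:
    """Models language: turns a prompt into an embedding."""
    def encode(self, prompt: str) -> list[float]:
        return [float(ord(c)) for c in prompt]  # placeholder embedding

class ImageGenerator:
    """Models what images look like, conditioned on a text embedding."""
    def generate(self, embedding: list[float], count: int) -> list[str]:
        return [f"image_{i}" for i in range(count)]  # placeholder "images"

class AestheticScorer:
    """Models 'beauty': ranks candidates so the nicest survive."""
    def score(self, image: str) -> float:
        return (hash(image) % 100) / 100.0  # placeholder score

def generate_candidates(prompt: str, config: GenerationConfig) -> list[str]:
    encoder, generator, scorer = TextEncoder(), ImageGenerator(), AestheticScorer()
    embedding = encoder.encode(prompt)
    # More diversity -> sample more raw candidates before ranking.
    raw = generator.generate(embedding, count=4 + int(config.diversity * 12))
    # Keep the four highest-scoring images, mirroring the 2x2 grid users see.
    return sorted(raw, key=scorer.score, reverse=True)[:4]
```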

Speaker 2 (23:20):
I think a lot of people, they hear AI, they think there's, like, a machine somewhere with a glowing red orb in the middle of it, and it's, like, pulsing. Yeah, exactly. And there's some neural net, you've built some custom hardware where there is the neural net, it's like a digital brain. But it's just software, right?

Speaker 3 (23:36):
These programs, they do share things with our brains, like how an airplane shares something with a bird. They both share aerodynamics and physics and the sky. These things are sharing some physics of thought

Speaker 2 (23:48):
Right with us. Right. But I'm just saying, like, it's
not how you built software. The software does some pretty
sophisticated things. It is hosted on like a AWS rack
somewhere essentially, I mean maybe now use AWS. So what
is the product? Like, you've got investors, right, No, you
don't know your boots dropped? Yeah, okay, now I listen.

(24:08):
I've paid for a subscription. I'm on Midjourney now. So is that the product? People pay for subscriptions to use it? Yeah?

Speaker 4 (24:15):
I try to have a very honest business.

Speaker 3 (24:16):
It's like, you're not going to run it on your computer, it runs in the cloud, and the cloud takes money, and then we'll take some margin on that, and that's the business.

Speaker 2 (24:22):
And you feel like that's a good foundation for whatever this thing is going to be, like, you can build off of that. Yeah. You don't have, like, Marc Andreessen's company being like, I'll give you X number of billions of dollars if you let me turn this into whatever Marc Andreessen wants.

Speaker 4 (24:35):
We do have a lot of VCs coming to us offering us lots of money.

Speaker 2 (24:37):
You're not taking the money.

Speaker 4 (24:38):
No, we haven't taken anything so far.

Speaker 2 (24:40):
That's pretty amazing. Can the business be profitable like this?

Speaker 4 (24:42):
We're profitable already. You are? Yeah, that's one reason not to take money, is we're already profitable.

Speaker 2 (24:47):
Well, I mean, if you're making money, it's definitely a good reason not to take it, right? Yeah.

Speaker 3 (24:50):
I mean, if people come to us and they offer us money, I'm like, what am I going to spend it on? And they're like, it's good to have it, you should have it. And I'm like, we have money, we're trying to spend it already. And they're like, well, you should just have money.

Speaker 4 (25:03):
They're like, just take our advice.

Speaker 3 (25:04):
It's not about the money, it's about advice. Or, like, they try to make those arguments, and so far I haven't heard a very compelling argument.

Speaker 2 (25:10):
So you're happy to iterate on this product where it's
at now and let the user sort of maybe dictate
some of the direction because of the way that they're
using it.

Speaker 4 (25:18):
Yeah, I mean it's kind of beautiful.

Speaker 3 (25:19):
It's like we make something and people like it, they
pay us money, and then if they don't like it,
we don't make money.

Speaker 4 (25:25):
But, like, so we're trying to make something

Speaker 3 (25:26):
People like, because it supports our stuff, and it's very sort of honest and straightforward, and it's an easy business.

Speaker 4 (25:34):
I keep it this way. I would keep it this way.

Speaker 2 (25:36):
I mean, presumably there's commercial applications for this, right? Yeah. I think of this because I'm, like, a guy who runs media businesses. I think, oh wow, all the time, like, I want art for something, and I'm actually going to get into a bunch of questions about the art side of it. But all the time, like, I'm in a newsroom, I'm publishing, you know, twenty stories a day or fifty stories a day or whatever, and every one of those pieces has some art attached

(25:58):
to it. Like, presumably you're already doing more enterprise-level stuff where, like, I just want some design for a story or for a blog that I'm writing or whatever, and you could generate any sort of infinite iterations of original pieces of art. Is that a part of the business?

Speaker 3 (26:14):
I would say we're a consumer business that also has some professionals. So it's maybe, like, seventy percent consumers and thirty percent professionals, right? The professionals are mostly using it for brainstorming and concepting. Then the consumers are having fun and sort of having these reflective, spiritual, personal experiences.
I'm not that excited by professional use, even though, like

(26:35):
I'm happy when I see people are finding it to be useful, right? The regular people have definitely been a lot more motivating and inspiring to me than the professional uses.

Speaker 4 (26:53):
I have very little interest in the world as it is.

Speaker 3 (26:55):
I want to make it really different, and it's much easier to do things really different for consumers than it is to have that immediately impact the sort of professional worlds, right? And so, like, video game people come to me, and literally they have to file us under their Photoshop budget, because the video game is already budgeted out and it takes sixteen months, and I have to wait for them to make their next video game. And I'm like, this is bullshit, right?

(27:16):
I'm so happy that my business isn't reliant on somebody
finishing their video game in sixteen months, you know.

Speaker 4 (27:21):
And that's what that world.

Speaker 2 (27:22):
Is like, right, Listen. I thought a lot about this, Like,
if I'm making a video game, especially if I'm like
an independent developer, like an indie dev I need art,
I need assets, like I want to make like I
want to make this world that hasn't been made before,
and like normally, and this actually gets into this part
of the conversation I want to have about art and
about the sort of implications of it. You know, I

(27:43):
might go and hire an artist or whatever to do that. But now, with Midjourney potentially, if I'm using it in that way, I can create assets and backgrounds and scenery, or even brainstorm off of that to build something from. That's not the exact thing, but a kind of iteration of it. But there is a certain very vocal segment of people out there in the world, and

(28:03):
there are people who are artists, who are, you know, digital artists, or who are working artists today, or even people who are doing fine art that's hanging in galleries. And they're like, one, this is theft, because it's using our work. It's using work that is out there, that is available to see, as inspiration for these works. And two, they're not getting anything when it does create new work. Not only is it making their jobs
create new work. Not only is it making their jobs

(28:26):
more sort of obsolete, but it's also doing it on the backs of all of their work. It's not a non-compelling argument. There is some reason to think that all of those sorts of notions are in some way true. What's your take on that?

Speaker 3 (28:40):
There's a lot of misunderstandings around the technology, and it makes sense that artists really aren't going to understand what this is doing. Some of my favorite images I've made with any of these models that looked artistic came from models trained only on photos. This is a system that understands what images look like. If you've seen enough photos in your life, then you could see anything and describe a painting of it without

(29:01):
ever having been trained on paintings. So what this is, is a thing that understands images, and then it understands the connection between language and images. And there's some element of, knowing what a style looks like requires having seen the word and the style together before, so there's some connections to it, right? But, largely speaking, it's not, I think, working the way

(29:22):
they think it is. And so the problem is that the artists are scared about being in the data set. But literally, you can just take one of their pictures and feed it into one of these models that has never seen it before, and it can make pictures like that.

Speaker 4 (29:32):
So it's not about the training data. First off.

Speaker 3 (29:34):
If it understands images, it's game over for that battle. Once it's seen enough general images to know what textures are and know what colors are, you can show it a picture and it can make pictures like that, never having seen that specific image before.
Speaker 2 (29:46):
Right, So I mean you know that obviously raises like
all kinds of weird questions about like you know, how
fine tuned does that get? Can I pick any artists
like a photographer I like on Instagram and say in
the style of this Instagram photographer, and like it'll do
something well.

Speaker 3 (30:01):
Well, I mean, you could certainly put a photo of theirs into another service and it'll give you a photo that looks like it. So, you know, I think that's really kind of the technical thing, right? And so, basically, if these systems understand images, they'll be able to copy anything you show them, regardless of whether they're trained on them. So I think the training data is the wrong battle to fight. But there is potentially a battle to fight over, like, use of these tools,

(30:22):
like, what is good and what is bad use? And certainly the law covers that already: if you make something that's really derivative of another artist, like too derivative, that's not okay, even legally, right? So it is covered a little bit by law already. Maybe there should be something more strict, because it's getting easier. But that's the battle to fight. I think it's

(30:42):
like what's too similar, not like this training data thing,
you know.

Speaker 2 (30:47):
I think about, like, CGI, in a way. If you're building an environment for a film or something, right, and you're like, I want to make a mountain or whatever, you're not going to hand-draw every polygon that builds the mountain, right? The computer is going to figure it out, and even now, it'll just basically terraform a mountain right in Unreal or whatever.

Speaker 4 (31:04):
Yeah.

Speaker 3 (31:04):
Yeah. And once upon a time most people couldn't read and write, and now everybody can. And there are more writers now and more readers now, professionally, than there ever were before.

Speaker 2 (31:12):
Right. Well, it's kind of like photography, yeah, right? Like, everybody has a kind of pro-grade camera in their pocket all the time now, and so we are just awash in really high-quality photos. Whereas if you go back fifty years, not even twenty-five years, the best phone camera you could have was really shitty and was obviously low quality. We weren't

(21:33):
awash in photos everywhere, right? And in the last twenty-five years, pretty much everybody's become somewhat of a pro-am photographer. Yeah, maybe this is a straw man, I don't want to throw a straw man at you. But is there a question about deepfakes, and sort of creating reality that does not exist? Is that something that you guys grapple

Speaker 3 (31:51):
With yeah, I mean it's a real risk for us. Specifically,
we did some special algorithms. It's very hard to make it,
make it deep fake what it does. If you ask
me to make a photo, it'll look realistic, but there's
like something to it in the lighting and the shading
and the hues where it's like just far enough away
from a photo that it looks very realistic, but your
body like knows it's not a real image immediately.

Speaker 2 (32:13):
What if I'm imagining something that looks exactly like a real image? We're not doing that right now. So my imagination has a limit.

Speaker 4 (32:18):
Yeah, right now it does. Yeah, yeah.

Speaker 2 (32:20):
Do you think that limit will be lifted for certain users? Maybe for this guy, a guy with, I guess, very creative ideas? Maybe let me check it out.

Speaker 4 (32:27):
There's lots of pros and cons to doing that.

Speaker 3 (32:28):
So we found that when we flipped it over that boundary, sometimes it looks perfect, and then sometimes it looks really, like, uncanny-valley zombie-like, right? It's, like, upsetting. And so right now, if we flip it, it's, say, fifty-fifty. Sometimes it looks perfect, sometimes it looks uncanny, and the uncanny is so upsetting to me, as, like, a visual aesthetic person, I don't want to make anything that looks like that, right? And so it's better to just not

Speaker 2 (32:50):
Allow that at all.

Speaker 4 (32:51):
Maybe in the future it'll be so good that it never looks uncanny, but right now the technology is not there.

Speaker 2 (32:56):
Yeah. I mean, there's no chance, just to be clear, there's no chance that, like, five years from now we won't be at a point where Midjourney or other programs like it will be able to create completely photorealistic, if not full moving images, for sure stills, in five years' time, right?

Speaker 4 (33:13):
Yeah, yeah. There are going to be multiple directions here.

Speaker 3 (33:15):
I think one will be trying to make photorealistic duplications of reality, and I think the other will be making things that are sort of super-real, like beyond real, right? And I think the beyond-real stuff is where it's both interesting as a human and probably where all the consumer and commercial stuff is.

Speaker 2 (33:29):
I will say, I'm unabashedly, like, a fan of this thing, but I also can understand people's fear about it. People are afraid of a lot of things that computers do, and for very good reason. I would also say, and this is kind of your problem, people are afraid of people like you. I don't mean you personally, you're a lovely person as far as I know. But you are like, hey, I am interested in imagination and all

(33:50):
these things. And if you asked, like, a Mark Zuckerberg in the early stages of Facebook, you know, he would be like, I just want to connect people, I just want people to get together in this social environment or whatever and connect. But actually, down the road, as that thing developed, Mark Zuckerberg made a lot of really crazy, weird, bad decisions. You don't have to go on record agreeing with me, but I think in your heart you know it's true.

(34:12):
And so what do you do to protect against these things that feel like creative decisions now? Right? Like, we couldn't have seen the misinformation machine that Facebook was going to become, with all these bad actors and, you know, all the ways that you could abuse the systems. Like, we didn't know that that was going to be a thing until we started to see the actual abuse. How do you protect against the

(34:33):
things where you've got to take in, like, the worst of humanity? Are you doing that on an active basis?

Speaker 3 (34:38):
Right?

Speaker 2 (34:38):
Because the thing with a tool like this is that the best parts of humanity will find amazing things to do with it. But there is an equal and opposite actor there, right, who will do the worst things with it. So tell me how you build a product like this and don't let it become destructive.

Speaker 4 (34:54):
Yeah.

Speaker 3 (34:55):
So my life philosophy is that creators imbue their values in the things they create, whether they know it or not, and that those things have a way of spreading those values even when they're no longer around. That does actually put a lot of blame on people like Zuckerberg. It implies that he made Facebook with the wrong values. I don't know Mark. But an interesting example that I like

(35:17):
to think about is the defaults of something like Facebook versus MySpace. Obviously he was aware of MySpace, we know that, right? It was definitely the main competitor.

Speaker 4 (35:25):
Yeah.

Speaker 3 (35:25):
And I remember going onto MySpace for the first time, and my page was blank and it said I had one friend. I was like, who's my friend? Oh my god, it's Tom, right? Who's Tom? You know, it's this nice guy, he's the maker of MySpace. This is cool. Like, Tom's my friend. He must care about me. I bet I could make other friends that I don't know. Like, MySpace is a place where I can make friends, right? And Tom cares. And when you sign

(35:46):
on to Facebook, you have no friends, and Mark is certainly not your friend.

Speaker 2 (35:51):
I never considered this. But he's not your first friend on Facebook. That's definitely, like, what the fuck does that mean? What the fuck does that mean?

Speaker 3 (35:58):
Not only is he not your first friend, but you have no friends when you join, right? Right, right. When you join Facebook, you are this friendless non-person, and you have to try to reach out to anybody who you already know, like, please, somebody who already knows me, be my friend on Facebook, right?

Speaker 2 (36:12):
Interesting? Huh?

Speaker 3 (36:14):
And there's, like, these really deep details that are made by real people who have values. Like, he had to think about this. Obviously he thought about it. He's not dumb; he must have thought about it.

Speaker 2 (36:26):
I mean, maybe he wanted to be your first friend, but they were like, actually, MySpace Tom could sue us for, like, IP infringement if we do the same thing that he did.

Speaker 3 (36:36):
You know, I think we know he wasn't that cautious about being sued, because it happened, right? Uh, that's true.

Speaker 5 (36:41):
I mean.

Speaker 3 (36:52):
There's a lot of interesting things like that. I think that actually maybe everything is that way. The goal is not to not make things, but to make things with really good values, and to have people with good values making things. And, like, making things is not equivalent between any two people.

Speaker 2 (37:09):
I agree with you, but what is the expression, the road to hell is paved with good intentions? Whatever. I mean, I agree that you can avoid some of these mistakes if you have a different set of goals or values. But do you already do things with Midjourney where you're trying to sort of protect against misuse, right? Like, yeah, obviously hate speech or images of violence. I mean, I definitely

(37:31):
tried some stuff that I didn't think was going to produce a violent result, and it was like, we don't do this kind of image, or whatever.

Speaker 4 (37:37):
Yeah.

Speaker 2 (37:37):
I actually have a question about porn, which is a big one. Go ahead. Yeah, I mean, my guess is, if you wanted Midjourney to create, like, incredible, original porn scenes, because there's a lot of pornography on the Internet, right, would you say? There's quite a bit of it, and it's all a visual medium, basically. Yeah, I mean, there's obviously some erotica out there. There's somewhere you've got the X-rated Midjourney instance running, right,

(37:57):
where I can create, like, full-on porn scenes, right? Don't lie to me. I know the truth. Somebody there is doing it.

Speaker 4 (38:04):
Yeah.

Speaker 3 (38:04):
You know, when I first thought about this problem, I was like, who wants an AI-generated booty?

Speaker 2 (38:10):
Who doesn't?

Speaker 3 (38:11):
And then, like, honestly, as the algorithms got better over time, I see some booty and I'm like, it's a pretty nice booty. Like, it's pretty good, pretty good. Yeah. Like, it obviously can do that really well, just how it can make anything else beautiful.

Speaker 2 (38:22):
I mean, that's a huge deal, though. Like, I can't even do a Renaissance painting of nudes, like tasteful, artistic nudes, with Midjourney, correct? No, right. Like, is there a tier where I can do nudes? This is really just, I'm asking for myself. But, like, you know, no, you're not gonna let anybody ever do a nude?

Speaker 3 (38:40):
You know, I think it's about, what is the thing that helps the world? And so, like, for example, there are two things we have tried. I can give you two stories. Okay. Well, one is, when the system wasn't filtering well enough, you'd have people trying to basically create their fantasy person, and they become super fixated on, like, this redhead, whatever, like it becomes this very specific thing over time. Right?

(39:02):
I don't know if that feels healthy. It's certainly a market.

Speaker 2 (39:06):
Right. I mean, by the way, that phrase, "I don't know if it's healthy, but it's certainly a market," is, like, ninety percent of the things that are available online. Like, literally, social media is, I don't know if it's healthy, but there's certainly a market.

Speaker 3 (39:18):
Someone's going to do it, and I think it's not going to be healthy. Now, there are other things that we tried. So, for example, we did this thing where we created this chat room, and we called it "not safe, don't judge," and we threw like a hundred people into it, and we turned off all the filters, oh my god, just to see what would happen, oh my god. And it was really interesting. We put them all in and we go, there's no filters, everybody. You

(39:38):
can do whatever you want, but everyone else is going to see what you see. There's got to be some people who would be shameless in that scenario. It was very quiet at first, and then someone goes, boobs, and then there's some boob pictures, and someone goes, like, ass, and it was a good ass picture, and everyone's, like, kind of startled at first, like they didn't know what to do, right? And then somebody goes, uh, fifty percent orgy in a Walmart, and it's just, like,

(39:59):
these piles of bodies in a Walmart. Sounds very disturbing.
And then all of a sudden, everyone else goes, uh, fifty percent orgy in space, alien orgies, and then all of a sudden everyone starts losing their minds and it gets really strange. Eventually it went to, like, Bill Cosby eating out Hitler. Like, it got pretty intense.

Speaker 2 (40:14):
Oh my god. I mean, that's a very... that's a full cancel on that image, I would say. Yeah, everything. But...

Speaker 3 (40:21):
What was happening was, it became so absurd, yeah, that everyone just started to kind of let go of all of the bullshit that they would normally be outraged by. And when somebody finally did Bill Cosby eating out Hitler, that was like an hour in, okay.

Speaker 2 (40:41):
Yeah, and is that when you shut it down? Was that when you closed that?

Speaker 4 (40:44):
I shut it down shortly after.

Speaker 2 (40:45):
Yeah. But that's such a small sample, and it went immediately to a place that would offend, like, probably ninety-nine percent of the normal users of the Internet.

Speaker 3 (40:56):
But I think what's interesting to think about is the psychological experience all the people had in this room, as it went from, like, boobs, right, to, you know, they kind of escalated to, like, fifty percent orgy in Walmart.

Speaker 2 (41:05):
Isn't that what always happens, though? Like, you're testing the limits.

Speaker 3 (41:08):
No, but what happened was, at some point they kind of let go, right, during this process, and they were like, it doesn't matter anymore. Yeah, Bill Cosby and Hitler, that's really funny. Or someone else did, like, Michael Jackson's asshole, and it did a butthole where the hole was Michael Jackson's face.

Speaker 4 (41:24):
It was funny, it was weird, you know. Those people

Speaker 2 (41:27):
thought it was funny, but a very large audience would not think that was funny. So, I mean, it's not funny at a kind of basic level. Like, you know, the Cosby stuff is really fucked up, and Hitler is Hitler. So at a really kind of basic level, if you're in good taste, that's very, very not in good taste. But there was no taste anymore. It was like everyone just lost it.

Speaker 3 (41:45):
They're like, look, nothing matters, it's all bullshit, and everyone kind of let go. It felt very cathartic. At first, they were really shy, and by the end they had all let go. Yeah, it was kind of a beautiful process. I don't know, though, but it went to a place that was pretty offensive, right? I mean, I'm glad that you don't allow that particular type of use in the broad product. I think it was

(42:06):
really interesting, and I would say everybody who was involved in this experience felt it was, like, cathartic, right, and a positive, like, spiritual experience, right? Because they realized how pent up they were in stupid ways. And, like, maybe the last thing was bad, like we could

Speaker 4 (42:17):
Say that was bad, but there was something.

Speaker 2 (42:19):
No, the last thing was bad. The last one was bad. I don't want to be, like, you know, policing culture or whatever. I mean, you could do it, but no. But the reality is, I think that raises an interesting sort of scenario, and it's like, what do people do when given this kind of unbridled power to create whatever's in their mind? I like to think people come up with really cool stuff that's awesome, but definitely,

(42:41):
for sure, there's a segment of the audience... And this actually gets back to what I was asking, which is, so you ran an experiment with a roomful of people. They were just, like, users, like beta users or something?

Speaker 4 (42:51):
There were a bunch of users.

Speaker 3 (42:52):
We did it for one hour, right, and I said,
if anybody leaked an image, I would ban them for life.

Speaker 2 (42:56):
Right. So that's your little kind of window into it. You're like, okay, this could get pretty crazy. Oh yeah. Obviously, the way you've built the system is that you cannot do those things, I guess. The question is, do you have to be constantly vigilant about the ways that the thing might be abused? Like, how do you counter abuse you haven't even thought of yet?

Speaker 3 (43:14):
We have, like, forty moderators who kind of watch things, and they have these little slash-ban commands. So they say, slash ban titties, and now nobody can use the word titties anymore.
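
The slash-ban workflow he describes can be pictured as a small Discord slash command maintaining a prompt blocklist. A minimal sketch using the discord.py library; the command name, in-memory storage, and word-boundary matching are all illustrative assumptions, not Midjourney's actual moderation tooling:

```python
# Hedged sketch of a "slash ban <word>" moderation command with discord.py.
# Illustrative only; Midjourney's real moderation tooling is not public.
import re
import discord
from discord import app_commands

intents = discord.Intents.default()
client = discord.Client(intents=intents)
tree = app_commands.CommandTree(client)

banned_words: set[str] = set()  # in production this would be persisted

@tree.command(name="ban", description="Block a word from appearing in prompts")
async def ban(interaction: discord.Interaction, word: str):
    banned_words.add(word.lower())
    await interaction.response.send_message(f"Banned: {word}", ephemeral=True)

def prompt_allowed(prompt: str) -> bool:
    """Reject prompts containing any banned word (whole-word match)."""
    tokens = set(re.findall(r"[a-z']+", prompt.lower()))
    return tokens.isdisjoint(banned_words)

@client.event
async def on_ready():
    await tree.sync()  # register the slash command with Discord
```
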

Speaker 2 (43:23):
Right. Are you actively, like, yesterday, was there something that Midjourney produced that was a surprise to the moderators?

Speaker 4 (43:30):
I know that there are words that were banned today.

Speaker 2 (43:32):
Like what? What was banned? I'm super curious. But today, you're way far into it. Like, how many people have used Midjourney? Do you know the numbers? Millions, millions of people. Yeah, so millions of people have been in there. But you're still, today, as of October fifth or whatever, yeah, you've banned words. I'd love to know what the last banned word was.

Speaker 3 (43:51):
Moderators came back recently and they're like, David, we want to unban the following words: blood, bloody, sexy, kill, killing, cutting, disturbing, and gut.

Speaker 2 (43:59):
Wow. What an image.

Speaker 3 (44:01):
They're like, what do you think, David? We could probably unban those things. And I was like, okay, let me think about this. Uh, "child with guts spilled across the ground, disturbing, huge pools of blood," and I'm like, ooh, yeah, we probably don't want that, right? Or, like, "a little girl cutting herself." Like, oh yeah, that seems bad, right?

Speaker 2 (44:17):
Okay. But so here's my question for you, and I think you've got kind of a crazy responsibility, and I'm not saying this to be a jerk at all. But you're just, like, a guy who's interested in creating this product, and creating these kind of beautiful and imaginative and exciting images and beyond. But you're not, like, a linguist. I don't know all of your background, but, I mean, you're not, like, an ethicist. Do

(44:39):
you employ an ethicist at the company? Do you employ, like, linguistic experts? How diverse is the team? I think these are things that people are going to want to know. Which is, like, you mentioned the Bill Cosby Hitler thing, and I can think of a bunch of people who are not, like, a white Jewish guy, and I say this as a white Jewish guy, who would be much more offended about some of that stuff, or people with different experiences.

Speaker 3 (44:59):
I gave that as an example of a pretty outrageous thing. That was my most outrageous, and I get that. And no, no, I understand it.

Speaker 2 (45:04):
Like, you were in this experiment, somebody took you to this crazy place, and then you're like, all right, we got to shut it down. This is sort of what I was trying to get to, which is: how do you make a company that has all the lofty and interesting and exciting ideals I think you have, but also protect against building a product that ultimately ends up repeating the mistakes of the Facebooks or the Twitters of the world? And the question does come down to, like, when you're

(45:26):
having those conversations, who's in the room? Who's having that conversation with you? Like, what are you going to do? This is me putting my hardcore journalist hat on. What are you going to do to make sure that you have conversations with a big enough set of people, and a smart enough set of people who are experts in these fields, like the fields of ethics and linguistics and, you know, history, and

(45:46):
that it's a diverse group, to actually make a product that serves everybody, and not just one that feels cool to, like, a couple of, you know, Jewish guys like us, but may not work for a million other people in the world.

Speaker 3 (45:57):
Yeah, I mean, there's a lot of questions there. I'm okay not serving everybody. Like, imagine it's a two million person thing and it's never bigger than that, I'm.

Speaker 2 (46:04):
Happy with that. But you want to make it inclusive, I would assume? Yeah, I want it to be inclusive.

Speaker 3 (46:08):
But also, if it's only two million people, I'd be okay with that. Like, I don't have this, like, this desire.

Speaker 2 (46:13):
You want two million of the same people, though? You don't want two million of the same people, two million white Jewish guys.

Speaker 3 (46:18):
If it makes two million white Jewish guys really happy and improves their lives in a significant way, then we've made the world better. Now, obviously I'd like to make it diverse, and we try really hard there. But, I mean, at the end of the day, it's more important that it's good for the people who interact with it than that it has as many people as possible.

Speaker 4 (46:35):
And that's the first trade off.

Speaker 3 (46:37):
That's the first, that's a huge trade off, because most people decided not to make that trade off.

Speaker 2 (46:40):
No, I agree with you that, like, if you're thinking of the infinite audience, obviously you don't want to be like, every person should be in this thing or using this thing or whatever. But I guess it's such a sensitive space, where you've built a tool that can create something out of nothing. Like, you've built a tool that can make a dream look real, basically. And so, yeah, you know, how do you do it

(47:00):
the right way? But I feel like here's a chance to bring a bunch of people into the conversation that were never there at Google on day one. When I think about any new technology like this, I always think now, perhaps because I've been so abused by the technology companies that have existed before us: you know, what could go wrong? Right? And how do you prevent that?

Speaker 3 (47:20):
Yeah, there are a lot of things we do. So, like, I do office hours every week for four hours, where I just talk to as many people as I can. Sometimes I'll do a themed thing. Like, I brought up like twelve women once and I said, let's have a women's panel, and I want to ask everybody, how do you feel about bikini photos? Like, should I ban bikinis?

Speaker 2 (47:38):
And that's one way of getting the women's side of things.

Speaker 3 (47:41):
Because every single day I heard some asshole dude who's like, tits are natural, I like bikini photos, have as many as you can, and then, like, women who were uncomfortable. And I was like, you know what, I just want to hear a bunch of women talk about this issue of how do you feel about the bikini photos, and, like, I will do whatever you say.

Speaker 4 (47:55):
I was like, should I ban bikinis? That was like
the simplest question.

Speaker 2 (47:58):
Did you ban bikinis?

Speaker 3 (47:59):
They decided as a group, like, we do not want you to ban bikinis, like ninety five percent, it was pretty unanimous. But we want you to hide them so that none of us ever have to see some dude making a bikini photo.

Speaker 4 (48:10):
And so that's what we did.

Speaker 2 (48:12):
It's a good middle ground. To me, this is so weird, because the reality is, like, the naked human body is not, on its face, offensive to me in any way. Like, it's very normal.

Speaker 4 (48:22):
I agree, Yeah, And.

Speaker 2 (48:22):
It's like funny to think that you've got a buffer
against like people abusing the system who are making weird
like you know, sexual bikini photos or whatever.

Speaker 3 (48:30):
Yeah, I mean, what the women basically said, on the whole, is, they're like, basically, even we like a little cleavage, but, like,

Speaker 4 (48:35):
What an average guy thinks is sexy.

Speaker 3 (48:37):
It's really easy for most of us to find creepy and unwelcoming, and so basically we don't feel like we should have to see that, like, against our will.

Speaker 2 (48:44):
That's so true both in AI and in life.

Speaker 3 (48:47):
There's a lot of these sort of nuanced things. Like, technically, it probably should be able to do a tasteful nude, but it shouldn't be able to do, like, a hyper sexualized nude. Technically, that seems right, you know, but it's hard. That's a really hard boundary, you know. I mean, it's a question of art, right? Like, yeah, what's porn? It's like, well, you know it when you see it. But there's different levels of that, right? We've actually been trying to teach the system lately

(49:09):
some of these nuances. We have certain users who go in and they rate images randomly, right? We find that, on the whole, people very rarely say anything is offensive, like, very rarely. So when they say it, it's interesting, right? And then we aggregate all those together, and then we teach the AI. We're like, hey, regardless of whether or not something is offensive, this is how people are responding to your images.

Speaker 2 (49:27):
Interesting.

Speaker 4 (49:28):
And then what it does is it actually changes its behavior.

Speaker 2 (49:31):
Do you worry you're creating a kind of prudish AI? Like, do you worry that you're actually making a sexually repressed AI that is going to be weird about sex and human bodies?

Speaker 3 (49:41):
I think the question is, like when we build these technologies,
like what battles do you want to fight?

Speaker 4 (49:45):
And where do we want to push the world forward?

Speaker 2 (49:47):
Right?

Speaker 3 (49:47):
And, like, me, I want the world to be more imaginative, and I want to push the boundaries of, like, aesthetics and creation. I think that's really interesting and really worthwhile. But I can be a little picky. I'm not as interested in doing that for violence or sexuality. Right? There is an argument that we have to push the boundaries of sexuality, let's make the world way more sexual. Someone else can do that. I just don't feel

Speaker 4 (50:08):
Spiritually compelled for that.

Speaker 3 (50:09):
Yeah. But I think that, like, there's this broader thing, which is letting people reflect. The average person comes in here and they say something like, Maltese dog in heaven, and I reach out. I'm like, hey, why'd you do that? That's interesting. And they go, because my dog just died. And I'm like, oh shit, are you okay? And they're like, yeah, this is making me feel better. Or there was, like, another woman, and she was putting in these weird lyrics, and I'm like, what are you doing? Like, these don't show up on

(50:30):
Google and she goes, When I was very young, my
older brother died and all he left me was this,
like this cassette tape.

Speaker 4 (50:35):
Of these songs.

Speaker 3 (50:36):
And I'm literally just putting lyrics in, and I'm feeling close to this person who never got to be part of my life.

Speaker 4 (50:39):
Wow, it's not always death.

Speaker 3 (50:40):
There was one person who was, like, Temple of Donuts. Like, why are you doing Temple of Donuts? And they're like, well, I'm an atheist, and I don't really understand worship or religion, but I do understand, like, donuts and sweets. It's like combining all the things I don't understand with one of the things I do understand, and I'm trying to understand, like, what is worship?

Speaker 4 (50:55):
The Hong Kong girl. So she said, I'm a woman.

Speaker 3 (50:58):
I'm in Hong Kong, and the one thing your parents tell you they never want you to be is an artist. And so I'm a banker, and I'm a good banker. But now, as I'm starting to use Mid Journey, I'm starting to feel like I'm getting to be the person I never got to be, and I'm having to think about that.

Speaker 4 (51:11):
And so like these are like the good stories.

Speaker 2 (51:13):
They're like, no, those are great stories.

Speaker 4 (51:14):
Somebody else is.

Speaker 5 (51:15):
Just, like, huge tits covered in blood. And it's like, I don't care about that person. That's not a real human story, and, like, maybe there's something going on there, but it's not interesting. Like, there's so many interesting things going on, and I want to create a space for that, and I'm doing that. There's a path that we see over and over again with people in Mid Journey, what I'd almost call, like, the heroes of Mid Journey. And what happens is they come in and they realize they can make

(51:37):
pictures of something they like. For me, it was cats and cyberpunk, right? I'm like, okay, I make cyberpunk cats.

Speaker 4 (51:41):
I'm like, okay, I'll make cyberpunk shit.

Speaker 3 (51:43):
Cyberpunk ninjas, and I'm making cyberpunk everything. And then all of a sudden, like, you combine all the things you like, and then you just burn out and you're like, oh my god, I never liked cyberpunk.

Speaker 4 (51:51):
I never want to see cyberpunk again. Cyberpunk isn't me.

Speaker 3 (51:54):
And then it's like, that's month one. And then month two, you're like, but then who am I? And then you start looking at everybody else's pictures. You're like, art deco, am I art deco? Or, like, vaporwave, am I vaporwave? And then you start looking at everything, and you're kind of saying, like, you know, you're trying to do this path of, like, who am I?

Speaker 4 (52:11):
What is my real aesthetic? And then you learn a lot.

Speaker 3 (52:13):
People learn, like, all this art history and all these movements, and they start putting things together into, like, a sense of who they are. And then, like, month three is, like, you have this aesthetic universe and you're starting to apply it to everything. You're like, you know, it's a little bit of this, a little bit of that, it's all these things together, and you're creating all this stuff. And it's like people are paying

Speaker 4 (52:32):
Off aesthetic debts.

Speaker 3 (52:33):
They're exploring the nature of their identities, and then they're expressing it. It's like they're working on all this shit, and it's really, really healthy. And it's literally just regular people, and almost nobody shares anything. It's crazy. Like, almost no pictures are ever shared and almost no pictures are ever sold, right? It's mostly just regular people having this really healthy experience.

Speaker 2 (52:54):
So to be clear, basically you see this as a
form of therapy. Is that correct?

Speaker 4 (52:59):
At least thirty percent of all the use is literally
art therapy.

Speaker 2 (53:02):
Right. Wow, mental health through AI.

Speaker 3 (53:05):
It was entirely unexpected, but it's really important. It's clearly this tool for reflection. And then people are starting to meet each other, and they're starting to form these groups, and they're pushing these aesthetic boundaries and discovering new things. And that's really beautiful, and it's obviously part of an honest and positive future, right? And that's what I care about.

Speaker 2 (53:22):
Okay, really quickly, and then we've got to wrap up. But do you think that there's a future state where, like, mid Journey is its own Instagram?

Speaker 3 (53:28):
There's gonna be, like, that, but it's crazier. I think the future is more, like, well, it's more like liquid imagination swirling around the room and forming mountains and little trees and animals and little ruins. We're trying to figure out how to give people surfboards or boats to, like, surf oceans of liquid imagination, to discover entirely new lands. But it's, like, a very different thing, and it's

(53:49):
like it forms a new substance that you can kind of create the world with, and manifest through, and reflect through. And, like, that's what it's about.

Speaker 4 (53:58):
It's like creating a new substance.

Speaker 3 (53:59):
It's really not about making an Instagram or making porn or huge tits. It's obvious that all that stuff will happen, but it doesn't matter. Like, it's not the real thing. It's like, there was a civilization before engines and after engines, and now the fun thing is moving to a civilization that has these engines of imagination, and how does that transform things?

Speaker 4 (54:19):
Like, how did engines change things, right?

Speaker 3 (54:19):
I think we have highways, we have boats, we have
like huge international trade, Like there's.

Speaker 2 (54:23):
Like a lot of stuff.

Speaker 3 (54:24):
Yeah, that's all of it. A lot of people in technology feel like we have no past. A lot of regular people literally feel like we have no future, right? But, like, I feel like we're really mid journey in this. Like, we have this rich and beautiful past behind us and this wondrous and unimaginable future ahead of us, right? And, like, the whole goal of making anything is to figure out what we can be, and what that can be, in a positive and explorative and wonderful, humane way.

Speaker 4 (54:47):
And like I don't know. That's what I'm trying to
do and hopefully it shows up a little bit in
the stuff.

Speaker 2 (54:52):
But, like, I agree. I'm so on board with that sentiment. Like, I one hundred percent agree. Like, we don't know yet what all of this is going to be. It's like, we have to figure that out. And that's why, like, people are like, we're done. And it's like, no, we just really got started with technology, I think, and what it can do. I agree. You are echoing a sentiment that I have definitely spoken

(55:12):
on more than one occasion. David, this is first off,
super fucking interesting shit that you're building. Extremely fascinating conversation.
We should do like a check in like a year
from now or something to see all of the new
mid journey things that have been created.

Speaker 4 (55:24):
So it's gonna get really scary, even in the next six months. Six months is going to be really intense.

Speaker 3 (55:28):
Like, six months is the farthest I can see. Twelve months, I actually can't.

Speaker 2 (55:32):
Okay, we'll do a six month check in. We'll see if mid Journey is at, like, three quarter journey.

Speaker 4 (55:38):
It's gonna be.

Speaker 3 (55:39):
Yeah, it's gonna be.

Speaker 4 (55:39):
It's gonna be moving really fast.

Speaker 3 (55:41):
It's gonna seem frightening to a lot of people, but it's, like, an honest shot at the future.

Speaker 4 (55:47):
You know.

Speaker 2 (55:48):
I'm ready. David, thank you so much. Well, that is our show for this week. We'll be back next week with more What Future. And as always, I wish you and your family the very best.