Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
Welcome to Tech Stuff, a production of iHeart Podcasts and
Kaleidoscope. I'm Oz Woloshyn, and today we'll bring you the headlines
of the week, including a genetically edited rodent, the woolly mouse. Then,
on today's Tech Support segment, we'll talk to 404
Media's Jason Koebler about what the future of AI
movies could look like. All of that on the Weekend
(00:22):
Tech. It's Friday, March seventh. I'm excited to be
back in the studio this week with our producer Eliza Dennis.
Speaker 2 (00:33):
We're glad to have you Stateside.
Speaker 1 (00:34):
Yes, it's felt like I was away for a long time.
Speaker 2 (00:36):
I'm wondering if that had something to do with this
news cycle though.
Speaker 1 (00:39):
Yeah, there's a lot, lot, lot to cover, so should
we jump in?
Speaker 2 (00:43):
Yeah? Absolutely. So.
Speaker 1 (00:45):
It was a bit of a confusing week when it
comes to chips and semiconductors, and I'll come back to
why it was confusing. But Monday saw President Trump hold
a press conference with the Taiwan Semiconductor Manufacturing Company aka TSMC.
The clue's in the name: the company manufactures semiconductors, and
(01:06):
they produce ninety percent of the world's super advanced semiconductor chips.
These are the chips that power AI training models, but
also devices and basically are the backbone of the new
global economy. However, the vast majority of the manufacturing takes
place in Taiwan, and so many in Washington and beyond
(01:26):
have warned that TSMC's dominance in the chip industry could
create a national security risk, given that Taiwan is squarely
in the bullseye of China's territorial ambitions. But this week,
the Taiwanese company pledged to invest one hundred billion dollars
in manufacturing chips on US soil.
Speaker 2 (01:44):
You know, this is so interesting to me because it
comes after multiple announcements over the last couple of months
about investments in things like data centers and AI infrastructure.
And that was with Stargate, and then Apple actually recently
made a pledge to make more products domestically with domestic contractors.
Speaker 1 (02:01):
Yes, I think they talked about five hundred billion dollars.
But what was really interesting was that as soon as Tuesday,
when Trump addressed Congress, he talked about his aggressive desire
to dismantle the Act that actually TSMC is using in
part to fund its semiconductor manufacturing in the US. The
CHIPS Act was Biden-era legislation that basically created a
(02:23):
platform for manufacturing semiconductor chips in the US. I don't know,
I don't know how to square those two things, but
that actually brings us to our next headline, which is
a breakthrough in directly interpreting and reading brain waves and converting
them to text.
Speaker 2 (02:40):
The superpower I want.
Speaker 1 (02:41):
Yes, exactly. Well, you may be able to buy it
if Meta had anything to do with it, because they
announced that in partnership with the Basque Center on Cognition,
Brain and Language in Spain, researchers have been able to
decode unspoken language, often reconstructing full sentences directly from brainwaves
and not even requiring any surgical intervention. This is all
(03:03):
stuff which can be measured outside the head.
Speaker 2 (03:05):
Yeah, and that's really the breakthrough here, right, because other
research from companies like Neuralink has been extremely invasive, you know,
electrodes being implanted into the brain. Invasive.
Speaker 1 (03:18):
Yeah, that's right. And this research is all about kind
of putting monitors on the skull or around the head
to be able to read brain waves without having to
directly hook into the brain, which is obviously much less scary,
and there's an amazing promise for people with cognitive impairments or
brain injuries to be able to convert their thoughts into
text and therefore speech. But there are also some concerns.
(03:41):
Right? The Vox headline was Meta's brain-to-text tech
is here. We are not remotely ready, And of course
the big concern here is privacy if private companies can
actually read our thoughts. But there's actually a long way
to go before this research leaves the lab. Nonetheless, the
experiment was kind of amazing. So thirty-five volunteers
(04:02):
sat under magnetic brain imaging scanners and typed on a keyboard.
Based on prior training, an AI model was able to
predict what they were writing, and Meta's researchers accurately
decoded between seventy and eighty percent of what people typed.
In other words, with seventy to eighty percent certainty, it
could know before I clicked a T that I was
(04:23):
about to click the T. And so the real promise
here is that data from this research is beginning
to give neuroscientists a path to understanding how abstract thoughts
are converted into language by the human brain.
Speaker 2 (04:38):
Then I think the other part of this is that
we're getting closer and closer to this idea that we
can have wearables that do this kind.
Speaker 1 (04:45):
Of tech, totally. But of course, a wearable headset that
can actually read your thoughts and translate them into
language is something that you know, conceivably could change a
lot of people's lives. In another kind of science fiction
becomes science fact story, it's about the woolly mammoth. The
headline from NPR was just irresistible: hoping to revive mammoths,
(05:07):
scientists create woolly mice. Yeah, and I think one of
the scientists said that we knew we could do it, but
we didn't know they would be this cute. And they're
worth a look. But the story is about a company
called Colossal Biosciences, and they are, by their own account,
the first and only de-extinction company.
Speaker 2 (05:27):
Okay, this was a concept I had never heard of
until this week.
Speaker 1 (05:30):
Yeah, this is one that I've been intrigued by
for a long time, and I hope we'll
be able to cover it on an episode of the
Story before too long. But Colossal's website points out that
nine hundred and two species are extinct and more than
nine thousand two hundred are critically endangered, and their mission is
to restore extinct species to preserve biodiversity. It's a little controversial.
(05:52):
Some people think there are more efficient ways to do
conservation than reviving extinct species, you know. But to that
I would say, I mean, look at the woolly mouse.
Whether or not you think this is the
most efficient investment, it is absolutely wild. So picture a
mouse with fluffy, orange tan fur that looks like it
(06:13):
got very wet and then got a blow dry at
the salon. You've got the picture.
Speaker 2 (06:18):
They are extremely cute, and.
Speaker 1 (06:19):
The way Colossal made them was by first studying the woolly
mammoth genome and then genetically engineering mice by modifying seven
key genes to make them more like woolly mammoths. You know,
the wool obviously being the most visible element, but also
some things that were invisible, like the way the mice
store fat; their fat metabolism makes them much more
(06:40):
able to survive in the cold. And according to Colossal,
the plan is to implant woolly mammoth-esque modified embryos
into Asian elephants by twenty twenty eight. This week was
also the Oscars and we both saw the movie that
won Best Live Action Short.
Speaker 2 (07:00):
Please tell people about it. It's wonderful.
Speaker 1 (07:03):
So it's a Belgian-Dutch coproduction called I'm Not a Robot.
What did you make of it?
Speaker 2 (07:08):
I was extremely tickled by this premise.
Speaker 1 (07:11):
So, for those who haven't seen it, the film was
written and directed by Victoria Warmerdam, and it's about a
music producer who fails a series of CAPTCHA tests and
in so doing comes to question whether she's in fact human.
Speaker 2 (07:26):
I mean the minute I knew that we were having
a CAPTCHA test as part of the plot to this movie,
I was all in. I don't know if you have
this feeling, but I hate failing CAPTCHA tests, especially when
you have to click I'm not a robot and all
you have to do is choose squares that show images
of street lights or motorcycles or bikes. How can I
(07:46):
get that wrong?
Speaker 1 (07:47):
Yeah? So she's failing the tests again and again, even
though it looks like she's doing it right. And then
she gets a pop-up with another quiz, and one
of the questions is did your parents die before you
met them? And she answers, she answers yes, and I
don't want to spoil the whole plot. It gets pretty eerie,
but it's a fascinating film well worth a watch. You
can check it out actually on the New Yorker website
(08:08):
because they were involved in releasing the film and on YouTube,
and as a tech nerd, I was rooting for them
to win the Best Live Action Short, and they did.
Speaker 3 (08:18):
Yes.
Speaker 2 (08:18):
Congratulations, team I'm Not a Robot. So.
Speaker 1 (08:22):
Stick around as well after the break for a look
at how AI was used in this year's Oscar-nominated
feature films, including The Brutalist, and for a conversation with
Jason Koebler about what it's like to attend an AI
film festival. Stay with us. Welcome back. The Oscars were
(08:45):
on Sunday, so we're going to stick with movies. Back
in twenty twenty three, the Hollywood Writers' Strike was this
fascinating early example of a very public negotiation over how
AI might, could, and even would disrupt and displace human labor. Ultimately,
the Writers Guild of America signed an agreement with the
Alliance of Motion Picture and Television Producers that Generative AI
(09:09):
would not reduce or eliminate writers and their pay. But
this was not a commitment by the industry not to
use generative AI in filmmaking, far from it. In fact,
this January, the editor of the triple-Oscar-winning movie
The Brutalist told an industry publication that he had used
generative AI a few times in post-production. Some of
(09:31):
the actors in The Brutalist, namely Felicity Jones and Adrien Brody,
performed their roles with a heavy Hungarian accent, and they
even had some dialogue in Hungarian. To prepare for the roles,
Brody and Jones spent months with a dialect coach to
perfect their accents, but as The Brutalist editor Dávid Jancsó,
a native Hungarian speaker, pointed out, English speakers can have
(09:54):
a hard time pronouncing certain sounds. In post he tried
to perfect the Hungarian in the dialogue. First the team
had the actors re-record the lines in the studio.
Then they tried having other actors say the lines, but
that also didn't sound right, so Jancsó turned to AI.
He fed Brody and Jones's voices into the program Respeecher
(10:15):
and then, using his own voice, Jancsó refined certain vowels
and letters for accuracy, a process that could have been
done without generative AI in an audio editor such
as Pro Tools, but Respeecher made the process much more
efficient. And of course Adrien Brody won the Oscar for
Best Actor. As USA Today reported, not all viewers
(10:35):
were pleased with the news. "I don't think it's too reactionary
to say this movie should lose the Academy buzz it
was getting," one person posted on X. But the manipulation
of vocal tracks is not uncommon in movies. Deadline noted
that combinations of vocal tracks were used in performances like
Rami Malek's Oscar-winning portrayal of Freddie Mercury, and Respeecher
may have been used in another film nominated for Best
(10:57):
Picture this year, Emilia Pérez. The rise of generative AI
has been remarkably fast in creative industries. But one big
question I have is how far could this go and
how soon? And to answer that, we want to turn
to our friend Jason Koebler at 404 Media,
who not too long ago attended a film festival of
AI generated movies. Jason, welcome back to the show.
Speaker 3 (11:20):
Hey, thanks for having me.
Speaker 1 (11:21):
Before we get into that film festival you went to
could you just explain how Respeecher works and how
it was used in the editing process for the Brutalist.
Speaker 3 (11:29):
Yeah. So, Respeecher is an AI voice synthesizer, and so
it takes training data of an actor's voice and runs
it against a large language model. So in this case,
it would probably be examples of the Hungarian language, et cetera.
And it would take Adrien Brody's voice and make it
more closely match other examples of Hungarian language. And it's
(11:56):
very interesting because this technology is sort of one of
the first native AI technologies that was widely used commercially,
not just Respeecher, but another company called ElevenLabs has
become really famous. Like, Eric Adams, the mayor of
New York City, did a calling campaign to various communities
(12:18):
in New York City where he spoke English, but then
ElevenLabs translated his voice into like fifteen different languages.
And it's not just like a robot voice reading it;
it sounds like Eric Adams speaking Mandarin or Eric Adams speaking Hungarian.
And so increasingly this is being used in movies, not
just Respeecher, but also ElevenLabs and other tools like it,
(12:41):
and it really is like one of the first big
commercial uses of generative AI in movies.
Speaker 1 (12:47):
To me, it feels like it's not that far away
from other post-production tools that have been supercharged
by AI, like Descript for podcast editing, or other tools like that.
Speaker 3 (12:56):
Yeah, I mean it's really interesting because I think that
music had this a long time ago, with things like autotune,
and it's like many, many, many popular artists use autotune,
and this is a very similar technology. I mean, it's
in the same family of technologies at least. So
it just becomes a question of how much post can
(13:17):
there be for the human performance to still be there.
And I think it's a really open question at this point.
I think if you had asked me a while ago, I
would have said they're changing the performance in some fundamental way.
But I think everything in a movie is so carefully edited,
so carefully shot. They do hundreds of takes for certain
(13:38):
scenes and then splice together different takes and cuts, and
so I think it really is a spectrum of what
you are willing to accept if you're in the Academy
and need to decide whether someone is worthy of an
award for this. I think audiences sort of have to
accept it because it's being done, and it's been done
for a long time. And I think that if you
start like having purity tests about this sort of thing,
(14:01):
I think it's going to be pretty difficult to know
which movies to see and which not to see,
because, honestly, the only reason we know that this
was used at all was because the editor talked about
it to the media.
Speaker 1 (14:12):
Yeah. And also, I mean, to be fair to Adrien Brody,
I doubt that many Academy members would have voted against
him on the basis that his Hungarian accent wasn't quite perfect,
So I'm not sure that this was like the key
input to his victory. But what you said about like
the role of post-production and what that means vis-à-vis
the original product made me think about this
(14:33):
AI generated film festival that you went to. So, first
of all, what made this an AI generated film festival?
How much of the films were AI generated?
Speaker 3 (14:43):
Yeah, so it varied for each movie, but I think
that if you walked in off the street, you would say, oh,
these films were made with AI. And what I mean
by that is each movie had visuals that were clearly
AI generated, like a lot of the backgrounds were constantly
changing in a way that, if you were using a camera,
(15:03):
wouldn't happen. A lot of people had like faces
that were morphing from scene to scene. One thing I
will say though, is that TCL was very clear that
all of the scripts were written by humans, and all
the voices were done by humans, and all of the
music was done by humans. The artificial intelligence was limited
(15:24):
to the visuals in different movies.
Speaker 1 (15:27):
Can you just take me back to kind of how
you got invited and what questions you had going in?
Speaker 3 (15:32):
Yeah, So I went to the Chinese Theater in Hollywood,
which is ironically where the Oscars are held. It's like the
same complex. And that theater is owned by TCL, which
is a Chinese TV manufacturer, and like a lot of
other TV manufacturers at this point, they have their own
free streaming TV service if you buy a TCL TV,
(15:56):
And TCL is the first company to put fully AI
generated movies on its streaming service. And so this was
a premiere of five films that were created using generative AI.
And so I had been writing basically about this technology
for a while and they invited me to come watch them.
Speaker 1 (16:14):
So, despite the fact that you're perhaps more on the
skeptical side, they welcomed you into the film festival.
Speaker 3 (16:19):
I was pretty shocked that they invited me, because honestly,
I had written about a trailer that they released for
an AI generated film and I kind of dunked on it.
I said, it was really terrible. It's called Last Train Paris,
and it was like an AI generated rom com. And
in the YouTube video, it's like the lip syncing of
the audio and the lips is like really bad. The
(16:42):
characters move incredibly robotically, and it has this very dreamlike
quality to it that is very common with AI generated visuals,
where it's not like a cool effect. It's like, wow,
this is really distracting because the background is constantly swirling
and changing and things are popping in and out. And
after I wrote that article, they still decided to invite me,
(17:03):
So I thought that was brave of them.
Speaker 1 (17:05):
But what did you think, I mean, what were you
kind of expecting going into it?
Speaker 3 (17:08):
Going in, I thought that they would be pretty bad,
to be totally honest with you, just because the state
of the art at the time. This was back in December,
which was only three months ago, but at the time,
AI video generators were pretty bad, and I didn't think
(17:28):
that TCL had access to some proprietary system that we
hadn't seen before. I figured that they would be using
the state of the art that you can find on
the internet, and I think that those tools are not
very good, and so I thought that they would be bad,
to be totally honest with you, and they were bad.
Speaker 1 (17:48):
Can you describe some of the highlights and the lowlights?
Speaker 3 (17:51):
Yeah. I thought that the films themselves just
felt pretty rushed. So one of them was called The
Slug and it's about a woman who turns into a slug.
She has a disease that turns her into a slug
and it feels like The Substance, which is another you know,
Oscar nominated film. The visuals on it are wild. Things
are just like constantly changing. Her face is changing, the
(18:13):
you know, the food is changing. There's a lot of
like weird screams that happen that are not super well
timed with the dialogue. And then also there's like a
scene where the woman takes a bath and there's like
a close up on some bath salts and like the
text on that label is like an alien language because
(18:33):
AI has like a really bad time generating text, and
I guess you can take it with a grain of
salt or say like, hey, this is early technology. But
when you're watching something as a viewer in a movie
theater on this giant screen and the text is completely
not even in English, it's like, wow, it really takes
you out of the narrative.
Speaker 1 (18:52):
I would say, I mean it's a weird idea, right,
because I mean you mentioned this is for TCL, the
Chinese TV manufacturer, and the assumption would be like, they don't
want you to change the channel, right, they want you
to have their own channel on kind of in the
background so that you know your attention is with them
and they can sell you ads whatever it may be.
But that's very different to like putting hundreds of people
in a movie theater and kind of forcing them to
(19:14):
watch with full attention, right, yeah.
Speaker 2 (19:16):
Yeah.
Speaker 3 (19:16):
And it's very interesting because before the movies played, two
TCL executives addressed the audience, and it was very interesting
the difference between what they were saying and what the
filmmakers were saying, because the TCL executives were business people
and they were saying our research shows that almost no
one changes the channel once they're watching something like this,
(19:38):
like they are watching it in the background usually, and
so their hope is that you're just going to be
too lazy to change the channel.
Speaker 1 (19:46):
Such an inspiring creative brief.
Speaker 3 (19:49):
Right, right. And then the other executive said, like, we're
going to use this as part of our targeted advertising strategy,
which was pretty dystopian. And then the actual filmmakers came
on and said, you know, we put our heart and
soul into this, and we think this is the future
of the industry. So that was kind of like a
whiplash situation for me in the audience.
Speaker 1 (20:12):
When we come back, more from Jason Koebler about
rapid advances in generative AI video technology and how the
state of the art is evolving in real time. Stay
with us. Welcome back to our conversation with Jason Koebler
(20:37):
from 404 Media, where we continue our conversation
about a recent AI film festival he attended. There was
one film though, which I think was like a kind
of blended documentary and AI film that you thought was
potentially a bit more interesting.
Speaker 3 (20:53):
Yeah, I thought it was pretty cool. I mean, it
still had a lot of problems, but it was called
The Best Day of My Life, and it was a mountaineering
documentary where a mountaineer who got trapped in an avalanche
is talking directly to the camera, like the actual person
is talking directly to the camera recounting his story, and
(21:14):
as he is telling his story, they flashed to generative
AI depictions of what he is saying. And so I
thought that was kind of interesting because this is something
that happened to the guy. He obviously didn't bring a
camera with him at the time, and you were able
to sort of like see what he was describing.
Speaker 1 (21:34):
In a way that was actually viscerally compelling, or in
a way that still felt a bit uncanny and jarring?
Speaker 3 (21:40):
In a way that made me think that maybe this
has potential in the future, but this isn't quite there yet,
because, similarly, there's various scenes in the film,
and the guy who it's happening to changes in each scene.
It's like his face looks different in different scenes. He
was under snow because it was an avalanche, and then
(22:02):
in the next scene all of the snow had turned
to mud, and then it turned back to snow, and
it, like, similarly took you out of the narrative,
but I thought that the idea behind it was pretty
interesting and I could see that being a direction that
future documentaries go.
Speaker 1 (22:23):
And what was the feeling like in the room?
I mean, who else was in the audience? What was
the general takeaway from this experience?
Speaker 3 (22:30):
The mood in the theater was one of incredible optimism
and excitement. It was a mix of people who had
worked on these films and people who have like a
lot of money invested in the idea that this is
going to be the next big thing in Hollywood. And
so the mood in the theater was one of incredible
(22:50):
optimism and excitement. Meanwhile, the films, like, objectively are not good.
They're all on YouTube now, and if you
go watch them, the comments are brutal. There's not a
lot of views on them. I think on some of them
the comments have even been turned off, because people
are like, how dare you put this on my television?
So I did think it was interesting because it reminded
(23:13):
me of things that I had been to in the past,
for like virtual reality or for cryptocurrency, things like that,
And a lot of people have said like generative AI
is the new crypto, it's the new metaverse, it's the
new virtual reality. And I think that with AI there's like
a lot of snake oil out there, but undeniably companies
(23:35):
are leaning into it in a way that's going to
affect us and affect workers and affect people in the industry.
Speaker 1 (23:42):
It's also interesting where companies fall in terms of how
vocal they want to be about how they see the
AI future unfolding. Right? Like, obviously for a Chinese TV manufacturer,
alienating Hollywood doesn't really matter that much, right? Whereas, like,
for Hollywood studios, they have to behave very differently.
Speaker 3 (24:01):
Yeah, it's super interesting, and that's a great point because,
as you said, like the Writers Guild strike was partially
about generative AI in writers' rooms, a lot of
voice actors, going back to Respeecher, voice actors in both
the video game world and the animation world are really
worried that AI voices are going to replace their jobs
(24:24):
or that they're going to get less work because AI
is going to be used to generate voices for animation
and video games. And then, of course, like you said,
a lot of companies are laying off their workers in
a bunch of industries and then realizing, oh wait, the
AI is not good enough to do these jobs yet.
And so there's a real tension about it because fundamentally,
(24:45):
this is an automation technology. It's designed to replace human
labor or do things that sometimes humans can't do. And
I do think that a lot of companies are going
to be able to differentiate themselves by saying we do
not use AI, we respect human artists, we don't want
to do that. And then some companies are going the
(25:06):
total opposite way, like TCL, which has very little original programming,
very few relationships in Hollywood. They don't care if they
piss off directors and actors and things like that because
they're just trying to make a name for themselves, so
they're able to be more aggressive about this.
Speaker 1 (25:21):
So I guess, on the one hand, you have like
TCL and more or less fully AI generated films. On
the other hand, you have The Brutalist, where, you know,
at the margins AI was used, and Respeecher was
used to do some accent correction. Do you see like
ultimately a convergence between those two things, or do you
think it will remain that like AI is either used
(25:42):
in like premium productions for optimizing post, shall we say.
And on the other hand, you have like this kind
of wild west of full AI generation, which is a
long way off from being consumable.
Speaker 3 (25:53):
Yeah, I mean, I do think it's a spectrum and a
slippery slope, if you will. And special effects have in
general been incorporating a lot more AI over the last
few years. I think one that was really interesting to
me was when the first deep fakes were sort of invented,
maybe like five or six years ago, where you can
(26:13):
like replace someone's face with another face. Star Wars had
tried to generate like Carrie Fisher after she had died
for one of the Star Wars films, and apparently they
spent like millions of dollars doing this. And then someone
on Reddit using deep fake technology was able to do
something that was almost indistinguishable from what Lucasfilm had done,
(26:36):
like on their computer at home, for free. And so
I do think that we're going to see a lot
more of this stuff in films, but you may not
even notice that it's happening. When they start replacing artists, replacing musicians,
replacing actors with AI, I personally think
that's a problem, and I think that that's when you
(26:57):
end up with a lesser product. Yeah, I don't know.
I hope that AI is going to be used to
make films better, not to create tons of low budget,
poorly made films that are designed to scratch a specific
itch or perform for an algorithm, which we're definitely gonna see
a lot.
Speaker 1 (27:15):
Of. So you're a humanist at heart. Yeah, yeah. And
I mean you mentioned that this film festival was a
couple of months ago. Has the state of the art
changed since then? I was playing around with this Google
DeepMind product called Veo 2. At least on like
a scene by scene basis, you can make pretty good
photorealistic depictions, but they're like a couple of seconds each.
(27:37):
I don't think they've figured out by any means how
to stitch them together or maintain continuity. But how is
the state of the art evolving?
Speaker 3 (27:44):
It's changed a lot in the last three months. There's
been a lot of Chinese companies that have released video
models in the last just a couple of weeks, like
Tencent, which is a massive Chinese company, released a
new video model that seems to be better than most
publicly released video models. You know, it was sort of
immediately used by people to create nonconsensual pornography, which
(28:07):
is quite upsetting and is what a lot of people
are using these tools for on the internet. But basically
it's like every week there's a new model and they're
they're constantly leapfrogging each other. So you know, one will
be able to generate hands better than another, one will
be able to generate faces better than another, one will
have like better movement when you try to make these
(28:28):
people move, or they require less training data, meaning you
can make videos based on one input image versus having
to feed hours of footage into a model to create
something else. And so you know, these are things that
like AI nerds spend a lot of time caring about,
and I would say that there is a big generational
(28:49):
difference between them. But as like a consumer of these things,
you might not know that this is happening behind the scenes.
But the short version is basically it's getting easier to
make a generated video, it's getting cheaper to do it,
and the quality is getting better and it's changing on
like a day-to-day basis at this point.
Speaker 1 (29:14):
Jason, thank you so much.
Speaker 3:
Thank you so much for having me.
Speaker 1 (29:20):
That's it for this week. For Tech Stuff, I'm Oz Woloshyn.
This episode was produced by Eliza Dennis and Victoria Dominguez.
It was executive produced by me, Karah Preiss, and Kate
Osborne for Kaleidoscope and Katrina Norvell for iHeart Podcasts.
Heath Fraser is our engineer. Kyle Murdoch mixed this episode
and he also wrote our theme song. Join us next
(29:40):
Wednesday for Tech Stuff: The Story, when we'll share an
in-depth conversation with the neuroscientist David Eagleman about people
who develop romantic relationships with AI. Please rate, review, and
reach out to us at tech Stuff podcast at gmail
dot com.