Episode Transcript
Available transcripts are automatically generated. Complete accuracy is not guaranteed.
Speaker 1 (00:00):
The New York Times with the recent article: what do
you do when AI takes your voice? These are two
voice actors. They happen to be married to each other.
And if I wasn't a giant freaking moron, I would
have had Hanson get the audio ready.
Speaker 2 (00:14):
You heard I like it when you beat up on
yourself like that. If I wasn't a giant freaking moron, well.
Speaker 3 (00:22):
Call a spade a spade. Yes.
Speaker 2 (00:23):
A neuroscientist friend of mine told me the other day
that they don't believe calling yourself bad names does you
any harm. You know how we've heard for a long time
that you shouldn't, don't call yourself things that you would
never allow somebody else to call you, it's bad for
your blah blah blah. Anyway, this neuroscientist said there's no
way that's doing you any harm. Because they regularly say
that about themselves.
Speaker 3 (00:45):
So I don't know.
Speaker 1 (00:47):
At this point in my life, I kind of accept
the way my brain works. I just I've been doing
this such a long time. I'm reading this article, I'm
listening to the audio clips, and I'm thinking, wow, that's something.
As I'm getting ready to present it on the show,
and yet, as a giant freaking moron, a GFM
(01:09):
as we're known, it hadn't occurred to me to get
the audio to Hanson so he could play it for you,
the listeners, who are the only reason this show exists,
you giant freaking moron.
Speaker 3 (01:20):
Anyway, the self beating will now end.
Speaker 1 (01:23):
Oh but so anyway, these two voice actors who do
a lot of voiceover work. The first thing that was
interesting to me was they are very much not the
announcer voice. They're the, hey, I went down to
the Apple store and I was looking at phones, kind
of the everyman, not terribly impressive voice, and her main
(01:44):
presentation is, you know, an I'm-a-girl-who-works-in-an-office
thing. But they're very well thought of and
employed and the rest of it. And it's a tough
gig, man, that's super competitive, as you might guess. But
they both had the experience recently of hearing things on
the radio on TV that.
Speaker 3 (02:01):
Sound exactly like them, clearly.
Speaker 1 (02:07):
Mined from their voices, but they didn't do the work,
and so now they're looking at the legalities and filing
lawsuits, and, good luck. Yeah, I would say they have
a chance of getting something done now. But I happened
to be chatting with a friend over dinner last night,
(02:28):
and a very smart guy. He's an engineer and well
acquainted with a lot of the technology, and we were
agreeing that the technology will move so swiftly the legal
system cannot conceivably keep up with it.
Speaker 3 (02:43):
No, I don't believe so.
Speaker 1 (02:45):
Well, never, you know, whether in terms of regulations
or people like this who've been aggrieved.
Speaker 2 (02:51):
Well, with the voice thing?
Speaker 3 (02:53):
Though?
Speaker 2 (02:53):
Particularly, some people sound the same, so how would you
even nail that down? Anyway, do you remember when...
Speaker 3 (02:59):
Well, and how close is too close? All right.
Speaker 2 (03:01):
But do you remember when Shannon Farren, who works at
KFI Radio in Los Angeles, worked here where we work,
and Rachel Belle, who used to work at KTTH,
where we're on the air in Seattle? They
both worked here at the same time, and they were friends,
and they sounded exactly alike. Do you remember that, Michael,
(03:22):
They had the same voices, the same inflection.
Speaker 3 (03:25):
It was weird. I'd never experienced that before. They were
friends and sounded exactly alike.
Speaker 2 (03:31):
But there's no way, if you lifted one of their
voices for AI, that one of them couldn't.
Speaker 3 (03:35):
Say, no, that was my voice, that was my cadence.
Speaker 1 (03:38):
Yeah, shout out to them, anyway, great women of radio.
Really terrific. Anyway, so yeah, good luck with that.
And I don't say that in a cynical way. I
say it in a sympathetic way. But ages and ages ago.
When I was in college, Gladys, it was the hot
summer of nineteen eighty-whatever the hell it was, and
(04:01):
there I was, sitting in class with my favorite, favorite
professor, Ira H. Carmen, who taught constitutional issues. A
fascinating man, just one of my heroes. He'd written a book,
which I still have, called Cloning and the Constitution, which
is now wildly out of date. But the point of
the book is that the progress in this case biological science,
(04:23):
decoding the genome, cloning, all that stuff was moving so
swiftly the task of adapting law to it was going
to be incredibly challenging. And that was at the pace
of change of the nineteen eighties. So, Dr. Carmen, you
were prescient indeed anyway, moving along. Just to get a
couple of more exhibits before the Court of Public Opinion,
(04:45):
we had the headline, I think it was yesterday: Microsoft
unveils, and these are the words of Breitbart, creepy AI-powered
Windows that tracks everything you do. It doesn't bother
me as much as it does some people because that
data will be stored on your device and only on
your device. But it can go back twenty seven seconds,
(05:06):
that's what they say. A big tech wouldn't lie to us, Jack.
But you can go back twenty seconds, twenty seven seconds
on your computer and oh, where's that window?
Speaker 3 (05:15):
Where'd it go? What happened?
Speaker 1 (05:16):
No, just go back twenty seven seconds, there it is,
saved for you, all powered by AI. I don't know
enough to know whether I should be worried or not.
But this is also getting a lot of attention in
the New York Times. Once again, this Russian woman loves China.
Too bad she's a deepfake. And it goes into
how for fun profit and propaganda purposes, Chinese people, merchants
(05:39):
and the government are really drilling down on the deep
fake thing and producing videos of attractive women who are
talking about, you know, the glories of China, or how
you ought to get a mail-order bride, or order
the salve, or what have you, and the quality of
it is somewhere between shocking and oh my god,
(06:02):
and this is going to surround us soon. An utterly,
one hundred percent convincing Donald J. Trump in high definition announcing
that he's calling for the invasion of Delaware. Just stamp
out Joe Biden and his evil spawn, and, and
then where are we?
Speaker 2 (06:22):
I was listening to an NPR segment yesterday with
somebody on there talking about some sort of global body
that regulates AI, and it was the most unicorn-fanciful,
good-luck-with-that story I've ever heard. Like, you're
gonna get China and North Korea and Russia on board with
following the rules for artificial intelligence.
Speaker 1 (06:43):
I seriously don't know how those people's brains work.
Speaker 2 (06:47):
I don't either.
Speaker 3 (06:48):
What does the world look like to you, Deon?
Speaker 2 (06:51):
Want to end with something hilarious around AI?
Speaker 3 (06:54):
No, let's avoid anything amusing. This is from Grok. What
is Grok again? Which one is Grok? Wow, I haven't
heard Grok referenced in forever.
Speaker 2 (07:05):
It's it's twenty year, it's one year.
Speaker 1 (07:08):
You know.
Speaker 2 (07:08):
You can ask it questions and it gives you answers.
Anyway, Elon Musk said yesterday, this is the quality of humor
we want from Grok. And this is pretty funny for
being written by AI. So somebody asked Grok the
question: is Dune just a cheap ripoff of Star Wars,
(07:32):
and Grok answered: No, Dune is a very expensive ripoff
of Star Wars.
Speaker 3 (07:37):
The actual ripoff was cheap.
Speaker 2 (07:39):
Frank Herbert watched Star Wars on opening night in nineteen
seventy seven, then cribbed a bunch of notes, changed the
name of the planet Tatooine to Arrakis, I don't know
these names because I don't watch this stuff, changed the
name of Luke Skywalker to Paul Atreides, and pretty
much left everything else exactly the same. The expensive part
was building a time machine so he could go back
(07:59):
and, um, publish Dune in nineteen sixty five, twelve
years before Star Wars even came out. Frank Herbert spent
the entire family fortune building his time machine and even
had to sell his family ranch in California and the
family stock in General Electric. Most scholars consider it one
of the most expensive, if not the most expensive, cases
of plagiarism of the twentieth century.
Speaker 1 (08:20):
Wow, that is some like disturbingly good sarcasm.
Speaker 2 (08:24):
If Grok was able to write that sarcastic, funny answer
to that question.
Speaker 3 (08:31):
That's scary.
Speaker 1 (08:33):
If AI has a grasp of irony, eh, now I'm
freaked out.
Speaker 2 (08:38):
Yeah, because that's troublingly funny and clever.
Speaker 3 (08:43):
Wow. Yeah, I'm sorry, I'm disturbed.
Speaker 2 (08:46):
Of course, the best part was building a time machine to
go back to nineteen sixty-five.
Speaker 3 (08:53):
That's really good.